GTC DC ’25 Pregame – Chapter 3: AI Infrastructure Ecosystem

AI transcript
0:00:15 Hello, and welcome to a special GTC edition of the NVIDIA AI podcast.
0:00:20 This is the third of five episodes on the road to GTC Live in Washington, D.C.
0:00:23 Bonus conversations you won’t hear anywhere else.
0:00:27 We’ve gathered leaders from across the energy and infrastructure sectors
0:00:30 to discuss how they’re building the backbone of the AI economy,
0:00:34 the unseen network of data centers, power systems, and partners
0:00:36 that powers each and every breakthrough.
0:00:42 Enjoy the conversation, and when you’re done, visit ai-podcast.nvidia.com
0:00:48 to browse our library of over 275 episodes of the NVIDIA AI podcast.
0:00:50 AI isn’t just transforming tech.
0:00:54 It’s becoming a new driver of the American economy.
0:01:00 From manufacturing to medicine, it’s rebuilding our industrial and scientific foundations
0:01:04 that are driving new productivity and new jobs.
0:01:08 Nowhere is that impact clearer than in the rise of data centers.
0:01:11 They are the new factories of the AI age.
0:01:16 America’s AI leadership depends on more than algorithms and chips.
0:01:18 All those are awesome.
0:01:20 It runs on infrastructure.
0:01:25 Data centers, power grids, supply chains are becoming the backbone of intelligence,
0:01:30 the foundation of a national strategy for innovation.
0:01:34 Even when we assume we need, let’s say, 1,000x more,
0:01:37 and there’s forecasts that go up to a million times more,
0:01:40 the build-out is tremendous.
0:01:47 And here to talk about building that foundation is Gio Albertazzi, CEO of Vertiv,
0:01:51 Olivier Blum, CEO of Schneider Electric,
0:01:56 Krishna Gianalagagata, CTO of GE Vernova,
0:02:01 and Chase Lockmiller, co-founder and CEO of Crusoe.
0:02:04 All right, Gio, let’s start off with you.
0:02:12 And let’s pretend for a second people don’t know how big of a challenge that we’re looking at right now.
0:02:20 Can you talk about the scope of the power and the cooling that we’re going to need by, let’s say, 2030?
0:02:29 Well, think in terms of all the announcements that we have heard about investment in data centers,
0:02:33 certainly in the U.S., but globally.
0:02:39 We can imagine how much IT power is behind that.
0:02:48 Yeah, for every kilowatt, megawatt, gigawatt, there is an equivalent amount of power and thermal infrastructure,
0:02:53 and that power and thermal infrastructure needs to be built at scale.
0:02:59 Now, that has traditionally been a very construction-heavy, labor-intensive industry,
0:03:08 and we’re all thinking about how can we change that and scale it to absolutely unprecedented industrial proportions.
0:03:14 Because without that, and that's what I believe, and I know, everyone on this panel and beyond is working on,
0:03:25 we will never be able to scale AI at the speed that we have seen in the video and that you have explained to us.
0:03:31 So, I think we are at a very important inflection point at this juncture.
0:03:37 So, Olivier, Schneider has been a hallmark inside of data centers.
0:03:43 I get a lot of data center tours, and I see your logo on a lot of the boxes in there.
0:03:51 One question I have for you is, sure, it’s build more and more and more, but there’s also an efficiency play.
0:03:57 Can you talk me through the balance of efficiency and just more power?
0:04:03 Yeah, you’re absolutely right. You know, what is very interesting, we are at a time where AI depends on compute, compute depends on energy.
0:04:08 The very interesting part is energy availability and efficiency depends on AI.
0:04:21 Why? Because, you know, at Schneider Electric, we have been an advocate of energy efficiency for many, many years, and we strongly believe that the combination of electrification, automation, and digitalization will solve the energy transition in every part of the world.
0:04:35 Now, 10 years ago, frankly speaking, it was not possible. Electrification and automation were available, but the type of technology you have now, through AI, helps you make energy more efficient.
0:04:47 We are excited to provide the infrastructure that makes compute, and therefore AI, available, but we are even more excited to leverage AI to make our overall industry more efficient.
0:04:52 And if you think it through, we have all been through the industrial and digital revolutions.
0:04:59 Actually, what we are doing, as you said in your introduction, we are talking more and more about the AI factory, and we are implementing exactly the same principles.
0:05:12 So, going through the full life cycle of this AI factory, starting from the design, building a digital twin where you can simulate power and cooling efficiency, to the development, the construction, the maintenance, the operation.
0:05:15 This whole life cycle is completely powered by AI at the end.
0:05:27 Krishna, I think that, you know, GE Vernova's gas turbines are really a secret weapon in America's AI build-out, but you're sold out.
0:05:40 And, you know, talk to us a little bit just about how you’re changing your own business processes to drive more production, to get more online, and how you see that playing out over the course of the next few years.
0:05:41 Sure, Brad.
0:05:52 So, like you said, the demand is massive. We are sold out through '28, maybe through '29, but we are investing heavily in capacity expansion for gas.
0:05:58 So, we are quadrupling our capacity, the number of gas turbines delivered, by 2028 compared to 2020.
0:06:04 And all types of power are also an answer to this.
0:06:05 It’s not just gas.
0:06:09 So, we are fortunate to have basically all kinds of power that’s available to us.
0:06:15 This is gas, this is nuclear, this is solar, wind, and hydrogen.
0:06:22 So, we are looking to leverage every type of power source that’s available to grow the electrical infrastructure.
0:06:25 And power generation is a part of it.
0:06:28 A big part of it is also, we’ll talk about later, is the grid.
0:06:33 Once you actually generate these electrons, how do you get them from source to use?
0:06:37 In a very complex environment with new grid resources and so on.
0:06:44 But power generation is definitely a bottleneck, and the grid is a big bottleneck right now, and we are working on, you know, easing that.
0:06:53 Help us understand how your power generation capabilities and capacity will evolve from 2025 to, let's call it, 2029.
0:07:02 So, if you look at the demand side, for about 20 years, the total power demand in the U.S. has been flat.
0:07:04 No growth, 0.3% growth.
0:07:08 But in the next 20 years, it’s expected to grow 50%.
0:07:11 And about a third of that is going to come from data centers.
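For a rough sense of scale, here is a back-of-the-envelope sketch of the figures above. The ~4,000 TWh/year baseline for current U.S. electricity demand is an assumed round number for illustration, not a figure cited on the panel.

```python
# Back-of-the-envelope sketch of the U.S. power-demand figures cited on the panel.
# Assumption: a ~4,000 TWh/year baseline for current U.S. electricity consumption
# (illustrative only; not a number quoted in the conversation).

baseline_twh = 4000.0               # assumed current annual U.S. demand, TWh
growth_over_20y = 0.50              # "expected to grow 50%" over the next 20 years
datacenter_share_of_growth = 1 / 3  # "about a third of that ... from data centers"

added_twh = baseline_twh * growth_over_20y
datacenter_twh = added_twh * datacenter_share_of_growth

# Implied compound annual growth rate over 20 years
cagr = (1 + growth_over_20y) ** (1 / 20) - 1

print(f"Added demand over 20 years: {added_twh:.0f} TWh/year")
print(f"Of which data centers:      {datacenter_twh:.0f} TWh/year")
print(f"Implied CAGR:               {cagr:.2%}")  # ~2% per year vs ~0.3% historically
```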
0:07:17 And to meet the demand, you essentially need to leverage all kinds of power that’s available.
0:07:19 Like I said, it’s gas, it’s nuclear.
0:07:24 We have an SMR that’s coming online in 2029 in Ontario.
0:07:29 Our first SMR application at TVA went in this year.
0:07:33 So, there is significant growth in nuclear that’s coming up.
0:07:38 From a gas perspective, depending on the gas turbine, sometimes heavy duty is the answer.
0:07:39 Sometimes it’s your aeroderivative.
0:07:42 Sometimes it’s, you know, different types of turbines.
0:07:47 And also future-proofing some of these gas turbines so that they can run on hydrogen in the future.
0:07:50 So, it’s gas is a fantastic bridge right now.
0:07:53 So, electrify now, decarbonize later.
0:07:55 So, I think that’s something we can do as well.
0:07:59 And we’re also looking at, you know, wind and solar.
0:08:03 So, all of them belong in the energy equation.
0:08:05 And I think that’s what we’re working on.
0:08:07 It’s what the market needs.
0:08:10 Energy abundance is what we’re chasing here.
0:08:14 And energy abundance is really an all-of-the-above kind of answer right now.
0:08:15 Right.
0:08:21 You know, Chase, you guys have been at the forefront of what Jensen talks about, extreme co-design.
0:08:22 Right.
0:08:23 Moore’s law is dead.
0:08:26 The data center is now the unit of compute.
0:08:29 This is about power in and tokens out.
0:08:37 Crusoe is building some of the most capable data centers in the world, in Abilene and other places, in partnership with NVIDIA.
0:08:49 Talk to us a little bit about what you see happening in that world of extreme co-design and why you've been so successful, along with companies like CoreWeave, in pushing the frontier on how NVIDIA brings its chips to market.
0:08:51 Absolutely.
0:08:54 So, you know, Crusoe is an AI factory business.
0:09:02 And the thing about AI factories is this incredible composition of all of the greatest industries humanity has ever produced.
0:09:13 From, you know, everything the gentlemen next to me are working on to, you know, from a cooling, a power generation, a storage perspective, all the way to, you know, everything that we’ve done in the silicon space.
0:09:17 And, you know, most importantly, our partnership with NVIDIA.
0:09:27 But, you know, this process of taking electrons and turning them into tokens is just one of humanity’s greatest opportunities and challenges over the next decade.
0:09:38 And we see it as, you know, just a very complex challenge in terms of just the scope and magnitude of these investments.
0:09:43 And it’s requiring just an incredible amount of resources across the entire economy.
0:09:58 So, tell me a little bit, I mean, again, I’m out of my league, out of my depth a little bit here, but the 800 volt data center, you know, kind of architecture that I know NVIDIA has been pushing as just one example of this, you know, extreme co-design.
0:10:08 What are the things that you see happening and coming down the pipeline that’s going to drive that equation in terms of power in and tokens out?
0:10:21 Yeah, so, you know, I think the 800-volt rack design is a great example of us thinking it through from a first-principles basis: okay, data centers are sort of built in this way, and that's how we built them 25 years ago.
0:10:25 Is that really how we should be doing it today at the scale of gigawatts?
0:10:26 Maybe not.
0:10:38 You know, when we think about this data center scale computer, where it’s not just a single server, it’s not just a single rack, it’s really the entire system working together as one cohesive unit that’s thinking as one giant brain.
0:10:41 And how do you think about that from a networking perspective?
0:10:42 How do you think about that from a cooling perspective?
0:10:49 How do you think about that from, you know, an entire-system perspective?
0:10:49 Yeah.
0:10:59 And, you know, on the power aspect of that, you get massive efficiencies from doing things like changing from 48 volt to 800 volt.
0:11:02 Like, that’s like a huge efficiency gain that, like, all of us would benefit from.
0:11:12 And so, you know, it’s just one example of, like, this cohesive, collective, you know, industry-wide partnership that’s required to bring this infrastructure to life.
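A minimal sketch of why the higher DC bus voltage helps: for the same delivered power, current falls in proportion to voltage, and resistive I²R losses in busbars and cables fall with the square of the current. The rack power and path resistance below are assumed placeholder values, chosen only to show the ratio.

```python
# Illustration of why stepping from 48 V to 800 V DC distribution cuts conduction
# losses: same power, lower current, and I^2 * R losses fall with the square of
# the current. The resistance value is a made-up placeholder.

def conduction_loss(power_w: float, bus_voltage_v: float, resistance_ohm: float) -> float:
    """Resistive loss in the distribution path for a given delivered power."""
    current_a = power_w / bus_voltage_v
    return current_a ** 2 * resistance_ohm

rack_power_w = 130_000   # ~130 kW, in the GB200-class range mentioned later
path_resistance = 0.001  # 1 milliohm of busbar/cable resistance (assumed)

loss_48v = conduction_loss(rack_power_w, 48.0, path_resistance)
loss_800v = conduction_loss(rack_power_w, 800.0, path_resistance)

print(f"Loss at 48 V:  {loss_48v / 1000:.1f} kW")
print(f"Loss at 800 V: {loss_800v / 1000:.3f} kW")
print(f"Reduction factor: {loss_48v / loss_800v:.0f}x")  # (800/48)^2, about 278x
```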
0:11:34 And if I can reinforce that 100 percent, you know, this 800 volt is an example of how you can make the whole system not only more energy efficient, but really scoop up the power that is available, because the system can be redesigned from scratch.
0:11:39 Again, you were talking about the system, think system, not just components.
0:11:44 We’ve come from decades of data center will being built a component at a time.
0:11:49 I was talking about an inflection point in the way we think about data centers.
0:11:51 This is a fantastic example.
0:11:53 There are many examples.
0:12:08 Again, we have to think the power train, the thermal chain, the whole prefabrication as ways to industrialize the way we build data centers like we’ve never done before.
0:12:16 So, I believe the next 10 years, the next five years, three years will be very different in that respect than they have been so far.
0:12:21 We’ve seen some picture of what we call Vertiv One Core.
0:12:31 That is exactly that type of modularization system level, all parts integrally designed to work together.
0:12:36 And that is perfectly applicable to the 800-volt powertrain.
0:12:38 So, things are changing fast.
0:12:43 So, China is building faster.
0:12:44 They’re building bigger.
0:12:51 You had mentioned the 10x on nukes versus our zero, or one if you count Georgia.
0:12:53 And we’re behind.
0:13:03 I mean, I started working with China in 1995, buying sheet metal and plastic injection molding.
0:13:07 And they sent me, I said, hey, send me a picture of the site.
0:13:10 And the picture had no roads, had no electricity.
0:13:13 And I said, well, how are we going to be up and running in 60 days?
0:13:15 They said, watch.
0:13:16 And it happened.
0:13:21 And China was willing to do whatever it took to build infrastructure.
0:13:25 And here we are, it’s 30 years later, and it has continued.
0:13:29 What is it that we could do to go faster?
0:13:34 I was at, I think we were both at, the Winning the AI Race event here in D.C.
0:13:37 Chase, you were here as well on stage.
0:13:43 What can Washington do to get this moving faster?
0:13:44 Start with you.
0:13:49 Well, look, when you see what happened in the past months, it’s been very clear for the local government
0:13:53 that AI has been big on the agenda and energy has been big on the agenda.
0:13:55 And the two are really interconnected.
0:13:59 So I think what they are doing right now to make sure, and it’s a bit what my colleagues have said,
0:14:02 it will take an ecosystem at the end of the day.
0:14:08 And you might not be aware, but whether it’s NVIDIA, us, the four of us,
0:14:13 you have more and more an ecosystem of people who are working together to support the local government to make it happen.
0:14:16 We are also working very closely with all the AI hyperscalers.
0:14:20 And what is very, very important at the end of the day, and Gio talked about it,
0:14:24 we are living a huge revolution in our sector.
0:14:26 You know, I’ve been 32 years at Schneider Electric.
0:14:29 I think more has happened in the past two years than in the past 32.
0:14:34 So the fact that you have this ecosystem, which is working together, working closely with the government,
0:14:39 I think we have the good discussion, the real discussion on what does it take to make it happen.
0:14:41 And we are moving very, very fast.
0:14:46 The U.S., by the way, when it comes to AI and data center, is still by far number one in the world today.
0:14:52 Yeah, President Trump signed an executive order which was basically about removing some of the barriers to that.
0:14:55 Are you currently experiencing that?
0:14:57 Yep. We are working on all of that for sure.
0:14:58 Good.
0:15:02 I mean, you nailed it.
0:15:09 I mean, Secretary Wright and Secretary Burgum, every time I've seen them, heard them, when we're in D.C.,
0:15:12 they’re actually asking us, what is your recommendation?
0:15:13 How do we go faster?
0:15:14 What do we need to do?
0:15:23 I think it’s a complete change in kind of the mindset because now it’s been elevated to a national security issue.
0:15:23 That’s right.
0:15:23 Right.
0:15:26 This is no longer just about economic security.
0:15:31 It’s about national security and consequently moving, you know, faster than ever.
0:15:36 You know, which brings me to this question, where can startups help, right?
0:15:43 When you think about we’re sitting here with some of the largest industrial companies, you know, in the ecosystem.
0:15:50 What, if anything, do you see happening out there in the startup ecosystem, in Silicon Valley, in power generation?
0:15:59 Certainly, obviously, Crusoe and, you know, the work of CoreWeave and others, Nebius, in terms of building out these new data centers has been critical.
0:16:03 But anything else that you see as interesting that people in the crowd might want to pay attention to?
0:16:05 Absolutely.
0:16:10 So, if you look at, I think startups are the lifeblood of our innovation ecosystem.
0:16:11 We are here.
0:16:14 And when you look at what we're doing, I mean, GE Vernova
0:16:16 is 18 months old,
0:16:17 spun off as an independent company.
0:16:27 And our CEO intentionally headquartered it in Cambridge, Massachusetts, because of the innovation ecosystem with MIT and the universities and everything surrounding there.
0:16:35 And we are investors in a lot of these startups, but startups are bringing, so when I look, let’s go to the grid, for example.
0:16:45 Our grid was designed, you know, 50 years ago for relatively stable, spinning sources of power and relatively stable loads.
0:16:52 It’s a completely different world right now with the AI workloads, hundreds of megawatts in milliseconds.
0:16:59 And then on the source side, you have wind and solar, where these resources, you know, they’re changing constantly.
0:16:59 They’re fluctuating.
0:17:05 So, having a grid that can actually deal with this is really, really important.
0:17:06 So, we are working on it.
0:17:08 All our competitors are working on it.
0:17:12 But the startups have a really, really important role to play there.
0:17:15 There’s the hardware aspect of it and also the software aspect of it.
0:17:19 When it comes to hardware, we have power electronics, innovative solutions.
0:17:23 I’ve seen companies that are cooling the transmission cables.
0:17:27 I mean, just fascinating stuff, you know, so that you can increase the capacity to transmit power.
0:17:36 But how do we leverage power electronics, you know, STATCOMs, FACTS solutions, to make sure we can leverage the infrastructure more?
0:17:39 Startups are playing a huge role out there.
0:17:43 And they’re also playing a very massive role in the software ecosystem.
0:17:47 How do we, you know, predict what's going to happen?
0:17:52 And how do we make sure the right resources are there at the right time, at the right place?
0:17:57 And how do we manage in real time from millisecond scale to minutes and hour scale?
0:18:02 How do we dynamically manage the grid across the country?
0:18:05 These are all big, challenging problems.
0:18:06 We are working on it.
0:18:09 But a lot of the innovation in that space is coming from startups.
0:18:15 So we are seeing, again, I don’t want to name a specific startup, but we are seeing lots of innovation coming from there.
0:18:22 So I’m really excited to work with the startups and see how we can kind of, you know, accelerate this transition.
0:18:24 And maybe the same question for the two of you.
0:18:32 Yeah, maybe if I can come back to what I said before, you know, AI depends on compute, compute depends on energy, and actually energy depends on energy intelligence.
0:18:34 What I mean by that, at the end of the day, it’s about data.
0:18:42 You know, even if Schneider Electric is historically more of a hardware company, actually today we are both a hardware and a digital company.
0:18:53 Back to your question, what we love to do with startups: the more you have startups coming into this data intelligence space and looking at the different assets you can have in the AI factory, because it's a very complex one.
0:18:55 You will have switchgear, you will have racks.
0:19:02 We are doing a lot of R&D, but the more they take on those use cases, how you can make it more efficient, how you can capture the data, the better.
0:19:04 We are helping to structure the data.
0:19:09 We are helping to create an ecosystem, but our customers are asking for even more.
0:19:18 So the more you have startups coming into our ecosystem, which is completely open, by the way, not protected, the more we deliver solutions and the more we can accelerate energy intelligence in our industry.
0:19:21 So we love having those startups taking a strong interest.
0:19:26 We need more and more smart people and capital, you know, coming from all of those startups.
0:19:27 Yeah, absolutely.
0:19:29 A hundred percent.
0:19:40 And if I look at it from the infrastructure technology side, data center infrastructure technology had been fairly stable for many years.
0:19:47 And then it started to accelerate like crazy in the last three to five years, and it is going to accelerate more.
0:19:54 When you have this kind of acceleration, companies like us bring scale.
0:19:59 I mean, scale is absolutely essential, but you cannot just scale in this market.
0:20:01 You have to be an innovator and scale.
0:20:04 Innovation happens organically.
0:20:07 Innovation happens inorganically.
0:20:18 There needs to be a base of startups, inventors, creative minds, engineers out there that define what the new technology will be.
0:20:25 Because new technology cannot just be nurtured within GE Vernova, Schneider, or Vertiv.
0:20:25 Yeah.
0:20:35 It needs also to spring from the, let's say, entrepreneurial fabric of a country.
0:20:39 And then the two things combine and scale.
0:20:44 And that's why we see very auspicious times right now.
0:20:49 Chase, where are the most exciting areas you see innovation?
0:20:55 When you look ahead two, three years, you say, I can’t wait to bring this into, right, the AI factory.
0:20:56 Yeah.
0:21:08 You know, just adding to this sense of urgency, I think the huge opportunity for startups is that, you know, startups have the benefit of being able to move very, very quickly.
0:21:15 And, you know, we’re at a moment of just incredible urgency across the world in terms of bringing this infrastructure to life.
0:21:31 You know, to your question around things that we're really, really excited about, what's changing in the data center, it's unbelievable how much change is happening over such a short period of time.
0:21:42 You know, from the cooling architectures to the power densities of the systems, I think if you look 20 years ago, a single rack in a data center might have been two to four kilowatts.
0:21:50 You know, today with the, you know, GB200s, you know, you’re looking at 130-ish, 140 kilowatts, depending on peak demand.
0:21:57 You know, and then you look, you play it forward and you look at, you know, Vera Rubin, you look at Feynman, these are, you know, one megawatt racks, right?
0:22:02 You know, there’s a thousand homes worth of power in a single rack in a data center.
0:22:12 That requires tremendous innovation across, you know, the cooling systems, the network systems in terms of cabling these things together, which we’re seeing, you know, a lot of tremendous innovation.
0:22:20 And, you know, I think, I think there’s a lot to be said about just like the overall memory optimization in the systems.
0:22:32 Like, you know, one thing Crusoe is very focused on is not just the hardware aspect of bringing these AI factories to life, but actually the software aspects of actually how do you operate the AI factory to produce intelligence very, very efficiently.
0:22:45 That requires getting and moving data from, you know, things like your object storage, your bigger storage platform, directly into HBM, and how do you do that ultra efficiently.
0:23:01 And we’re really excited about some of the innovations taking place across the networking stack and the overall software innovations that are, you know, bringing data to compute way more efficiently so we can actually utilize the GPUs and get the most intelligence per unit of investment in compute.
0:23:05 So, you know, it may be appropriate because we're here at GTC.
0:23:09 NVIDIA has been an active investor across the space.
0:23:23 They’re forward engineering, working with partners like you guys to help drive this, you know, this extreme co-design in the data center to get us the benefits of Moore’s Law now that Moore’s Law is dead or even more.
0:23:36 Can each of you just give us an example of, you know, some of the things you’re doing with NVIDIA in particular that are helping to drive this flywheel of innovation and how this unique partnership might be accelerating the path that we’re on?
0:23:40 We can start there, exactly.
0:23:44 We were talking about the 800 volt DC as an example.
0:23:49 And that’s an area where we are actively, actively working together.
0:23:56 So, how will we make that technology available?
0:24:10 And what we have to do as infrastructure provider, we have to make that infrastructure ready ahead of the GPUs, the silicon that will land because that infrastructure needs to be in place.
0:24:11 So, that’s an area.
0:24:19 Another area of great importance, which is still going to continue to evolve, is everything around liquid cooling.
0:24:29 Let's not forget the transition that has occurred and is still occurring in the industry from air cooling to liquid cooling.
0:24:32 And that's a colossal change, if you will.
0:24:33 There’s still air.
0:24:33 It’s a mix.
0:24:38 It’s a, it’s an hybrid, but more and more loads are liquid cooled.
0:24:49 So, all these things require technology partners that are ahead of the pack and then the industry can, can follow.
0:24:51 But we have to pave those ways together.
0:24:52 Yeah.
0:24:56 We have a ton of deep partnerships with NVIDIA across the stack.
0:25:05 And it’s, it’s, it’s so amazing that this company just spans so many aspects of the economy from power generation to, you know, the most cutting edge artificial intelligence development.
0:25:10 But, you know, we’re working with NVIDIA very tightly on like AI factory reference design.
0:25:21 So, how do you actually build these gigawatt scale data centers and, and do it efficiently from a, from a power generation, from a, from a cooling, from a, you know, a power perspective.
0:25:25 And so that, that’s one big aspect that we’re investing a lot of time with NVIDIA.
0:25:27 Another is on the software layer, right?
0:25:42 So, you know, working with them on things like Lepton, which is basically about getting compute into people's hands so that they can run their workloads very efficiently across various neocloud partners like ourselves.
0:25:59 As well as things like Dynamo, which is basically about accelerating inference: how do you actually get more utilization, plus a lot of these memory optimizations that Crusoe is focused on pushing and innovating on, so that people actually get more tokens per unit of compute from their systems.
0:26:05 Let me just interrupt by asking: do you see any other chip company innovating like this?
0:26:14 There are a lot of other chip companies, and you guys work with other ones, but NVIDIA seems to be playing a really unique role in pushing the ecosystem forward.
0:26:19 Maybe, maybe a little commentary on how it’s similar or different than others.
0:26:19 Yeah.
0:26:26 The thing I would say about NVIDIA that they’ve done so incredibly well and kudos to Jensen, it’s just the ecosystem itself, right?
0:26:32 NVIDIA, you know, I forget the exact headcount, it’s 35,000 people, something like that.
0:26:40 You know, the amount of leverage that they’re able to get from those 35,000 people, and then the ecosystem of builders is absolutely incredible.
0:26:56 And that’s, you know, that’s kudos to Jensen in terms of, you know, building out all these different work streams and workflows across energy, data center design, all these different SDKs that they’ve released that have just helped unlock this collective intelligence of society.
0:27:01 And from the startup ecosystem to, you know, big enterprises.
0:27:04 Just to, you know, kind of build on that a little bit.
0:27:10 By the time you're done with all four of us, maybe you'll get a view of the scope and breadth of the ecosystem that NVIDIA is playing in.
0:27:13 We talked about software optimization, AI.
0:27:14 We talked about cooling.
0:27:16 We talked about 800-volt rack.
0:27:19 We are more on the power generation and grid side.
0:27:21 So we just released a white paper with NVIDIA.
0:27:27 We’re working with NVIDIA on reference architectures for power gen inside the, and, and how do you manage it?
0:27:33 So we just talked about the grid a few minutes ago, how complex it is, different source of power, load balancing, and so on.
0:27:39 Now, sometimes the power is not available from the grid today when folks like you are building the data centers.
0:27:41 So there’s behind the meter power, there’s bridging power.
0:27:47 But when you do that, you’re dealing with all the complexities of the grid inside the data center.
0:27:58 So how do you design and architect it so that you can actually manage the power flow with battery energy storage, with, you know, power electronics, FACTS, all kinds of different systems?
0:28:04 Working with people in the medium-voltage and low-voltage space, which, you know, my colleagues here are doing.
0:28:14 So it’s an ecosystem where all the way from the power gen to memory to algorithms, cooling, and the entire thing, we’re all working with NVIDIA.
0:28:21 So to me, it’s amazing to see how they’re curating the entire ecosystem and moving us all forward, helping us move forward.
0:28:25 And maybe we’ll end with, you know, Schneider’s commentary on this.
0:28:31 I don’t see anything else out there that has this level of systems thinking, right?
0:28:33 It’s extraordinary, the breadth.
0:28:47 It gives me a lot of encouragement about the gains yet to come in terms of America's re-industrialization around power and AI, you know, generation.
0:28:50 But how is Schneider working with NVIDIA to drive this forward?
0:28:52 Well, look, a lot has been said.
0:28:57 But for me, what is very unique in the vision of Jensen is, first of all, to work with partners.
0:29:01 You know, what we love with the NVIDIA team, they share a lot with us.
0:29:06 You know, they speak about the Omniverse, they speak about the next generation of chips, you know, the next generation of GPU.
0:29:12 And when you can work as a technology partner, upfront and in a very open manner, that changes the game.
0:29:18 What I would say also, you know, Jensen has this famous sentence about the speed of light, you know.
0:29:24 He’s forcing the entire industry, but in a very positive way, to work at a very different speed.
0:29:30 And for us, for Schneider, I think I can tell you, it has changed a lot in the way we are managing the company.
0:29:34 Gio mentioned the shift from air cooling to liquid cooling.
0:29:38 We discovered that by working together. We knew it would come.
0:29:39 The question is, when?
0:29:43 We discovered two to three years ago that it could come much faster than expected.
0:29:45 But if you ask me today, what does it change?
0:29:49 We have to completely rethink the way we design those data centers.
0:29:55 And like Chase mentioned, he talked about design, maintenance, operation.
0:30:05 So you have to completely rethink the way you build those AI factories, from the design stage, you know, to the build stage, to the operate, to the maintenance.
0:30:07 So it’s about building a digital twin.
0:30:11 And what Jensen and the NVIDIA team are forcing us to do is to think about those next generation.
0:30:12 And guess what?
0:30:14 Software is playing a big role.
0:30:15 And software is powered by AI.
0:30:18 So again, it’s a loop on which we are working all together.
0:30:29 One thing I want to touch on, follow up there, just in terms of the system-wide innovation needed to invent solutions to support this infrastructure at scale.
0:30:42 One of the conversations I was having backstage with Olivier is just around this notion of how the utilization of the data center scale computer impacts the power swings.
0:30:53 Because you have this massive, large variance in terms of, like when you have a gigawatt scale data center that’s operating as a single brain, you have this problem where it’s pushing data to the GPUs.
0:30:58 All of the GPUs are running their back propagation or their compute, you know, their matrix multiplications.
0:31:01 And then they’re publishing the results to every other GPU on the cluster.
0:31:09 And when you have that, you actually have these massive power swings, you know, at whatever, six hertz, somewhere in that range, these massive load oscillations.
0:31:14 And that creates a lot of problems for utilities and on-site power generation.
0:31:21 Like, you know, GE Vernova doesn’t like it when we have these massive power swings in the actual power utilization of the turbines.
0:31:29 So it’s actually created this, you know, system-wide partnership where we need to work with, you know, our on-site generation, we need to work with the utilities.
0:31:48 And most importantly, we need to work with, you know, folks like Schneider to innovate these solutions where we can actually create a buffer through battery systems that can absorb a lot of this load oscillation that we’re seeing from these large-scale training workloads and these system-wide workloads that are running on these massive-scale computers.
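A toy illustration of the buffering idea described above: an oscillating facility load around a steady average, with a battery absorbing or supplying the difference so the grid or on-site turbines see a much flatter draw. All numbers are invented for illustration, not panel figures.

```python
# Toy model of the battery-buffer idea: a gigawatt-class training load that
# oscillates at a few hertz, with a battery absorbing/supplying the difference
# so the grid (or on-site turbines) sees a much flatter demand. All numbers are
# illustrative assumptions.

import math

avg_load_mw = 800.0  # average facility draw
swing_mw = 200.0     # +/- swing as GPUs alternate compute and communication
freq_hz = 6.0        # "whatever, six hertz, somewhere in that range"
dt_s = 0.01          # simulation step

battery_energy_mwh = 0.0  # total energy throughput of the buffer
max_grid_dev = 0.0

for step in range(int(2.0 / dt_s)):  # simulate two seconds
    t = step * dt_s
    load = avg_load_mw + swing_mw * math.sin(2 * math.pi * freq_hz * t)
    battery_power = load - avg_load_mw  # battery covers the deviation
    grid_power = load - battery_power   # grid supplies only the average
    battery_energy_mwh += abs(battery_power) * dt_s / 3600.0
    max_grid_dev = max(max_grid_dev, abs(grid_power - avg_load_mw))

print(f"Max deviation seen by the grid: {max_grid_dev:.1f} MW (vs {swing_mw:.0f} MW unbuffered)")
print(f"Energy cycled through the battery over 2 s: {battery_energy_mwh * 1000:.1f} kWh")
```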
0:32:00 I mean, one of the things that blows me away, if we think about the Manhattan Project, that cost about $4 billion over four years, right?
0:32:02 Inflation-adjusted, that's about $40 billion.
0:32:07 As a percentage of GDP, it was 1%, so scaled to today's GDP, let's call it $400 billion.
0:32:12 Jensen has said we’re going to spend $4 trillion over the next five years.
0:32:18 So private industry is going to spend 10x what we spent on the Manhattan Project.
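The comparison as framed above, taking the quoted figures at face value (the GDP-share scaling is the speaker's framing, not an independent estimate).

```python
# The Manhattan Project comparison as framed on the panel, using the quoted figures.

manhattan_nominal_b = 4      # "$4 billion over four years" (as stated)
manhattan_inflation_b = 40   # "inflation-adjusted, that's about $40 billion"
manhattan_gdp_share_b = 400  # "1% of GDP, so let's call it $400 billion" scaled to today
ai_capex_t = 4               # "$4 trillion over the next five years"

multiple = (ai_capex_t * 1000) / manhattan_gdp_share_b
print(f"Projected AI infrastructure spend vs. GDP-scaled Manhattan Project: {multiple:.0f}x")
```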
0:32:37 And just listening to the decentralized coordination, really where NVIDIA plays this role in pulling everybody together and pushing everybody forward, it reminds me, like, the speed of that is very commensurate with what we think about the speed of, you know, the Manhattan Project.
0:32:39 I don’t see it happening anywhere else in the world.
0:32:45 I think that all of your companies are critical to the success, you know, of the American AI race.
0:32:48 And it’s great to have NVIDIA helping push it all forward.
0:32:50 I know we've got to jump, so I'm going to throw it to you, Patrick.
0:32:51 No, that was a great ending.
0:32:54 Gentlemen, I just want to thank you so much for coming up here.
0:33:00 And I hope everybody out there listening has a better understanding of the challenge that is ahead of us.
0:33:07 And also, Washington can hopefully clear as many roadblocks as possible to let all of you cook.
0:33:09 Thank you very much.
0:33:10 Thank you.

Bonus coverage from the NVIDIA GTC DC ’25 Pregame Show

Chapter 3: AI Infrastructure Ecosystem

Behind every breakthrough is an unseen network of data centers, power systems, and partners. Leaders across energy and infrastructure discuss how they’re building the backbone of the AI economy.

Catch up with GTC DC on-demand: ⁠https://www.nvidia.com/en-us/on-demand/⁠
