AI transcript
0:00:10 [MUSIC]
0:00:14 Hello, and welcome to the NVIDIA AI podcast.
0:00:16 I’m your host, Noah Krebs.
0:00:20 The emergence of generative AI into our collective consciousness has led to
0:00:25 increased scrutiny around how AI works, particularly when it comes to energy consumption.
0:00:29 The interest and focus on how much energy AI actually uses is, of course,
0:00:33 important. Our planet faces energy-related challenges ranging from
0:00:36 grid infrastructure needs to the impact of climate change.
0:00:42 But AI and accelerated computing have a big part to play in helping to solve these challenges
0:00:46 and others related to sustainability and energy efficiency.
0:00:49 Joining us today to talk about all of this is Joshua Parker,
0:00:52 the Senior Director of Corporate Sustainability at NVIDIA.
0:00:57 Josh brings a wealth of experience as a sustainability professional and engineer.
0:01:02 Before his current role, he led Western Digital’s Corporate Sustainability function
0:01:05 and managed ethics and compliance across the Asia-Pacific region.
0:01:09 At NVIDIA, Josh is at the forefront of driving sustainable practices
0:01:15 and leveraging AI to enhance energy efficiency and reduce environmental impact.
0:01:19 Josh, welcome and thanks so much for taking the time to join the AI podcast.
0:01:21 Thanks, Noah. Long-time listener, first-time caller.
0:01:28 Love it. I always dreamt of hosting an AM radio call-in show, so you’re inspiring me.
0:01:33 So I’m going to just kind of open this up broadly to you to get us started.
0:01:38 I kind of alluded a little bit to it in the intro, but computing uses energy.
0:01:43 Everybody is talking about AI, obviously, and there’s, you know, with good reason,
0:01:47 interest, scrutiny, focus on, well, how much energy is AI using?
0:01:52 And if we start using more and more AI going forward, what’s the impact going to be on,
0:01:57 you know, all of these energy-related things that we deal with on our planet?
0:02:00 So let me ask you to start. How much energy does it really use?
0:02:03 Is this a warranted discussion? What are the things that we should be
0:02:11 thinking about and talking about and working on when it comes to energy and sustainability,
0:02:16 and not just AI, but accelerated computing and the other advanced technology that goes with it?
0:02:22 It’s definitely a reasonable question. And as someone who’s been in sustainability for a while,
0:02:28 it’s something that we always talk about, climate and energy and emissions.
0:02:32 Those are very big, urgent topics that we’re all thinking about in every context.
0:02:38 So when you see something like AI that really bursts onto the scene, especially so rapidly,
0:02:42 it’s a very legitimate question to ask, okay, what is this going to do to energy?
0:02:46 And what is this going to do to emissions associated with that energy?
0:02:48 So it’s the right question to ask.
0:02:52 The answer, though, turns out to be pretty complicated, because number one,
0:02:57 we’re in a period of rapid, rapid growth, and it’s hard to predict where we’re going to be
0:03:03 in just a couple of years in terms of the expansion of AI. Where is it going to be used?
0:03:06 How is it going to be used? What benefits do we get from it?
0:03:12 And there are lots of nuances to that as well, including things like the hardware that it’s
0:03:18 being built on. This accelerated computing platform itself is rapidly, rapidly evolving
0:03:24 in ways that actually support sustainability. The energy efficiency gains that are
0:03:29 being developed in that accelerated computing platform are really, really dramatic.
0:03:35 So if you want to paint a really accurate picture, as accurate as we can get in terms of where we’re
0:03:41 going with AI energy consumption and the emissions associated with that, you need to have a really
0:03:48 complex, nuanced analysis to avoid coming to very inaccurate and potentially alarming conclusions.
0:03:53 So let’s dig into that a little bit within the context of a half-hour podcast.
0:03:56 Let’s talk about some of those nuances, and you mentioned the hardware,
0:04:03 and so obviously GPUs, a big part of that. How is accelerated computing sustainable?
0:04:11 Accelerated computing is a very tailored form of computing for the type of work required for AI.
0:04:18 The accelerated computing platform on which modern AI is built takes the math that was
0:04:26 previously being done on CPUs in a sequential order and basically uses these very, very efficient,
0:04:33 very purpose-built GPUs to do them in parallel. So you do many, many more operations,
0:04:40 and these GPUs are optimized to do that math, the matrix math that’s required for AI,
0:04:45 really, really effectively and efficiently, and that’s what’s driven both the huge gains in
0:04:52 performance and also the huge gains in efficiency in AI, and it’s really what has enabled AI to boom
0:04:59 the way it has. The traditional CPU paradigm, CPU-only paradigm for trying to run this math
0:05:06 just wasn’t scaling, and so we really needed GPUs to unlock this exponential growth in
0:05:14 performance and efficiency. So if you compare CPU-only systems to accelerated computing systems,
0:05:22 which have a mix of GPUs and CPUs, we’re seeing roughly a 20 times improvement in energy efficiency
0:05:29 between CPU-only and accelerated computing platforms, and that’s across a mix of workloads.
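The distinction Josh draws here, sequential CPU-style math versus bulk parallel math, can be sketched in a few lines of Python. This is purely illustrative (NumPy's vectorized matmul stands in for the GPU-style bulk path; the real accelerated computing gains come from dedicated silicon, not from NumPy):

```python
import numpy as np

def matmul_sequential(a, b):
    """CPU-style: one multiply-accumulate at a time, in strict order."""
    rows, inner, cols = len(a), len(b), len(b[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            for k in range(inner):
                out[i][j] += a[i][k] * b[k][j]
    return out

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[5.0, 6.0], [7.0, 8.0]]

sequential = matmul_sequential(a, b)
# GPU-style: the whole product expressed as one bulk operation, which
# parallel hardware (or here, NumPy's vectorized kernel) can fan out
# across many execution units at once.
bulk = np.array(a) @ np.array(b)

assert np.allclose(sequential, bulk)
```

The point is that once the math is expressed as one bulk matrix operation rather than a long ordered loop, hardware built for parallelism can execute it far more efficiently.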
0:05:35 So it’s a very dramatic improvement in efficiency, and if you look just over time at accelerated
0:05:40 computing itself, so compare accelerated computing platforms from a few years ago to ones that we
0:05:49 have today, that change in efficiency is even more dramatic. So just eight years ago, if you compare
0:05:57 the energy efficiency for AI inference from eight years ago until today, we’re 45,000 times more
0:06:02 energy efficient for that inference step of AI, where you’re actually engaging with the models,
0:06:11 right? And it’s really hard to understand that type of figure. One forty-five-thousandth of the energy required
0:06:16 just eight years ago is what we’re using today. So building in that type of energy efficiency gain
0:06:21 into your models for how much energy will we be using for AI in a couple of years is really,
0:06:27 really critical because it’s such a dramatic change. Yeah, that’s a huge number. I don’t mean
0:06:33 this as a joke, but the best way I can think of to ask it is, are the workloads now 45,000 times
0:06:40 bigger or more energy intensive than they were eight years ago, or is really the efficiency
0:06:47 outpacing all of this new attention on AI? So that ends up being a very complex question as well
0:06:53 because you have to get into the realm of figuring out how many times do we need to train a model,
0:06:59 and then versus how many times can I reuse it with inference? So you know, big models like
0:07:04 Claude 3.5, GPT-4o, and so forth, they’re trained; it takes a lot of time to train them,
0:07:09 but the inferencing when you’re actually engaging with the model, if it ends up being durable,
0:07:16 then that inference step is very, very efficient. So because we’re still in this inflection
0:07:23 point where things are moving very rapidly, it’s hard to see how the compute requirements are scaling
0:07:28 versus the energy efficiency. Certainly, they’re scaling. We continue to see bigger and bigger
0:07:35 models being used and trained because companies are seeing huge benefits in doing that. But yeah,
0:07:39 this is what makes it complicated is that the energy efficiency is ramping up very dramatically
0:07:46 at the same time. Right. So along those lines, there’s been attention, well, there’s been attention
0:07:52 on the stability and durability of power grids, national, regional, local, as long as they’ve
0:07:58 existed, but certainly over the past five years, 10 years or so. But since AI has come into the
0:08:05 public consciousness, there have been news stories and what have you about kind of the localized
0:08:12 effects of, oh, this data center was built in wherever it was and it had this huge impact on
0:08:16 the local power grid or people are concerned it might. Can you talk a little bit about the
0:08:22 common concerns around AI’s energy consumption, particularly when it comes to the impact on
0:08:29 local power grids, whether it’s in the area where a data center might be or other places
0:08:33 where people are concerned that AI is impacting the local energy situation?
0:08:38 The first thing to look at when you’re trying to put this in context and figure out what the
0:08:45 local constraints might be on the grid is the fact that AI still accounts for a tiny, tiny fraction
0:08:50 of overall energy consumption generally. If you look at it globally first and then we’ll
0:08:56 get to the local issue, look at it globally. The International Energy Agency estimates that
0:09:03 all of data centers, so not just AI, all of data centers account for about 2% of global energy
0:09:10 consumption and AI is a small fraction of that 2% so far. So we’re looking at much less than 1%
0:09:16 of total energy consumption currently used for AI-focused data centers. Now that is absolutely
0:09:23 growing, we expect that to grow, but ultimately this is a very small piece of the pie still compared
0:09:28 to everything else that we’re looking at. The second thing to consider is the fact that AI
0:09:34 is mobile, especially when you think about AI training. So when you’re working to train these
0:09:39 models for months at a time, potentially very large models, you need a lot of compute power,
0:09:44 that training doesn’t have to happen near the internet edge, it doesn’t have to happen
0:09:50 in a particular location. So there is more mobility built into how AI works than in
0:09:55 traditional data centers because you could potentially train your model in Siberia if it
0:10:03 were more efficient to do that or in Norway or Dubai. Wherever you have access to reliable,
0:10:08 clean energy, it would be easy to do your training there. And some companies, including
0:10:13 some of our partners, have built business models around that, locating AI data centers
0:10:19 and accelerated computing data centers close to where there is excess energy and renewable energy.
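The siting logic Josh describes, training is mobile so you can schedule it where clean power is in surplus, can be sketched as a toy scheduler. All the numbers below are made up for illustration; they are not real grid data:

```python
# Toy illustration with made-up numbers: carbon intensity (gCO2/kWh) and
# surplus clean capacity (MW) per region. Not real grid data.
regions = {
    "Norway":  {"carbon_g_per_kwh": 30,  "surplus_mw": 120},
    "Siberia": {"carbon_g_per_kwh": 150, "surplus_mw": 300},
    "Dubai":   {"carbon_g_per_kwh": 400, "surplus_mw": 80},
}

def pick_training_site(regions, needed_mw):
    """Among regions with enough surplus power, choose the cleanest grid."""
    feasible = {name: r for name, r in regions.items()
                if r["surplus_mw"] >= needed_mw}
    return min(feasible, key=lambda name: feasible[name]["carbon_g_per_kwh"])

print(pick_training_site(regions, needed_mw=100))  # -> Norway
```

A real siting decision weighs many more factors (latency, land, water, policy), but the core idea is the same: training workloads can follow the energy rather than the other way around.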
0:10:26 So to get back to your original question, will we see local constraints and problems with the grid?
0:10:32 I think for the most part, we’re able to avoid that because of those factors. AI is still relatively
0:10:39 small and the companies who are deploying large data centers already know where there is excess
0:10:45 energy, where there are potentially constraints. And of course, they’re trying to find locations for
0:10:50 the AI data centers where it’s not going to become a problem and they’re going to be able to have
0:10:58 good access to clean, reliable energy. So is the pessimism overblown? If data centers
0:11:05 only comprise 2% of global energy usage and AI-specific data centers are less than 1%,
0:11:13 what explains the proliferation of pessimistic stories around AI’s impact on the energy grid?
0:11:18 Is that just kind of the dark side of a hype cycle that we’re used to and this is how it’s
0:11:23 coming up with AI? I don’t want to say that those concerns are misplaced. Certainly, if you’re living
0:11:28 in a community and you see an AI data center going up, you may have questions about what
0:11:34 it’s going to do to your local grid. And we are in this period of very, very rapid
0:11:40 and to some extent unexpected deployment of AI, because ChatGPT really took the world by
0:11:46 storm and by surprise two years ago. There is some churn right now, which you would expect
0:11:52 when you have a new technology, a new industrial revolution that’s bursting on the world.
0:11:58 There’s going to be a little bit of time where resources are not perfectly allocated.
0:12:03 But what we’re seeing is we’re already working through that phase and the companies who are
0:12:10 deploying the big AI data centers are finding ways to do that that are sustainable and that won’t
0:12:15 threaten local grids. Even if in the near term, there are some constraints that we all need to
0:12:20 work through. In the long term, even in the medium term, we’re very optimistic that these are
0:12:25 solvable issues. Sort of to look at the bright side of things relative to AI,
0:12:31 as with so many industries and so many problems that people are trying to solve in all walks of
0:12:39 life, AI can be a help when it comes to optimizing grids and energy use and even perhaps trying to
0:12:43 solve some of these climate challenges that we’re all facing. Can you talk a little bit about that
0:12:51 and about how AI can or perhaps already is making a positive impact on our energy situation?
0:12:59 Sure. There are two examples that I’ll focus on that speak to different aspects of sustainability.
0:13:08 The first one is helping us adapt and mitigate the worst impacts of climate change. AI and
0:13:14 accelerated computing in general are game changers when it comes to weather and climate
0:13:21 modeling and simulation. NVIDIA has a platform called Earth 2 and we partner closely with
0:13:26 national labs and non-governmental organizations and other organizations to develop
0:13:35 systems where we can much, much more accurately forecast weather, model weather, and help mitigate
0:13:41 the worst impacts of near-term weather and also look longer-term at climate so that we’ve got
0:13:46 a better understanding of where we’re going with climate and can better prepare for and plan for
0:13:51 that. The other piece of the puzzle is that accelerated computing and AI, both of them,
0:13:59 have real-world applications that directly reduce energy and emissions. One example of that is
0:14:08 we’ve accelerated the pandas library in Python. It’s one of the most widely used libraries for
0:14:15 data analysis and data processing. We’ve taken that, basically, and written libraries
0:14:21 that will translate that code, code that’s written for that library, onto the accelerated computing
0:14:29 GPU platform. And doing that, we’ve basically opened up for the world of researchers a way to
0:14:37 run their workloads in that library without any code changes at up to 150 times the speed and many,
0:14:43 many times more energy efficiently. So, the application of this itself is going to end up
0:14:48 reducing energy consumption and also reducing the associated emissions. Right. Now, it sounds
0:14:54 like a virtuous cycle. So, to kind of dig into that for a second, people familiar with NVIDIA,
0:14:58 longtime listeners to the podcast, perhaps, understand that NVIDIA is not just a hardware
0:15:05 company. It’s hardware. It’s software. It’s all of the tools and everything in the stack to leverage
0:15:09 the GPUs in all of these different systems, accelerated computing, AI, and what have you.
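The pandas acceleration Josh mentioned earlier refers to NVIDIA's cuDF library and its cudf.pandas accelerator mode, which runs unmodified pandas code on a GPU when one is present (the quoted speedup is workload-dependent). The snippet below is ordinary pandas; on a GPU machine the same file can be launched unchanged via `python -m cudf.pandas script.py`:

```python
import pandas as pd

# Plain pandas: total energy readings per site. Under NVIDIA's accelerator
# mode (`python -m cudf.pandas this_script.py`), this identical code is
# dispatched to the GPU when one is available; no code changes required.
df = pd.DataFrame({
    "site": ["A", "A", "B", "B", "B"],
    "kwh":  [10.0, 12.0, 7.0, 8.0, 5.0],
})
totals = df.groupby("site")["kwh"].sum()
print(totals.to_dict())  # -> {'A': 22.0, 'B': 20.0}
```

That zero-code-change design is what makes the efficiency gain so accessible: existing analyses move to accelerated hardware without a rewrite.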
0:15:14 Can you talk a little bit more, you mentioned earlier, some of the efficiency gains,
0:15:20 but a little bit more about some of the efficiency improvements in AI training and inference,
0:15:26 and then also NVIDIA’s role in developing more efficient models. You mentioned Earth 2 just a
0:15:32 second ago, but some of NVIDIA’s other work in increasing the overall efficiency of the hardware,
0:15:39 the software, and then these models themselves. Sure. If you start with inference, one data point
0:15:46 that I’d like to share is that just in one generation of improvement, so if you look at
0:15:52 one generation of NVIDIA hardware, our Ampere platform, or sorry, our Hopper platform, which is
0:15:58 the one that we’re shipping in the highest volume right now, and you compare that to the Blackwell
0:16:04 platform, which we’re releasing next. It’ll come out within the next several months. The Blackwell
0:16:10 platform is 25 times more energy efficient for AI inference than Hopper was. Just in the space of
0:16:18 one change, one generation of NVIDIA hardware and software, it’s using 1/25 of the energy. That’s a
0:16:25 96% reduction in energy use. There are performance gains associated as well.
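The two headline figures in this conversation are easy to sanity-check: a 25x efficiency gain means each inference uses 1/25 of the energy, a 96% reduction, and the earlier 45,000x gain over eight years implies a compounded improvement of roughly 3.8x per year. A quick check in Python:

```python
# Hopper -> Blackwell: 25x more energy-efficient per inference
# (figure quoted in the conversation).
blackwell_energy_fraction = 1 / 25
reduction = 1 - blackwell_energy_fraction
print(f"{reduction:.0%} less energy per inference")

# 45,000x over eight years, expressed as a compound annual improvement.
annual_gain = 45_000 ** (1 / 8)
print(f"roughly {annual_gain:.1f}x more efficient each year")
```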
0:16:31 That’s right. Yeah, significant performance gains. I understated it, but performance gains are
0:16:39 amazing, but that’s incredible. A 96% efficiency gain while also getting these enormous next
0:16:45 generation performance gains as well. That’s right. You asked a little bit about how we’re
0:16:53 getting there. That 25x improvement is through innovation in many spaces. It includes things like
0:17:00 quantization, where we’re using lower precision math, basically finding ways to optimize the
0:17:06 model on training and inference in ways that allow us to do it even more in parallel and do it more
0:17:13 efficiently. It includes things like water cooling our data centers, so that we’re using less water
0:17:17 and significantly less energy to keep the data centers cool. Of course, it includes things like
0:17:25 better GPU design. All of these levers we’re pulling at the same time to drive those energy
0:17:31 efficiency improvements. We expect that to continue because energy efficiency is something that we
0:17:36 care about and our customers care about. It really helps enable more performance when we’re able to
0:17:41 take out waste and to be able to do things more efficiently. It enhances our ability to drive more
0:17:47 performance and to make the AI even more valuable than it was. Our guest today is Joshua Parker.
0:17:53 Josh is the Senior Director of Corporate Sustainability at NVIDIA. We’ve been talking a little bit
0:17:59 about energy and energy in the AI area, climate change, sustainability, all these incredibly
0:18:05 important things that form the basis of our ability to live on Earth and how these things are being
0:18:11 affected by all of these rapid advances in technology and obviously being fueled by the
0:18:16 interest in AI, which, as you well know listening to this show, has been around for a while,
0:18:20 but over the past couple of years, it’s really ramped up in intensity. Josh, you mentioned just
0:18:27 a second ago, customers. Maybe we can dig in a little bit to some case studies, customer examples,
0:18:34 real-world applications of AI improving energy efficiency. Sure. One that I love to talk about
0:18:41 is with a partner of ours called Wistron. It’s a Taiwan-based electronics company that does a lot
0:18:46 of manufacturing. Many people may not have heard of it, but it’s a large, sophisticated company.
0:18:53 They took our Omniverse platform, which is a 3D modeling platform, and they modeled
0:19:02 one of their buildings in Omniverse, and then they used AI to run simulations on that digital twin
0:19:08 that they created in our Omniverse platform. They were looking for ways to improve energy efficiency.
0:19:15 After doing that, after using that digital twin, applying AI to run some simulations, they were
0:19:21 able to increase the energy efficiency of that facility by 10%, which is a dramatic change,
0:19:27 just based on a digital twin. In this case, it resulted in savings of 120,000 kilowatt hours per
0:19:34 year. Fantastic. A word that I hear a lot but am not really sure what it means, though I’ve got a
0:19:41 working understanding, is decarbonization. My understanding is that NVIDIA has been involved
0:19:47 in some work optimizing processes for decarbonization in industry. I think you know a little bit
0:19:51 about that, and I think it’s relevant to what we’re talking about. Could you dig into that a
0:19:59 little bit as well? Decarbonization is a really broad term, and it makes sense to have a nuanced
0:20:04 appreciation for everything that it encompasses. Basically, I think the best understanding is that it
0:20:11 describes our efforts to reduce greenhouse gas emissions, in an effort to try to mitigate
0:20:17 the climate change that we’re seeing. It can apply to a lot of things, including things like
0:20:22 carbon capture and storage, where we’re actually pulling carbon out of processes or out of the
0:20:27 atmosphere and finding ways to store it. That’s an area actually where we have some
0:20:35 good work being done and some partnerships between NVIDIA and Shell, for example, where we’re finding
0:20:44 ways to use AI to greatly enhance carbon capture and storage technologies. That’s one example.
0:20:51 Another example of decarbonization, and this goes directly to emissions, is we’ve also partnered
0:20:59 with California, I believe in the San Diego area, to help firefighters there use AI to monitor
0:21:06 weather risks that could lead to wildfires. In doing so, we’ve been able to improve the
0:21:13 responsiveness of their firefighting efforts, and not only to
0:21:19 potentially save lives and property, but also to significantly reduce emissions associated with
0:21:25 those wildfires. That’s another example of decarbonization that we’re seeing. Then the third
0:21:31 example I’ll give is NVIDIA itself. We are trying to decarbonize our own operations
0:21:38 by transitioning the energy that we use from standard energy to renewable. This year, we’re
0:21:45 going to be 100% renewable for our own operations, so we’re very excited to be transitioning over
0:21:50 there. Quick show note to listeners if you’re interested. We did an episode previously about
0:21:55 the use of AI in firefighting and combating wildfires in California. I would imagine it’s
0:21:59 the same organization. Forgive me, it might not be, but definitely worth a listen. It’s a great
0:22:07 episode. Josh, you mentioned a little while ago, data centers. We talked about being a citizen and
0:22:11 seeing a data center come up. Of course, it’s good to have questions and concerns about how
0:22:16 is that going to impact things. The design of the data centers themselves obviously plays a big
0:22:23 part in how efficiently they do or don’t operate. You mentioned a little bit earlier water cooling
0:22:31 as a technique that’s been effective in reducing or increasing energy efficiency, I should say.
0:22:38 Can you talk a little bit more about the data center, how data centers relate to sustainability
0:22:43 broadly and some of the innovations that have helped in that regard? Yes. The first thing I’d
0:22:48 mention is to put data centers in context, because it’s easy to think about data centers
0:22:54 being really impactful in terms of sustainability. You see them, they’re large, you hear about them
0:22:59 using all this energy, all this water, and so forth. It’s a legitimate question to ask,
0:23:05 but ultimately, again, IEA estimates that all of data centers only account for 2%
0:23:11 of global energy consumption right now. It’s much, much smaller than most other sectors.
0:23:17 Not to interrupt you, but I’ve obviously been working in this arena for a while now,
0:23:22 but that figure really blew my mind. I was expecting something a little bit bigger than 2%.
0:23:28 That makes sense because there’s so much attention on this. AI is very much in the zeitgeist right
0:23:35 now. We’re talking about it. We see the rapid expansion. That’s one of the things where it’s
0:23:42 important to put it in context. The innovation in data center design is one of those levers that
0:23:49 I mentioned that we’re all pursuing to try to improve energy efficiency. As we’re transitioning
0:23:55 to this new generation of products at NVIDIA to Blackwell, our reference design, our recommended
0:24:02 design for the data centers for our new GB200 chip is focused entirely on direct-to-chip liquid cooling,
0:24:09 which is much more efficient, really unlocks better energy efficiency, of course, but also
0:24:15 unlocks better performance because the cooling is more effective. We’re able to run the chips
0:24:21 more optimally in ways that lead to better performance as well as to better energy efficiency.
0:24:28 Paint the picture. Sorry to interrupt you again. When you talk about direct-to-chip cooling,
0:24:32 what is that replacing? What’s the thing that it’s more efficient than?
0:24:39 That’s in comparison to air cooling, where you have heat sinks and you’re using air flow. With direct-to-chip
0:24:45 liquid cooling, we’re able to get liquid in closer to the silicon and to more effectively get heat
0:24:52 away from that. One of the reasons why this is so effective and helpful with accelerated computing
0:24:58 is that the compute density is so high. If you look at a modern AI data center and you see a rack,
0:25:05 for example, of a modern AI data center, there’s as much compute power in that rack as there was
0:25:12 in several racks, many racks of a traditional computing data center. The compute density is so
0:25:18 high that it makes more sense to invest in the cooling because you’re getting so much more compute
0:25:23 for that same single direct-to-chip cooling element that you’re using.
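One way to see the cooling trade-off Josh describes is through PUE (power usage effectiveness: total facility energy divided by IT energy). The figures below are generic illustrative assumptions, not NVIDIA data; air-cooled facilities have historically run near PUE 1.5, while direct-to-chip liquid cooling can approach 1.1:

```python
def facility_energy_kwh(it_energy_kwh, pue):
    """Total facility energy: IT load scaled by PUE (cooling, power delivery, etc.)."""
    return it_energy_kwh * pue

# Illustrative assumption: the same 1,000,000 kWh/year of compute work,
# hosted in an air-cooled hall (PUE ~1.5) versus a liquid-cooled one (PUE ~1.1).
air_cooled = facility_energy_kwh(1_000_000, pue=1.5)
liquid_cooled = facility_energy_kwh(1_000_000, pue=1.1)
print(f"{air_cooled - liquid_cooled:,.0f} kWh saved per year")
```

Because dense GPU racks pack so much compute behind each cooling element, the fixed cost of the liquid-cooling plumbing is amortized over far more useful work.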
0:25:30 Obviously, all of this is a work in progress, so to speak, that advances in AI aren’t slowing down
0:25:37 anytime soon. You’re talking about the increases in efficiency and the increases in performance
0:25:44 and all that from generation to generation. As you said, this is early days of the world
0:25:50 leveraging AI to put it that way. As we like to do on the podcast as we move towards wrapping up,
0:25:56 let’s talk about the future a little bit. Not to put you on the spot to make crystal ball predictions,
0:26:03 but what are some of the things that are being explored as future directions for AI and energy
0:26:10 efficiency and really supporting this growing energy demand, not just from AI and data centers,
0:26:18 as you rightfully pointed out. Just in general, how is AI being explored to meet future energy
0:26:25 demands across the globe? There are many ways, and most of them are yet to be discovered and talked
0:26:33 about because we’re in such early days that it’s hard to know what opportunities are yet in front
0:26:41 of us. One example is grid updates: changes to the grid in ways that will enable
0:26:46 more renewable energy. When you have more and more renewable energy coming online,
0:26:53 that is more cyclical than traditional energy. If you’re burning coal 24/7, it’s a steady stream
0:27:00 of energy. If you’ve got wind or solar, it’s more variable. Also, with things like residential solar,
0:27:07 you have times when you may be wanting to allow those residential solar panels to send energy
0:27:13 onto the grid instead of pulling power for the house off the grid. All of those types of things
0:27:19 benefit from modernization and modernization in a way where AI can play a significant role in
0:27:26 helping to avoid waste and to create ways for the energy flow to be optimized. That’s one very near
0:27:34 term area where we’re seeing progress. A partner of ours, Portland General Electric, is using AI and
0:27:39 using some of our products to do just that, to put smart meters around to help them manage the
0:27:46 growth in renewable energy. There certainly is a perfect opportunity right now for us to do this,
0:27:52 to engage in grid modernization because we have so much value to potentially unlock with AI.
0:27:58 We’ve got these big companies who are really good at developing infrastructure who are motivated
0:28:03 to help us modernize the grid and introduce more renewable energy and do that in a responsible
0:28:11 way. It’s a perfect time for us to be focusing on this. There are also fantastic other sustainability
0:28:18 related benefits from AI in terms of drug discovery for human welfare and materials
0:28:24 discovery for things like electric vehicle batteries and batteries more generally. We’ve
0:28:30 heard reports from Microsoft as well as from Google about discoveries they’ve made in material
0:28:35 science that could potentially lead to much more efficient batteries in the future, which of course
0:28:41 would not only save resources but also save energy as well. Right. It’s funny listening you talk about
0:28:46 sustainability and it’s obviously related but it makes me think about or when you mentioned about
0:28:52 residential solar and being able to send solar power back to the grid and thinking about the
0:28:58 virtuous implications of that. It made me think about recycling, for whatever reason, recycling
0:29:04 being one of those things where, if we all do it individually, it forms a kind of collective action. Then,
0:29:11 obviously, if we’re moving from residential to industry and large corporations and factories
0:29:18 and what have you, the importance of recycling is obviously bigger in those settings, in a
0:29:24 factory than in an individual house. I’m wondering about the importance of collective action when it
0:29:30 comes to all of these things that you’ve been talking about with AI and sustainability and energy
0:29:38 demand and then sort of dovetailing from that. What about the role of industry or even of governments
0:29:45 in driving these kind of new and emerging best practices that will support sustainability for
0:29:52 all of us? I think there’s a great role for governments and policy makers to play here in terms
0:29:59 of, number one, setting an example of how accelerated computing and AI can be used as tools for good,
0:30:04 and we’re seeing some great work by regulators especially in the United States but also in
0:30:10 Europe and elsewhere where there’s a real appreciation of the potential benefits to society
0:30:17 and to sustainability and adopting both accelerated computing and AI and using that for the public
0:30:23 good. So I think that’s the first way in which there’s an opportunity here: accelerating
0:30:28 all these workloads, transitioning them over to a sustainable platform, and then using AI to try
0:30:35 to benefit society in an environmental way and in a social way as well and then also to
0:30:42 encourage that type of sustainable deployment in industry as well and make sure that we’re
0:30:49 again modernizing the grid and creating the environment where we have a clear path towards
0:30:56 sustainable deployment of AI, because ultimately, you know, this is moving very, very quickly. We’re
0:31:01 living through this fourth industrial revolution. It’s very exciting and we don’t want
0:31:08 anything to undermine our ability to capture the benefit from this so that we can try to mitigate
0:31:14 climate change. We can develop new drugs. We can see all of the potential benefits from sustainability
0:31:19 and we can have that at the same time with sustainable deployment of AI data centers if we’re careful.
0:31:25 Absolutely. Josh, before we wrap I just want to mention the NVIDIA Deep Learning Institute is
0:31:31 actually sponsoring this episode. I just want to give them a shout out. Listeners out there who
0:31:38 are interested in AI, and in AI and sustainability in particular, head over to the
0:31:44 NVIDIA Deep Learning Institute. There are DLI courses on AI and sustainability. You can learn
0:31:49 so much more about what we’ve been talking about and go out and make an impact of your own which
0:31:55 we obviously encourage everybody to do. Josh, kind of as we wrap up here and along those lines
0:32:01 for listeners who would like to learn more about the work that you’re leading, the work that NVIDIA
0:32:06 is doing on sustainability, energy efficiency, everything we’ve been talking about, maybe even
0:32:11 some of the work that NVIDIA is doing with partners along these fronts. Where is a good place
0:32:17 for a listener to go online to kind of start digging in deeper to this? We are publishing more
0:32:23 and more content about the connection between accelerated computing and AI and sustainability
0:32:31 on our website. We have a sustainable computing sub-page on our public web page. We have some
0:32:36 corporate blogs there, some white papers, and so forth. Those are all really interesting and
0:32:42 very readable I think in terms of giving you examples of how NVIDIA and our partners actually
0:32:47 are doing so much good work in sustainability. We also of course publish an annual sustainability
0:32:53 report. If you’re interested in the corporate-level view, what we’re doing in terms of energy
0:32:57 efficiency in our systems and our own corporate commitments, that’s in our annual sustainability
0:33:02 report, which is on our website as well. Fantastic. Closing thoughts? Anything you want to leave
0:33:08 the listeners to take with them or ponder when it comes to the future of AI, the future of
0:33:15 sustainability made better by AI, or anything else we’ve touched on? I’d just offer a healthy dose
0:33:22 of optimism. We’ve heard, I think, an unhealthy dose of pessimism and skepticism about
0:33:29 AI specifically in the realm of sustainability and again those are all legitimate questions but
0:33:36 AI, I currently believe, is going to be the best tool that we’ve ever seen to help us achieve
0:33:42 more sustainability and more sustainable outcomes. If we capture this moment and
0:33:49 use AI for good, and if we use this new accelerated computing platform to drive better efficiencies,
0:33:54 then we’re going to see really dramatic and positive results over time as we do that more and
0:33:59 more. That’s what we want, and let’s end on that optimistic note. Josh, thank you so much for
0:34:04 taking the time to come on and talk with us, and it goes without saying, but all the best of luck
0:34:08 to you and your teams on the work you’re doing. It couldn’t be more important. Thanks, Noah, really
0:34:19 appreciate it.
0:34:30 [Music]
0:30:35 to benefit society in an environmental way and in a social way as well and then also to
0:30:42 encourage that type of sustainable deployment in industry as well and make sure that we’re
0:30:49 again modernizing the grid and creating the environment where we have a clear path towards
0:30:56 sustainable deployment of AI, because ultimately, you know, this is moving very, very quickly. This
0:31:01 fourth industrial revolution, we’re living through it. It’s very exciting, and we don’t want
0:31:08 anything to undermine our ability to capture the benefit from this so that we can try to mitigate
0:31:14 climate change. We can develop new drugs. We can see all of the potential benefits from sustainability
0:31:19 and we can have that at the same time with sustainable deployment of AI data centers if we’re careful.
0:31:25 Absolutely. Josh, before we wrap I just want to mention the NVIDIA Deep Learning Institute is
0:31:31 actually sponsoring this episode. I just want to give them a shout out and listeners out there who
0:31:38 are interested obviously in AI but in AI and sustainability in particular head over to the
0:31:44 NVIDIA Deep Learning Institute. There are DLI courses on AI and sustainability. You can learn
0:31:49 so much more about what we’ve been talking about and go out and make an impact of your own which
0:31:55 we obviously encourage everybody to do. Josh, kind of as we wrap up here and along those lines
0:32:01 for listeners who would like to learn more about the work that you’re leading, the work that NVIDIA
0:32:06 is doing on sustainability, energy efficiency, everything we’ve been talking about, maybe even
0:32:11 some of the work that NVIDIA is doing with partners along these fronts. Where is a good place
0:32:17 for a listener to go online to kind of start digging in deeper to this? We are publishing more
0:32:23 and more content about the connection between accelerated computing and AI and sustainability
0:32:31 on our website. We have a sustainable computing sub-page on our public web page. We have some
0:32:36 corporate blogs there, some white papers and so forth. Those are all really interesting and
0:32:42 very readable I think in terms of giving you examples of how NVIDIA and our partners actually
0:32:47 are doing so much good work in sustainability. We also of course publish an annual sustainability
0:32:53 report if you’re interested in the corporate level view, what we’re doing in terms of energy
0:32:57 efficiency in our systems and our own corporate commitments that’s in our annual sustainability
0:33:02 report which is on our website as well. Fantastic. Closing thoughts, anything you want to leave
0:33:08 the listeners to take with them or ponder when it comes to the future of AI, the future of
0:33:15 sustainability, made better by AI, anything else we’ve touched on? I just offer a healthy dose
0:33:22 of optimism. We’ve heard, I think, an unhealthy dose of pessimism and skepticism about
0:33:29 AI, specifically in the realm of sustainability, and again those are all legitimate questions, but
0:33:36 AI, I currently believe, is going to be the best tool that we’ve ever seen to help us achieve
0:33:42 more sustainability and more sustainable outcomes. If we capture this, if we capture the moment and
0:33:49 use AI for good and if we use this new accelerated computing platform to drive better efficiencies
0:33:54 then we’re going to see really dramatic and positive results over time as we do that more and
0:33:59 more. That’s what we want and let’s end on the optimistic note. Josh, thank you so much for
0:34:04 taking the time to come on and talk with us and it goes without saying but all the best of luck
0:34:08 to you and your teams on the work you’re doing. It couldn’t be more important. Thanks Noah, really
0:34:19 appreciate it.
0:34:30 [Music]
From improving energy efficiency to helping address climate challenges, AI and accelerated computing are becoming key tools in the push for sustainability. In this episode of NVIDIA’s AI Podcast, Joshua Parker, senior director of corporate sustainability, shared his perspective on how these technologies are contributing to a more sustainable future.
https://blogs.nvidia.com/blog/ai-energy-efficiency/