Transcript for:
AI's Impact on Sustainability and Innovation

Welcome to CSIS Online. Join us as we bring together top experts and thought leaders to discuss innovative ideas and real-world solutions to global issues, from security and economics to technology and environment. Tune in and be part of the conversation.

The rise of artificial intelligence could fundamentally change U.S. strategies and perspectives for economic growth, for national security, and for economic competition. The idea that we're near a Promethean moment floats around in some circles. And while we don't know the chance of that, there is a real sense that the potential for AI is incredibly large. What questions does this raise for our energy system?

Can we power the chips, the data centers that we need to stay at the forefront of this very promising technology? Can we do that in a way that's consistent with our long-term climate ambitions, both for the United States and for the world? I'm joined today by an expert in this field to try and uncover what we know about this issue, what more we need to know, and where both the business community and the government may go. Josh Parker is the Director of Sustainability at NVIDIA, I think now close to the world's most valuable company.

and chip maker du jour. And he's got a background as an engineer, in intellectual property law, and in corporate sustainability. And I think at NVIDIA, it's his job to help resolve some of these tensions that both policymakers and others are seeing start to rise. Josh, welcome to CSIS. We're very pleased that you're here today.

Thanks, happy to be here. I want to come right out of the gate and just get your sense of the state of the industry, the sense of optimism that many of us read about. We read about tech executives coming to DC and saying we're going to have five gigawatt data centers, not just one, but multiple.

The role that AI is going to play in our economy can seem at times unbounded. How are you thinking about the potential that AI brings and what's the role of NVIDIA in meeting that potential? I appreciate the way you framed that, because I think there is great reason to be optimistic about the impact of AI ultimately on sustainability and on the key sustainability issues that we care so much about, especially climate change.

And there's a lot of different reasons for that. One of them that's kind of top of mind, but that's difficult to quantify, is the applications that use AI that we see now. We have some line of sight to what AI is going to help us solve in terms of sustainability.

It's really, really good at optimization problems. So you throw it a bunch of data and you say, for example, here's a traffic situation: help me optimize the traffic lights to reduce emissions and reduce stopping, make everything more efficient.

Things like that AI is really good at, and that's some of the low-hanging fruit we're starting to see. And there are second and third order impacts that are very positive, like material science, helping us develop new materials for EV batteries and things like that. But the energy efficiency of the AI platform itself is a big, I think, missing piece in a lot of the analysis around AI's impact, the potentially negative impact, which has gotten a lot of the headlines now. I'm happy to go into detail about that, but overall, long-term, I'm a firm believer that it's going to have a net positive impact on sustainability.
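To make the traffic-light idea concrete, here is a minimal sketch, in Python, of the optimization pattern being described: try candidate signal timings in a toy traffic model and keep the one that minimizes a stand-in for stops and emissions. Everything here, the model, the demand numbers, the function names, is hypothetical and invented for illustration; a real deployment would rely on learned traffic models and far richer objectives.

    # Toy intersection: split a fixed 90-second cycle between a north-south and
    # an east-west approach, and score each split with a crude proxy for stops.
    def estimated_stops(green_ns, demand_ns=900, demand_ew=600, cycle=90.0):
        green_ew = cycle - green_ns
        stops_ns = demand_ns * max(0.0, 1.0 - 2 * green_ns / cycle)
        stops_ew = demand_ew * max(0.0, 1.0 - 2 * green_ew / cycle)
        return stops_ns + stops_ew

    # Brute-force search over candidate green times for the north-south approach.
    candidates = range(20, 71, 5)
    best = min(candidates, key=estimated_stops)
    print(f"Chosen north-south green time: {best}s "
          f"(estimated stops: {estimated_stops(best):.0f})")

The point is only the shape of the problem: define an objective, search over options, keep the best one. AI systems do this with far better models of the world, but the optimization framing is the same.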

So, you know, five years ago, I think a lot of people, if they knew the name NVIDIA, knew it as a gaming device. Yeah. Right? Yeah. Let's start at the high level.

What is different about the technology that NVIDIA makes that has unleashed this potential revolution in computing? And then I think we can get to what that implies for energy and how we're going to use it around the economy. Sure. So, yeah, we were founded as a company that was intended to help accelerate graphical processing.

And so the graphical processing units, the GPUs that we started developing, were designed to do math very quickly and very efficiently. And so they take mathematical equations and they run a lot of them in parallel, so many of the equations at the same time. And that's different from a traditional CPU which is great at general-purpose computing, but for the specialized task of running math very quickly, which is what you need to do graphics, a specialized processor greatly speeds up that process. And we recognized early on, a long time ago, that that same type of mathematical processing could lend itself to machine learning and AI.
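As a rough illustration of that "many equations at the same time" idea, the sketch below uses NumPy-style array operations as a software stand-in for what a GPU does in hardware: the same arithmetic applied across a large batch of values in one shot rather than one value at a time in a loop. This is only a conceptual analogy, not NVIDIA code; a GPU framework would push the same pattern onto thousands of parallel cores.

    import numpy as np

    x = np.random.rand(1_000_000)

    # One-at-a-time style, roughly what a scalar CPU loop looks like.
    y_loop = [3.0 * v * v + 2.0 * v + 1.0 for v in x]

    # Data-parallel style: one expression evaluated over every element at once.
    y_vec = 3.0 * x * x + 2.0 * x + 1.0

    # Same results, but the second form maps naturally onto parallel hardware.
    assert np.allclose(y_loop, y_vec)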

And so back in 2006 is when we started developing the software to really unlock that potential in our GPUs, and started developing that ecosystem as much as we could so that we would be ready when the software and the data arrived to help enable the AI revolution that we're seeing now. So it's a companion to a CPU, where we couple a CPU, and we actually design our own CPUs now, we couple that CPU with one or more GPUs, and they do the hard work, the heavy lifting, the real math that ends up being really effective for AI. So the general idea is we get really, really good at doing certain calculations more rapidly than we were able to do before, or we can do a lot of them at the same time, and that unlocks just a whole new set of tools that you can use for applications, right? My son actually did ask me to congratulate you on the ray tracing.

Apparently it's excellent. But suddenly, in application, you can use these tools kind of around the world, right, throughout the economy. Now, the thing that is very familiar now is large language models.

But I'm hearing you talk about applications like much broader than writing funny limericks. Yeah, for sure. That gets a lot of play because that's the easiest way for average people to interact with AI. You can go on Claude or Gemini or ChatGPT and have fun.

And it is really, it's a fun little tool in that context. But it is really, really powerful when you turn it to commercial applications. One example I like to talk about is we've got a partner.

So the company Foxconn does a lot of electronics manufacturing, based in Asia. They're opening a manufacturing facility in Mexico, and they used our hardware coupled with some software from Siemens to create a digital twin of the manufacturing facility, and they're projecting that the facility will save 30% of its energy because they used AI to optimize it on the front end. So these industrial applications, and I mentioned transportation, because those are such big sectors, there's huge potential for AI to have a net positive impact across the board, because of that real potential for optimization there. I would like to start on the optimistic side. We can get to the challenges of meeting the energy supply.

This is a challenge that I'm sure you're very adept at talking about, and I know you have really sophisticated thinking on it, but let's talk a bit about the application side, because I think, you know, it's easy to get tied up in the problems, but I want to talk about the use of this tool for our broader challenges. What are the key areas, when you kind of just think for yourself, or stuff NVIDIA is already helping out with, that you think are going to help us resolve broader energy and climate or sustainability challenges? So energy actually turns out to be one of the easier things to do with AI because, like I said, it's an optimization problem.

If you create a digital twin of a facility and say, I want to optimize this for energy, that's one of AI's superpowers. Like, help me solve for a variable or a set of variables, make this more efficient. So energy efficiency is really low-hanging fruit, and that's what's really exciting, because a lot of the conversation right now is about energy. But another corollary to that is, again, if you're looking in the digital twin context, you can actually train robotics in a digital twin environment. We're doing this with our Omniverse platform now, and you can have a lot of the work that would otherwise be done in the real world, using real resources, done virtually.

And you end up having a product where you're conserving a ton of resources because you're doing all that hard work in the virtual world. And so once you're ready to build your prototype, your AI is already trained, your robot is already trained, and you can optimize the resources as well. So that's another example.
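A minimal sketch of that "do the hard work virtually" pattern, with everything hypothetical: a controller is tuned entirely inside a toy simulated environment, standing in for a digital twin, so only the finished result would ever touch physical hardware. None of this is the Omniverse API; it just shows the general shape of train in simulation, then deploy to the real world.

    import random

    def simulated_cost(gain, steps=50):
        """Run a toy simulation of a system chasing a target; return total error."""
        position, target, total_error = 0.0, 10.0, 0.0
        for _ in range(steps):
            position += gain * (target - position)  # simple proportional control
            total_error += abs(target - position)
        return total_error

    # Search for a good controller gain entirely in simulation.
    random.seed(0)
    candidates = [random.uniform(0.05, 1.0) for _ in range(200)]
    best_gain = min(candidates, key=simulated_cost)
    print(f"Gain selected in simulation: {best_gain:.3f}")
    # Only this tuned value would be carried over to a physical prototype.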

And then looking at things like climate change, there are direct impacts on potential technologies there, like carbon capture and storage. Some companies are using AI to help find better sites and to discover better technologies to be more effective at carbon capture and storage. So we can, you know,

go directly for the impacts of climate change and start mitigating that as well with AI. I hadn't heard that example before, but the idea generally makes sense, right? So we think a lot in the climate mitigation space around like there's just high uncertainty around what the set of solutions is, right?

And we do innovation policy and we have regulations to help kind of find local minima in cost, or maxima in what is most convenient, and provide services in a low-carbon way. What I'm hearing you say, just from like a sort of mathematical perspective, is you can do a lot of that using these computational tools enabled by artificial intelligence and parallel processing, to like have a bunch of versions of the world, test them out, and sort of get a sense of where we should be searching for more efficient, less carbon-intensive solutions.

Is that roughly right? Yeah, that's fair. And I think the boom in AI, this revolution that we're living through, is actually one of the reasons why we've seen such a proliferation of climate tech startups, because this new tool, and a lot of versions of this, some frontier models, you know, like Meta's Llama 2, are open source, and you can start unlocking the power of AI for very low overhead or no overhead and apply it to some of these big problems. So I think we're seeing rapid innovation in the climate tech space, specifically because we have this new tool that people are eager to use across the spectrum. We have a startup support program called Inception at NVIDIA, and a portion of that is called Sustainable Futures, where we try to enable specifically folks in the climate tech space.

And we've got hundreds of startups in that program that are using our hardware and software to try to solve those big problems. Where should we look for that first? Where do you expect to see large-scale, say gigaton-scale solutions emerge from the use of AI in reducing greenhouse gas emissions?

So apply it to the climate problem. So if we're talking about pulling emissions out, things like carbon capture and storage, there are technologies across the spectrum. And it's hard to predict which one's going to end up being the most useful, because we're still in the early stages of applying AI to that problem. One thing that we're already starting to see is utility in AI to help with climate modeling, which is good in terms of helping people get their arms around climate change, understand it and its impacts near-term and long-term, and also help us with climate change adaptation, of course, and to avoid the worst impacts of climate change.

We actually have a platform that we're developing called Earth-2, which is, we talked about digital twins in the industrial sector, it's a digital twin of the earth. And using AI, we can generate weather predictions a thousand times faster and 3,000 times more energy efficiently than previous physics-based simulations. So it's a huge enabler. It's just a step change in terms of what the technology can do to help us forecast weather and ultimately climate. I think that's going to lead to new insights and things like what we can do in the atmosphere to potentially help mitigate climate change.

But like I said, it'll also help us in the near term just adapt and try to minimize the potential damage. So, you know, what we can do in the atmosphere, I have to latch onto that sentence. What do you mean? Well, you know, there are lots of ideas about what can be done about climate change.

One of those ideas is potential aerosol use in the atmosphere in targeted locations. Because climate change is such a complex problem, understanding the ocean's impact, the atmospheric impact, how all the CO2 and other greenhouse gases interact, those are huge problems, and that's what math and what AI are really good at helping model, so that we can figure out the potential effectiveness of all of these options on the front end before we actually try to implement them.

Super interesting. You know, earlier in my career, even before I was here at CSIS, I spent a lot of time looking at kind of what research programs should look like from the federal government with respect to climate intervention, both direct air capture and using solar radiation management or aerosol injection, and the computing challenge that we face trying to understand not just the first-order but the second-order and third-order effects of that kind of thing is pretty large. Yeah, it's really complex. And then on the other side, one can imagine that you use these tools on the engineering pieces as well, right?

Using that example, nobody actually knows how to make a nozzle that shoots aerosols at the right particle size distribution, at the right flow rate, and new simulation tools probably would enable us to do that if we're just sampling so many different versions of designs. Yeah, that's true. When should we expect these kinds of fundamentally different, qualitatively different solutions to break out? Right? When, and, you know, not necessarily the man on the street, but sort of the informed observer, when should we start seeing these tools really driving impact?

That's really hard to predict, because we're talking about potentially revolutionary technologies, no precedent for them, and the real issue is when is the technology going to get good enough, where the cost curve comes down, for it to be feasible to do this at scale. We can do carbon capture; it's just a question of size and cost that we need to try to manage.

So in terms of specific technologies to address that, it's really hard to say. The thing that's more predictable, and that we're already starting to see, is of course AI's impact generally on energy, to help reduce it and to help us solve some more discrete problems that are contributing to climate change, for example, you know, reducing emissions in traffic, reducing energy consumption in manufacturing. Those are, you know, we've got real use cases, and they make us, I think, legitimately optimistic about the potential for this across the spectrum.

Are those prospective, or are you confident that, like, companies are already changing investment profiles, that traffic engineers are already shifting light timing using these tools? So we're absolutely seeing companies doing this right now, implementing it, adopting AI, finding ways to reduce energy consumption.

Traffic is still, from what I know, in kind of a pilot phase, but it's definitely something that's going on right now, and they're showing real results. But, I mean, that's why NVIDIA, why our products are so popular right now, is because these big companies and small companies alike are seeing real value in it. And it's worth the investment to buy the equipment and to start pursuing it.

So, you know, maybe, you know, I'd love your thoughts on this question. One thing I've always just been curious about, right, is you can imagine these technologies being used throughout the economy, right? For different purposes. We're going to do nozzle design for aerosol distribution.

We're doing traffic engineering. We're making sure that this, you know, we're doing reservoir engineering to try and find where we can put CO2 from a direct air capture facility. All super cool.

Do we need different AI models kind of working on each of those different problems? Do we have like a sort of generalized model that can address a lot of those different challenges? And are those challenges different enough that you think like all these different industries are going to have unique applications?

And the reason I ask that question is I'm sort of thinking, all right. How do you train artificial intelligence across all these different sectors and use cases? So we are seeing with the evolution of frontier models, of large language models, that they are becoming multimodal. So they can interface with different technologies, text and images and video and everything like that.

And they're becoming much, much more powerful. And this is one of the really cool things about AI. We talked about ChatGPT and Claude

and everything, and how the average person can see how cool it is, or at least how fun it is. If you look at the capability of the frontier models now versus just six months ago, they're so much better than they used to be. And I think a lot of people don't realize how rapid the innovation is and the advancement in the models themselves in that large language model context. So those frontier models, I think, are going to become more and more useful just because they're doing their job better.

And so they do have the potential, I think, to play a significant role in the sustainability space and in climate change as we apply them. But there's definitely a role for specialized models to play as well. Some of the problems that we were talking about, modeling weather, for example, require so much data, and specialized data.

So I think there's always going to be a role for the large frontier models to play, and also for the more specialized ones to help solve targeted problems. And how much is this a sort of, you know, Silicon Valley-driven, U.S.-driven conversation versus an international one, right? There's economic benefits here.

There's potential climate benefits. You know, how are you looking at the global situation? It's definitely an international interest. A lot of investment is happening in the United States.

You know, we have some very innovative, very well-resourced companies here that want to take advantage of AI. So they're doing some of the most aggressive development of AI. But we're absolutely seeing interest across the world internationally for AI deployment.

A lot of countries want to develop their own domestic AI to support their economies and to help them with social and economic and environmental problems locally. So it's definitely a global interest. One question: one cannot realize all this progress without perhaps a fairly significant energy bill.

Some of the numbers that you see coming around Washington can be like eye-popping, right? Five gigawatts. Single gigawatt data centers are already like the norm.

Five gigawatts is floating around town this week. 10% of the grid in 2030 is a number that we've heard. And the energy intensity of training

and using these models is on the minds of policymakers, as you well know, right? The idea that it takes ten times as much energy on a lifecycle basis to do an LLM query versus a Google search. How are you thinking about that challenge, right?

And how should we be thinking about that challenge? Because the more promising the opportunities that are unlocked by these technologies, presumably the higher the energy cost will be. It's understandable why people are focused on this, because if you look at the rapid deployment of AI and think about the energy that's involved with each AI server, then you can wonder, okay, is this sustainable?

Where are we going with aggregate energy demand? But again, there is a lot of reason to be optimistic. And one of the key reasons for that is the very, very rapid increase in energy efficiency in accelerated computing and in AI that we're seeing.

And I don't want to drown you with numbers, but... I can take it. Let's go.

Okay. A couple of them are really compelling. So I'll start out by looking at the past two years. The platform that we've been shipping in volume this year is called Hopper. It's a very successful, very efficient platform, light years ahead of where we used to be.

The next platform that we've already announced is called Blackwell, which we're starting to ship, I think, next quarter. Just in that one generation improvement, we've reduced the energy consumption for AI inference by 96%.

So it's 25 times more energy efficient to run AI inference on the next generation of our product than the previous generation. Now, pause for just a second, because I think actually as we're all learning about this, we actually need to get into the fundamentals a little bit. So when you say AI inference, you mean like you're training a model, so every sort of iteration of that training takes one twentieth the power? Or when I'm using these tools in application?

It's sort of one-twentieth of the power, or both? It's when you're using the model. So yeah, the inference stage. So you train up the model.

Sometimes you do fine-tuning along the way afterwards. But the model, once you train it, and then you're interacting with it. So when you're actually talking to ChatGPT or Claude, et cetera, that inference stage is where we see a lot of kind of the growth going. And that portion, we're 25 times more energy efficient. On the training side, it's still really dramatic.

It's a four times improvement in training. So we're using just 25% of the energy to do training that we were two years ago. So really dramatic there, but the inference is just eye-popping.
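For readers keeping track, the percentage and the multiplier quoted above are the same claim stated two ways; a quick check using only the figures from the conversation:

    \text{Inference: } \tfrac{1}{25} = 4\% \text{ of the prior energy} \;\Rightarrow\; 96\% \text{ reduction}
    \text{Training: } \tfrac{1}{4} = 25\% \text{ of the prior energy} \;\Rightarrow\; 75\% \text{ reduction}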

And that, people ask, you know, is the energy efficiency plateauing? Are we reaching diminishing returns there? And we don't see that happening.

In fact, over the last 10 years, we've had very, very consistent and dramatic improvements in energy efficiency. So if you look back 10 years, we're 100,000 times more energy efficient for AI inference than we used to be. It's hard to wrap your mind around those numbers, but when you think of, okay, yeah, energy is going to go up because the use of AI is going to go up, that's a huge countervailing force and variable to say, okay, the use is going up, but the efficiency is going up so rapidly that it's hard to know exactly where the steady state is going to end up.
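To put the ten-year figure on a per-year basis, assuming, purely for illustration, that the gains compounded at a roughly steady rate:

    100{,}000^{1/10} = 10^{1/2} = \sqrt{10} \approx 3.16

That is, the decade-long number corresponds to a bit more than a threefold improvement in inference energy efficiency per year, sustained for ten years.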

Do you, I mean, when you think about the growth of the use of these tools, is it supply limited, right? So if you get 95% cheaper in an energetic sense to do this, suddenly you might want to do a lot more of it, right?

Or is it sort of demand driven in that, you know, we're still discovering how people, firms are going to use these tools on a day-to-day, high-frequency basis? I think there's some of both. So, you know, our CEO, Jensen, has talked about wanting to drive the incremental cost of computing down to zero, approximately. And so if you make it more accessible, people are going to use it more.

It's going to be cheaper to use it. And the neat thing about that is if you get access to these really powerful supercomputers that you never had access to before, you're able to do things that you couldn't previously. And so if you can start using multi-billion parameter models, they can do things that million parameter models couldn't. So it's unlocking new potential for AI, and that's one of the exciting things about being at this inflection point in this fourth industrial revolution, to see what we can do next, because there are step changes in the potential capabilities of the technology.

What are the things that worry you about realizing this potential, right? And that could be a matter of governance, it could be a matter of societal acceptance, I don't know. What do you think are going to be the big blocking agents? So near term, I think the discussion around energy is really the most risky. And not that there's a huge risk here, but when ChatGPT burst onto the scene a couple years ago, it took most of the world by surprise, right?

Right. It was wonderful and surprising and everybody started rushing in saying, oh wait, there's value here, like AI is working. And so everybody rushed in, wanted to start developing this technology. Not that companies weren't doing it before, but this was a new, clear demonstration of the value. And so we've seen a lot of investment, of course, in AI since then.

So it makes sense that in the near term, there could be some supply and demand mismatches, right? Because it was unforeseen. And I would, you know, we were talking before the livestream started, there's also just a timing issue, right? It just seems like there's a lot happening very fast.

Firms are in a race against each other. We're in a race against other countries around the world to be at the forefront of this, to capture the benefits early on. So it also seems to me like there's a timing challenge there, that like the power system just adjusts more slowly than

the tech industry could potentially grow. That's right, yeah. Software moves very quickly. And infrastructure, especially energy infrastructure, does not.

So yeah, there are some near-term challenges to try to match the energy with the demand for AI. But that type of challenge is one that we've solved in the past many times.

AI, you know, estimates are that it accounts for, well, total data center energy consumption accounts for maybe 3% of global energy consumption right now, and AI is a fraction of that. So if you look at it in the context of global energy consumption, it's tiny still. Definitely room to grow, especially if there's a lot of value there.

And again, if it has a positive impact on energy and other sectors, then it could have a net positive impact on the grid overall. Well, and I'd also argue, we spend a lot of time here at CSIS, and folks in Washington spend a lot of time, thinking about the broader electrification of our economy, right? The electrification of the light-duty vehicle fleet already was going to require us to double the grid. More linearly, more slowly, yes, but we knew we had to do that.

And so one of the things that I think is kind of most interesting from a public policy perspective is the degree to which the solutions we may need to pursue in making sure that we stay at the forefront of AI nationally are the same ones we probably needed to make sure that we were going to meet our climate goals and increase the role of electrification in our society anyway. I mean, that's kind of my feeling anyway.

I don't know what you think about that intuition. Yeah, I agree. We knew that our infrastructure, it's not, you know, other countries have more modernized grids than we do.

There's a huge opportunity for us to modernize it, and the development of AI is actually a fantastic catalyst, because there's such broad support for deployment of AI. For example, in Congress, everybody wants to get this right, and everybody wants to have AI deployed domestically.

And, you know, you've got electrification and reshoring in the mix as well that are driving larger energy growth. But the value and the commitment to deploy AI in the United States is helping to focus attention on that gap between the forecasted energy demand and the infrastructure.

And I think it's a fantastic opportunity for us to modernize the grid, to make the investments in the infrastructure now, to help us have a smarter grid, one that has bigger capacity, one that has better access to carbon-free energy. Integrating residential solar onto old grids is a challenging problem, but if we use this moment to say, we're going to make it happen. And there are big companies involved in this that have a lot of expertise. You know, Google and Microsoft and Meta and AWS, they all have huge interest in developing more clean energy, having an upgraded grid, a modernized grid. And those will be fantastic partners as we, you know, partner between the private and the public sector to say, what can we do to capture this moment to modernize the grid and make those investments now?

So, we're getting close to the end of our time, but there are a couple more things I'd love to pick your brain on if you can go a few extra minutes. Sure. One of the challenges we face in the global response to climate change is that U.S. and developed-world emissions are increasingly less of the story,

and a lot of growth is happening in emerging markets, developing markets. What role do these digital tools play, in your view, globally? When we think about, oh man, we have very sophisticated systems for developing and financing renewables projects, increasingly low-carbon firm resources, in the U.S.

That's a lot harder to do in emerging markets and developing countries. Do you see these tools being useful in that context, sort of the things that AI can enable?

Yeah, they will be useful. Now, AI is not fusion. It's not cheap, unlimited energy. So those countries are still going to have incentives to deploy the lowest cost energy. So it's not going to be a silver bullet for that challenge.

But it's absolutely going to help because, again, AI is making it so much easier to find energy efficiencies and to deploy renewables more cheaply, integrate them into the grid more cheaply, optimize traffic, all of these things that will help reduce the emissions in the developing world as well as in the developed world; it's equally applicable there. It doesn't solve the problem, but it definitely gives us more time. Asking for your editorial view, a brief lightning round: which is going to help resolve our problems more, large-scale grid infrastructure or new nuclear, when you think about what AI and its potential is going to

unlock? Oh, that's hard to choose. I mean, I think it's hard not to use every available option right now, because we're in a period of, you know, growing demand, and we're not accustomed to that demand, at least within the last 15 years. We were accustomed to it up until then.

I think we should be pursuing every possible option for clean energy, especially when it's firm. So, yeah, I think pursuing all the options on the table makes sense. And radical efficiency reductions in the sort of energetic or other costs of doing AI, or the radical efficiency reductions you think it will enable in other parts of society? Well, the overall impacts in the near term, the potential is larger in the other sectors, just because they're a bigger share of the pie.

But AI is here to stay. And over time, you know, especially if we continue on this energy efficiency cycle, it really is going to drive the cost of computing down close to zero and continue to, I think, drive just amazing technologies and advances for society. So I think that'll be significant as well.

And then is there anything else that, you know, as we come to a close, you think is important for us to hear here in Washington? I would say let's capture the moment. It really is a unique time. Our goals for sustainability are uniquely...

aligned with economic incentives, because we have this new revolutionary tool in AI. We have a way to implement it. We just need to make sure we're kind of building the infrastructure to support it and then taking advantage of it right away, because we know we've got climate change challenges and it's urgent. And any delay that we have in deploying this technology will end up, I think, potentially causing more problems. Let's get it out there.

Let's use it. Let's demonstrate its value. And we can start tackling those bigger sustainability challenges with it.

Awesome. And lastly, how should think tanks use AI to do our jobs better? I haven't thought about that specifically, but AI is fantastic for getting educated on simple things. I use it that way.

If there's a new topic, it's excellent at crunching data. So I think the low-hanging fruit is with those frontier model LLMs to help.

But if you're doing more sophisticated modeling of weather and energy and so forth, I think more specialized models could help there as well. Well, Josh, thank you so much for joining us. I mean, as I said, this is an area that our audience, that Washington is still dipping its toes into and having to learn about.

And it's really helpful to hear from someone who's so close to the issues, who can help us understand the numbers, can help us understand where the big wedges are in meeting both the opportunity and the challenges that new technology imposes. So thank you very much for being with us today.

It's been a pleasure. Colleagues, I'm so grateful that you joined, and we were able to incorporate a couple of questions that came in online for Josh. This is Joseph Majkut with the CSIS Energy Security and Climate Program, signing off.

Until next time.