When US Energy Secretary Jennifer Granholm was creating her advisory board, she tapped John Dabiri, the Centennial Professor of Aeronautics and Mechanical Engineering at Caltech. With his expertise in sustainability technologies and his interest in AI (Dabiri has been on the board of NVIDIA for four years), Dabiri joined the Secretary of Energy Advisory Board (SEAB) to advise Granholm on US energy policies and how those policies may affect economic and national security. The SEAB recently undertook a project to better understand the sharply increasing energy demand created by artificial intelligence (AI) and how the US can meet that demand in carbon-neutral ways. The SEAB's report, available online and described in a recent Department of Energy (DOE) press release, provides recommendations for how the DOE can partner with industry to, in Secretary Granholm's words, "win this moment with clean energy while ensuring an affordable and reliable power grid for all consumers."
Dabiri recently reflected on the report and on the problems and possibilities associated with powering artificial intelligence.
Why is there a concern about the energy demands of artificial intelligence?
This new wave of AI can do some pretty amazing things, but it is more energy intensive than the typical way you've been interacting with your computer or phone over the past couple of decades. Think of ChatGPT, for example: If you ask it a question, it has to generate a response on the fly using the very large language model it was built on. This uses about 10 times as much energy as, say, a Google search, where you are simply retrieving information from the web. It also takes a lot of energy to train those models in the first place.
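For rough intuition about what that difference means at scale, here is a back-of-envelope sketch in Python. The absolute per-query figures and the query volume are illustrative assumptions; only the roughly 10-to-1 ratio comes from the comparison above.

```python
# Back-of-envelope comparison of query energy at scale.
# Illustrative assumptions: ~0.3 Wh per conventional web search and
# ~3 Wh per LLM query, preserving the roughly 10x ratio cited above.
SEARCH_WH = 0.3   # assumed energy per web search, in watt-hours
LLM_WH = 3.0      # assumed energy per LLM query (~10x a search)

queries_per_day = 100_000_000  # hypothetical daily query volume

search_mwh = queries_per_day * SEARCH_WH / 1e6  # Wh -> MWh
llm_mwh = queries_per_day * LLM_WH / 1e6

print(f"Web searches: {search_mwh:,.0f} MWh/day")
print(f"LLM queries:  {llm_mwh:,.0f} MWh/day")
print(f"Difference:   {llm_mwh - search_mwh:,.0f} MWh/day")
```

Under these assumptions, shifting 100 million daily queries from search to an LLM adds hundreds of megawatt-hours of demand every day, which is why the per-query multiplier matters so much.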
When you think about AI models where you can put in a text prompt and have your computer generate images or video or three-dimensional graphics, the energy costs can be significantly higher. AI and large language models (LLMs) as we know them today require very large data centers, from hundreds of megawatts to even gigawatts in scale.
AI projects need more energy, and they need it as fast as possible to sustain the current pace of innovation. At the same time, we have to make sure that our growth in energy is sustainable, that it relies on renewable resources, and that it minimizes the use of fossil fuels.
Why is this a particular concern for the DOE?
We want new AI models to be developed and powered here in the United States. If AI companies go overseas to develop their models, all that investment happens over there and not here. On top of that, the energy that companies would be sourcing overseas is likely to be more carbon intensive than it would be if we were able to generate that energy using cleaner sources here in the United States. We'd like to avoid those negative environmental impacts if possible.
The risk of companies offshoring their AI could be greater than for traditional industries. It's one thing to move manufacturing overseas when you have a physical infrastructure that requires you to ship parts or products back to the United States. AI training is easier to offshore in principle because you're mainly just moving bits of information. On the other hand, this portability of AI also means that we may have opportunities to locate projects in the US more creatively than we can site conventional energy loads and with less reliance on the existing grid.
At what stage in their development do LLMs, for example, need the most energy?
Training AI models requires a lot of energy. Previous forms of AI, such as language and image processing, haven't required models as large as the ones we are seeing now, which can have trillions of parameters, the variables a model uses to make predictions. The algorithms that train these models run over massive amounts of data for months at a time. That requires a lot of electricity.
But inference, which is when AI models are answering queries, is also going to scale up as more people use these tools. When ChatGPT or other online tools have millions of people querying them, each of those queries represents an additional energy demand that we must find the resources to meet.
Do we have to invest more in energy to support the future of AI, or can AI itself be made more energy efficient?
Both. Definitely both. In fact, there were three focus areas in the DOE report, and I led the team that looked at energy efficiency and power dynamics in LLM training and inference. We need to ask whether future improvements in LLM algorithms might help to slow the growth of energy demand. Are there ways we can optimize how these systems are trained to make them more energy efficient? Could training be disaggregated across multiple sites so that the energy demand is not as concentrated geographically?
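As a toy illustration of that last question, here is a minimal sketch of one way a training job's power draw could be spread across sites. The site names, headroom figures, and greedy allocation strategy are all hypothetical, not anything proposed in the report.

```python
# Toy sketch: spread a training job's power demand across several
# sites so no single grid region absorbs the whole load. Site names,
# headroom figures, and the job's total draw are all hypothetical.
def disaggregate(total_mw: float, headroom_mw: dict[str, float]) -> dict[str, float]:
    """Greedily allocate total_mw across sites, filling the sites
    with the most available headroom first."""
    allocation = {}
    remaining = total_mw
    for site, headroom in sorted(headroom_mw.items(), key=lambda kv: -kv[1]):
        share = min(remaining, headroom)
        if share > 0:
            allocation[site] = share
            remaining -= share
    if remaining > 0:
        raise ValueError(f"{remaining} MW of demand could not be placed")
    return allocation

# A hypothetical 400 MW training job split across three sites:
print(disaggregate(400, {"site_a": 250, "site_b": 150, "site_c": 100}))
# -> {'site_a': 250, 'site_b': 150}
```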
Then, on the side of inference, can we dynamically route queries to data centers where energy is cheaper or less carbon intensive, or where the load is lighter at the particular time of day when you are asking the AI model your question?
The more predictable training and inference loads are, the easier they are to power efficiently. And if those loads can actually be reduced during times of limited power availability—say when a particular part of the country is using a lot of energy for air conditioning because of hot weather—that would be great.
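Here is a minimal sketch of what such carbon- and capacity-aware routing might look like, assuming each data center can report its current grid carbon intensity and spare capacity. All names and numbers are hypothetical.

```python
# Toy sketch of carbon-aware query routing: send each query to the
# data center with the lowest current grid carbon intensity that
# still has spare capacity. All names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class DataCenter:
    name: str
    carbon_gco2_per_kwh: float  # current grid carbon intensity
    spare_capacity: int         # queries it can still absorb

def route(query: str, centers: list[DataCenter]) -> DataCenter:
    """Pick the cleanest data center with headroom for one more query."""
    available = [c for c in centers if c.spare_capacity > 0]
    if not available:
        raise RuntimeError("no capacity anywhere; shed or queue the query")
    best = min(available, key=lambda c: c.carbon_gco2_per_kwh)
    best.spare_capacity -= 1
    return best

centers = [
    DataCenter("pacific_nw", carbon_gco2_per_kwh=90, spare_capacity=1000),
    DataCenter("midwest", carbon_gco2_per_kwh=450, spare_capacity=5000),
]
print(route("What is a jellyfish?", centers).name)  # -> pacific_nw
```

A real scheduler would also weigh latency and price, but the core idea is the same: treat carbon intensity and available headroom as routing signals.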
Making data centers more flexible is going to require better collaboration among utilities, data centers, and end users. That was one of our team's key recommendations to the secretary of energy when we met last spring: facilitate stakeholder gatherings to get these different groups working toward common solutions.
This work is already happening. Just last week, the White House held a roundtable on US Leadership in AI Infrastructure that included heads of AI companies, data center operators, and utility officials to ensure that the United States will remain a leader in AI.
On the other hand, we also need to be able to create more energy. That requires public-private partnerships. In terms of sustainable energy in the United States, there is already a very long queue of new energy generators waiting to be made operational. This queue represents something on the order of 2 trillion watts, so we are talking about a huge amount of power that is not yet available for use.
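To put that figure in perspective, here is a quick conversion. The comparison figure for total US installed generating capacity is an assumption for illustration, not a number from the interview.

```python
# Rough scale of the interconnection queue. The ~1,300 GW figure for
# total US installed generating capacity is an assumed comparison
# point, not a number from the interview.
queue_w = 2e12            # ~2 trillion watts waiting in the queue
queue_gw = queue_w / 1e9  # = 2,000 GW
us_installed_gw = 1_300   # assumed US installed capacity, in GW

print(f"Queue: {queue_gw:,.0f} GW")
print(f"Ratio to assumed installed capacity: {queue_gw / us_installed_gw:.1f}x")
```

Under that assumption, the queue alone is comparable to, and larger than, everything already on the grid.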
We need to figure out how to streamline the process of adding new generators so that it doesn't take five or ten years but can be done more quickly while preserving due diligence. You want to make sure that a new energy generator is not going to destabilize the grid. You want to make sure that all environmental impacts are considered, because every project will have some sort of environmental impact. And you want to make sure that certain communities aren't being unduly burdened, either by emissions or by having the cost of their energy increase to pay for infrastructure improvements. Communities shouldn't have to bear the costs of a new data center if the benefits go solely to a company that is just swooping in to harvest the power.
Will new energy generators benefit local communities in any way?
They will certainly create jobs, at least during construction. There will be ongoing maintenance jobs as well.
There has been discussion of potentially repurposing old coal mines as data centers, not to use coal as fuel but because these mines already have a lot of the necessary infrastructure for energy distribution in place. They also have the real estate, the square footage, required to install data centers. And the hiring needed to staff these centers could help revitalize communities so that they're not left behind in an energy transition.
In fact, Constellation Energy just made a deal to revive the nuclear reactor at Three Mile Island, with Microsoft pledging to buy that power for the next 20 years to supply its ongoing AI efforts. Developments like this will reinvigorate important unresolved conversations about the role of nuclear energy in decarbonization more generally.
How sanguine are you that these problems can be solved, that a carbon-neutral energy grid will be able to keep up with the demand being placed on it by LLMs?
These are difficult problems to be sure. It's hard to project what the future energy need will be. Imagine building out a huge energy infrastructure and then finding that it's not really required. Those excess costs would get passed on to consumers. On the other hand, if we don't create enough new energy generators, we run the risk that I mentioned before—that AI companies will look overseas and that those energy plants may rely too heavily on fossil fuels.
Where I'm most optimistic is in regard to the algorithms themselves. I think they will become more efficient and more effective than they are today, and probably sooner than we think. We're not even two years past the initial public release of ChatGPT, and there has already been so much innovation.
Three or four years from now, I think we will see definite improvements. There is a lot of economic and societal value to be had from AI. That's why you're seeing so much investment from Microsoft, Meta, Amazon, NVIDIA, and really all the big players in tech. In addition, there is a huge ecosystem of startup companies working on various aspects of AI. This increases the odds that at least one of these efforts will allow us to further reduce the energy needs of AI.
My team at SEAB recommended that the DOE establish a data-center-scale AI testbed that would give researchers the computational resources they need to work on algorithmic efficiency. We're more likely to resolve our energy challenges if it's not just the big tech companies working on this. The broader research community has tremendous potential to generate the necessary breakthrough ideas, but they'll need a place to test those ideas before those ideas demonstrate the commercial value that would convince a large company to adopt them. The DOE is a natural home for this type of effort. For example, its facilities for high-performance computing have greatly enhanced pre-commercial research into everything from COVID-19 to climate modeling.
My perspective is optimistic, but I think it's a well-founded optimism that the collective creativity of the research community will help us address the need for energy to power AI. At Caltech, we have faculty, like Adam Wierman [the Carl F. Braun Professor of Computing and Mathematical Sciences and director of Information Science and Technology], who were working on sustainable computing long before it was the well-recognized problem it is now. We need to be sure that researchers like Adam have the resources they need to make rapid progress.
The ultimate value proposition with AI will lie in what AI is actually able to do, whether that's in drug discovery or weather forecasting or new materials design. The DOE is also interested in how AI can help to accelerate the deployment of the very energy infrastructure that I mentioned earlier, for example, by optimizing the grid and streamlining permitting. All of these are important tools for improving society. This is where the value of all this capital expenditure will ultimately show itself.