Imagine developing a big solar project and finding that getting it permitted means navigating federal, state and local regulations, each with its own terminology and data, making the whole process complex and time-consuming.
Now imagine having an artificial intelligence (AI) program that can organize and consolidate the various requirements of those regulations and identify the information that can be used across all of them.
Streamlining and accelerating permitting is just one of the potential uses the Department of Energy envisions for AI in driving the U.S. power system’s transition to 100% clean energy and building the modern, efficient, secure grid needed to reach that goal by 2035, according to two new reports DOE released April 29.
AI for Energy: Opportunities for a Modern Grid and Clean Energy Economy looks at the near-term potential for AI to speed up, streamline and improve system planning, project siting and permitting, operations and reliability, and resilience.
The report provides laundry lists of possibilities in each of these areas: for example, using AI to model the adoption of distributed solar and storage projects or virtual power plants to forecast impacts on load and load shape, as well as when and where distribution system upgrades will be needed.
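The report stops at the use case, but a toy calculation shows what is at stake in load shape. The Python sketch below, with entirely made-up numbers, illustrates the kind of question such an adoption model would answer at utility scale: as rooftop solar spreads on a feeder, midday net load hollows out while the evening peak barely moves, which is exactly when and where upgrade needs shift.

```python
import numpy as np

# Illustrative hourly shapes; a real study would use measured feeder data.
hours = np.arange(24)
base_load_mw = 80 + 30 * np.sin((hours - 9) * np.pi / 14).clip(min=0)  # evening-peaking load
pv_shape = np.exp(-((hours - 12.5) ** 2) / 8)  # midday solar output, per unit
pv_per_system_kw = 5.0

def net_load(adopting_homes: int) -> np.ndarray:
    """Net feeder load (MW) after subtracting aggregate rooftop PV output."""
    pv_mw = adopting_homes * pv_per_system_kw * pv_shape / 1000.0
    return base_load_mw - pv_mw

for homes in (0, 2000, 5000):
    shape = net_load(homes)
    print(f"{homes:5d} systems: noon net load {shape[12]:5.1f} MW, "
          f"daily peak {shape.max():5.1f} MW")
```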
Other potential applications include:
- optimizing the planning, permitting and siting of electric vehicle chargers and supporting vehicle-to-grid charging to provide grid support services;
- optimizing energy use in buildings and developing models to predict buildings’ energy load shape, future consumption and coordination with the power system; and
- accelerating environmental reviews by extracting information, drafting documents and automating compliance checks.
The second report, Advanced Research Directions on AI for Energy, explores longer-term opportunities and challenges, such as the information and workforce that will be needed to build the specialized AI models required for “dynamic coupling” of dispatchable generation with renewable and other variable generation.
“These models must account for the varying and unpredictable nature of renewable resources over time and space,” the report says. “At the plant level, adaptive … models based on real-time measurements are needed to enable rapid adjustments to the system controls, which is essential for managing the changing dynamics of energy supply and demand.”
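The report does not spell out the mathematics, but the flavor of an adaptive, measurement-driven model can be shown with a textbook technique, recursive least squares with exponential forgetting, in which each new measurement nudges the parameter estimate so it tracks dynamics that drift over time. The Python toy below is a generic illustration, not the plant-level models the report calls for.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "plant": output responds to a control input u with a gain that
# drifts over time, standing in for changing grid dynamics.
true_gain = 2.0

# Recursive least squares with forgetting factor lam < 1: old data is
# discounted, so the estimate adapts as the real dynamics change.
lam, P, g_hat = 0.95, 1.0, 0.0
for t in range(200):
    true_gain += 0.01                       # slow drift in the real dynamics
    u = rng.uniform(0.5, 1.5)               # control input
    y = true_gain * u + rng.normal(0, 0.1)  # noisy real-time measurement
    k = P * u / (lam + u * P * u)           # RLS gain
    g_hat += k * (y - g_hat * u)            # correct estimate from residual
    P = (P - k * u * P) / lam

print(f"true gain {true_gain:.2f}, adaptive estimate {g_hat:.2f}")
```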
The reports are part of a larger DOE drive to develop AI models and other resources that adapt the now-omnipresent technology to advance President Joe Biden’s targets for decarbonizing the grid by 2035 and cutting U.S. greenhouse gas emissions across the economy to net zero by 2050.
Biden issued a broad executive order on AI on Oct. 30, which gave DOE a six-month deadline for producing a public report on the potential uses of AI for energy and for developing applications to streamline permitting and environmental reviews.
“Artificial intelligence can help crack the code on our toughest challenges, from combating the climate crisis to uncovering cures for cancer,” Energy Secretary Jennifer Granholm said in a press release summarizing DOE’s progress on these and other AI initiatives called for in the executive order.
DOE is ramping up its work on AI “on multiple fronts to not only keep the U.S. globally competitive, but also to manage AI’s increasing energy demand so we can maintain our goal of a reliable, affordable and clean energy future,” Granholm said.
Among its other efforts, DOE is providing $13 million in funding for a new VoltAIc Initiative, which aims to develop AI tools for streamlining permitting and environmental reviews of clean energy projects and infrastructure. DOE has partnered with the Pacific Northwest National Laboratory on one such tool, PolicyAI, an AI test bed specifically focused on environmental reviews under the National Environmental Policy Act.
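PNNL has not published PolicyAI’s inner workings in these materials, but the underlying task, pulling binding requirements out of dense permitting text, can be illustrated with a deliberately crude sketch. An AI model would do the same job with far more scale and nuance than the keyword filter below; the sample text is invented.

```python
import re

# Invented permit excerpt; real inputs would be full NEPA documents.
doc = """
The applicant shall submit a stormwater management plan within 60 days.
The applicant must notify adjacent landowners prior to construction.
Projects may include battery storage subject to local fire codes.
The operator shall file annual avian mortality reports with the agency.
"""

# Flag sentences with binding language ("shall"/"must"), a crude stand-in
# for AI-driven requirement extraction and compliance checking.
requirements = [line.strip() for line in doc.splitlines()
                if re.search(r"\b(shall|must)\b", line, re.IGNORECASE)]

for i, req in enumerate(requirements, 1):
    print(f"{i}. {req}")
```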
DOE has also formed a Working Group on Powering AI and Data Center Infrastructure, which could be issuing recommendations in June on meeting the power demands of AI and other data centers, according to the DOE press release. Another upcoming study from the Lawrence Berkeley National Laboratory will analyze the regional energy and water use of data centers across the U.S.
AI ‘Hallucinations’
From search engines to popular consumer apps — Amazon, Trivago and Airbnb — AI has become inescapable, although the technology is not completely debugged.
As defined in Biden’s original executive order and U.S. Code, AI is “a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments.”
AI applications are built on “foundation models,” which are “trained on,” or fed, massive amounts of publicly available data — generated by humans or machines — which can then be tapped for a variety of uses, depending on the prompts used or the questions asked. The drawback is that if an AI model doesn’t have the information to answer a question, it might “hallucinate” and provide an answer that sounds authoritative and convincing but is completely wrong, said Jeremy Renshaw, senior technical executive at the Electric Power Research Institute (EPRI).
“It’s not like you can take one of the models, say ChatGPT … and just provide a bunch of prompts to it, and it’s going to get the right answer every time,” Renshaw said in an interview with RTO Insider. “It just doesn’t work that way. The tools are very powerful, for sure, but they can’t do everything. If you understand how to use them, and you find the right prompts or input questions, you can get better responses.”
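One common way to act on that advice is to “ground” the model: paste the authoritative source text into the prompt and instruct the model to answer only from it, shrinking the room it has to hallucinate. The sketch below assumes the OpenAI Python SDK with an API key in the environment; the model name and tariff snippet are purely illustrative.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical source text the answer must come from.
context = "Tariff Sec. 4.2: Interconnection studies must be completed within 90 days."

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model choice
    messages=[
        {"role": "system",
         "content": "Answer only from the provided context. "
                    "If the context does not contain the answer, say so."},
        {"role": "user",
         "content": f"Context:\n{context}\n\nQuestion: What is the study deadline?"},
    ],
)
print(response.choices[0].message.content)
```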
Given the complexity of the electric grid itself, both Renshaw and DOE acknowledge that building foundation models for the energy sector could be very difficult, a challenge “further worsened by the evolving dynamics of climate change,” according to the AI for Energy report.
“Bridging the gap between the wealth of industry data that exists and the limited ability of the research community to access it remains a difficult task,” the report says. A figure in the report shows the multiple data streams (load forecasts, algorithm codes and equations, regulatory standards and risk metrics) that must be “orchestrated” to create such a model.
Renshaw explained it in less technical terms. “AI is a data-hungry machine,” he said. “So, the more data you can feed into it, the cleaner, the better, the higher-quality results you can get from the models. We have lots of grid operational data we can feed into models that can then understand the physics or patterns within the grid, and from that we can get … closer to things like optimal power flow or automated grid management.
“They may still be years away, but that’s something that would be very impactful and very useful for the grid,” he said.
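Optimal power flow, the target Renshaw names, is classically an optimization problem rather than an AI one. The sketch below solves a toy DC version on a three-bus triangle with SciPy, using invented costs and limits; models like the ones Renshaw describes would learn to approximate such solutions from grid data.

```python
import numpy as np
from scipy.optimize import linprog

# Toy DC optimal power flow: cheap generator at bus 1, expensive at
# bus 2, 150 MW of load at bus 3, equal reactances on all three lines.
cost = [20.0, 50.0]  # $/MWh for g1, g2
load = 150.0         # MW at bus 3

# Power transfer distribution factors for injections at buses 1 and 2
# (withdrawn at bus 3) onto lines 1-2, 1-3 and 2-3; these follow from
# the equal reactances.
ptdf = np.array([[1/3, -1/3],   # line 1-2
                 [2/3,  1/3],   # line 1-3
                 [1/3,  2/3]])  # line 2-3
line_limit = 80.0  # MW on every line

res = linprog(
    c=cost,
    A_ub=np.vstack([ptdf, -ptdf]),   # |line flows| <= limit
    b_ub=np.full(6, line_limit),
    A_eq=[[1.0, 1.0]], b_eq=[load],  # generation must equal load
    bounds=[(0, None), (0, None)],
)
g1, g2 = res.x
print(f"g1 = {g1:.1f} MW, g2 = {g2:.1f} MW, cost = ${res.fun:.0f}/h")
```

In this toy case the cheap unit cannot serve the whole load because the line from bus 1 to bus 3 hits its 80-MW limit, so the solver settles on 90 MW and 60 MW, the cheapest dispatch that respects the congestion.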
The Advanced Research Directions report estimates that developing foundation models to support grid planning, operations and security will also mean putting together well-coordinated, interdisciplinary teams. The roster could include about 100 AI and data scientists, another 100 power system engineers and analysts, 200 software engineers and 100 cybersecurity professionals.
While the size of individual teams could vary “depending on the size and scope of the [model], adopting a comprehensive approach involving these various skill sets is necessary to building confidence and accelerating momentum in the progress being made,” the report says.
Utilities’ Incremental Path to AI
U.S. utilities are, by nature, risk-averse, so while many are now adopting AI, their initial applications appear to support traditional operations rather than advance system decarbonization through uses such as improving renewable energy interconnection processes.
Speaking at an EPRI seminar in March on demystifying AI, Chris Le, analytics product manager for Exelon, described some basic ways the company and its utilities are using AI. Exelon has developed a machine learning model to crunch the company’s extensive data on power outages and the time it takes to restore power, Le said.
Machine learning is a kind of AI that uses algorithms and statistical models that can be applied to perform complex tasks without explicit instructions.
In Exelon’s case, the company has been able to improve its reporting on estimated restoration times “by 900% within the 2-hour window, which is what most customers care about,” Le said.
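Exelon did not detail the model, but a generic version of the task is easy to sketch: regress restoration time on outage features and measure how often predictions land inside that two-hour window. Everything below, from the feature set to the synthetic outage records, is hypothetical.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 5000

# Synthetic outage records; the feature set is invented, not Exelon's.
X = np.column_stack([
    rng.uniform(1, 50, n),    # crew travel distance, miles
    rng.integers(1, 500, n),  # customers affected
    rng.uniform(0, 60, n),    # wind speed, mph
    rng.uniform(0, 40, n),    # equipment age, years
])
# Made-up ground truth: restoration minutes grow with distance and wind.
y = 30 + 2.0 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(0, 15, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)

pred = model.predict(X_te)
within = np.mean(np.abs(pred - y_te) <= 120)  # share inside a 2-hour window
print(f"predictions within 2 hours of actual: {within:.1%}")
```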
Another application has involved training an AI model to identify potential defects on the distribution system from aerial photography, he said. “We’ve trained models to achieve successive capabilities for us — first, just identifying components in the photos … [then] identifying specific defects on those photos and then, finally, determining defect severity based on our internal ranking system.”
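A common recipe for building up capabilities like that is transfer learning: start from an image model pretrained on generic photos and retrain only a small classification head on labeled utility imagery. The sketch below uses torchvision’s ResNet-18 with invented class names and random tensors standing in for aerial photos; it is a generic illustration, not Exelon’s pipeline.

```python
import torch
import torch.nn as nn
from torchvision import models

# Invented defect classes for the illustration.
classes = ["no_defect", "cracked_insulator", "damaged_crossarm"]

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():  # freeze the pretrained backbone
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, len(classes))  # new trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on a random batch standing in for
# labeled 224x224 aerial photos.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, len(classes), (8,))
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(f"one training step, loss = {loss.item():.3f}")
```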
But AI is intruding on utility planning with increasing urgency through the proliferation of data centers across the country, whose fast-growing demand for power is driven largely by AI itself.
The AI for Energy report cites work currently underway at the Berkeley Lab, which indicates “that over half of data center load growth in recent years may have been due to AI, and it is expected to be the biggest driver of U.S. data center-related load growth in the near future.”
Some utilities have responded to the data center boom by arguing for new natural gas plants to ensure supply and system reliability. The fast growth of data centers in Northern Virginia is a key factor in plans by the state’s largest investor-owned utility, Dominion Energy, to build new natural gas plants, according to coverage in the Virginia Mercury.
But Renshaw and DOE both note that data centers and AI developers are working on reducing their substantial carbon footprints. Industry leader NVIDIA recently launched its new Blackwell platform, which it says will provide supercharged AI capabilities “at up to 25x less cost and energy consumption than its predecessor.”
The company is also partnering with Schneider Electric to develop publicly available “data center reference designs” that will provide benchmarks for system performance and efficiency.
DOE is pushing for further improvements in data center energy efficiency. “In 2020, the average data center used only 37% of its energy for cooling and other needs other than powering the IT equipment,” the AI for Energy report says. “The most energy-efficient data centers in the world use only 2 to 3% of their energy for such purposes.”
DOE’s own Frontier supercomputer at Oak Ridge National Laboratory “uses advanced liquid cooling and other state-of-the-art techniques to achieve this 3% goal,” the report says.
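Those percentages map directly onto the industry’s standard efficiency metric, power usage effectiveness (PUE), the ratio of a facility’s total energy to the energy reaching its IT equipment. Assuming the report’s “other needs” cover all non-IT load, the arithmetic is a one-liner:

```python
# PUE = total energy / IT energy = 1 / (1 - overhead_fraction)
for overhead in (0.37, 0.03):
    print(f"{overhead:.0%} overhead -> PUE of about {1 / (1 - overhead):.2f}")
```

A 37% overhead works out to a PUE near 1.59, while the 3% figure corresponds to roughly 1.03, close to the theoretical ideal of 1.0.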