Around the Corner: The Promise, Uncertainty and Unparalleled Risk of Data Center Load
Estimates vary widely for data center load growth, ranging between 5% and 14% of total U.S. load by 2028. | Aurora Energy Research
Columnist Peter Kelly-Detwiler writes about the explosion of data center load requests and the enormous risks to utilities and ratepayers of overbuilding assets.

Recent headlines and projections related to emerging data center load are astonishing. In February, Dominion Energy reported over 40 GW of data center contracts in its Virginia service territory as of December 2024, an increase of 88% from its July number. To put those numbers in perspective, Dominion’s record peak load in 2024 was just over 23 GW.

Meanwhile, that same month PPL Corp. stated it had received 54 GW of requests across its Pennsylvania and Kentucky service areas; PPL’s 2024 peak demand was 7 GW. Over the same period, Texas utility Oncor highlighted 228 transmission-level interconnection requests totaling 119 GW, almost four times the 31 GW of demand it currently serves.

Numerous other utilities also are seeing significant numbers, with Exelon reporting data center load of 16 GW, and some single “hyperscaler” projects well over 1 GW. For example, Meta’s $10 billion hyperscale endeavor with Entergy in northeastern Louisiana is sized at 2 GW.

This activity is part of a global race to expand artificial intelligence capabilities while growing the underlying data center infrastructure. The investments clearly will be enormous, with profound implications for many utilities, especially those close to communications cables (it’s the confluence of numerous high-speed cables that makes Dominion’s northern Virginia region the data center capital of the world). However, it has become increasingly apparent that access to existing communications infrastructure is not as important as it once was. Today’s imperative is to access electricity as fast as possible, which means more utilities eventually will be affected.

The Overriding Mandate for Power

Leading chipmaker Nvidia’s CEO Jensen Huang highlighted the primacy of power in his March GTC keynote, stating:


“Remember that one big idea is that every single data center in the future will be power limited. Your revenues are power limited. You could figure out what your revenues are going to be based on the power you have to work with. This is no different than many other industries. And so, we are now a power limited industry. Our revenues will associate with that.”

It’s all about accelerated access to the electron, so data companies are willing to go wherever electricity is available. That explains why Meta is working with Entergy to build three 750-MW gas generators in a remote and impoverished parish in northeastern Louisiana. It’s also why Texas is a hot spot for new data center load: the state has the land and, more importantly, it’s one of the easiest places in the country to develop new generating assets.

The Risks to Utilities and Ratepayers

After decades of relatively flat — or even negative — growth, many utilities understandably like what they see: enormous, high load factor demand from some of the most well-capitalized companies on the planet. At first blush, data load looks like a perfect antidote to stagnating utility revenues. However, this value proposition brings with it a significant level of risk. To understand where that risk lies, it helps to break this issue into discrete elements:

    • The Interconnection Requests and “Phantom Load” — The data industry power imperative is simple: Get access to energy as quickly as possible to maintain competitiveness. To get that power, large players may deal with utilities directly, or they may buy existing projects put together by other developers. In either case, they are incentivized to develop multiple applications across numerous locations.
      • If Project A wins, they withdraw Projects B and C. This approach is similar to the supply interconnection queue, in which fewer than 20% of projects initially entering the queue ultimately flow power. The fluid nature of the industry also results in constant changes. For example, in March, Microsoft withdrew 2 GW of projects in Europe and the U.S., and then in April, it pulled back from three Ohio projects worth $1 billion.
      • In addition to the big hyperscalers, numerous other players are active, including speculative developers looking to grab land, access power and flip their projects to third parties. The result may be a significant inflation of the interconnection numbers.
    • Contract Lengths and Temporal Mismatches — Recent contractual structures approved by utility commissions typically include a ramp period of four to five years, followed by 12 to 15 years at full load. Contracts often are structured as take-or-pay agreements meant to insulate ratepayers, but only for that initial contract term. The problem is that these durations align poorly with generation and transmission infrastructure whose lifespans often exceed 30 or 40 years. If data center loads were smaller, the mismatch would matter less; given their magnitude, stranded asset risk could be quite considerable if data center load shrinks or disappears (a simple illustration follows this list).
    • Competition & Consolidation — In the U.S. alone, more than a dozen entities have developed over 40 large language models that consume huge amounts of data and electricity. If the past battle for search engine supremacy or the lessons of general economic theory are anything to go by, we can expect many of these actors to fail or be consolidated in the future, creating attendant risk for both the utilities holding the supply contracts and their captive ratepayers.
    • Constantly Evolving Technologies — Data center technologies are highly dynamic and becoming steadily more efficient. In cooling, which consumes roughly 35% of data center load, liquid and two-phase cooling promise to cut that energy consumption dramatically, by as much as 90%. Meanwhile, Nvidia’s cutting-edge chips continue to post remarkable gains: the next-generation chip, slated for delivery by 2027, is projected to deliver 900 times the performance of the chip the company introduced in 2022. With AI itself now supporting chip design, future efficiencies will only improve.
    • Approaches to Training the Large Language Models — The traditional “brute force” approach to training AI models has been to combine powerful chips with huge amounts of electricity to crunch data, in some cases as much as a trillion parameters in a single training model. However, news out of China this spring suggests that in some instances there may be a better way. DeepSeek and Baidu’s Ernie X1 reportedly leaned more heavily on algorithmic and software efficiency, using far fewer chips and significantly less energy. Neither has provided solid information about its metrics, so verification is difficult, but there may be far better ways to achieve AI-related outcomes.
    • The biggest question related to these efficiencies is simple: If training models get less expensive and applications become more cost-effective, will society simply end up applying more artificial intelligence across more sectors of the economy? We then would use less energy in training models and more in “inference,” the application of those models to the real work of reasoning and making decisions. It’s simply too early to say.
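
To make the temporal mismatch concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it (asset cost, asset life, contract terms) is a purely illustrative assumption rather than a figure from any utility filing; the point is only to show how much of a long-lived asset’s cost can remain unrecovered when a 20-year take-or-pay contract ends.

```python
# Illustrative only: straight-line cost recovery of a long-lived asset
# versus a shorter take-or-pay contract (all numbers are hypothetical).

asset_cost_musd = 1_000   # assumed capital cost of new capacity, in $ millions
asset_life_yrs = 40       # generation/transmission lifespans often exceed 30-40 years
ramp_yrs = 5              # ramp period, per typical contract structures
full_load_yrs = 15        # years at full load under the contract
contract_yrs = ramp_yrs + full_load_yrs

# Straight-line view: the asset's cost is recovered evenly over its life.
annual_recovery = asset_cost_musd / asset_life_yrs
recovered_during_contract = annual_recovery * contract_yrs
unrecovered_at_expiry = asset_cost_musd - recovered_during_contract

print(f"Recovered over the {contract_yrs}-year contract: ${recovered_during_contract:,.0f}M")
print(f"Still unrecovered at expiry: ${unrecovered_at_expiry:,.0f}M "
      f"({unrecovered_at_expiry / asset_cost_musd:.0%} of the original cost)")
```

Under these assumed numbers, half of the asset’s cost would still be unrecovered when the contract expires; if the data center load does not renew or relocate to another customer, that exposure falls to the remaining ratepayers.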

The Challenge and Opportunity, and the Need for More Rigor

All of these issues point to today’s indisputable reality: The entire industry is morphing so quickly that nobody really knows what it will look like just a year or two from now. Given how rapidly the industry is growing, the hundreds of billions of dollars of investments that will take place just this year alone, and the rapid evolution of the models and underlying technologies, projecting the future is impossible. But we do know that big is big. The sheer magnitude of the potential investments required for both AI and general data center load suggests the opportunities for the utilities are unparalleled, even as the risks have rarely — if ever — been greater.

Utilities and grid operators are beginning to recognize these risks and approach some of these issues with more deliberation. In April, for example, ERCOT’s Long-Term Hourly Peak Demand and Energy Forecast highlighted 86 GW of data center load in 2031 identified by Transmission Service Providers (TSPs), a figure based on both signed contracts and attestations from TSP executives. However, ERCOT significantly reduced that number to 24.2 GW in its forecast, “based on observation of behavior and characteristics of these loads, including average project delay, load profile by type and average project realization.” That is admittedly still a crude approach, but it is better than taking the numbers at face value.
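
ERCOT has not published its adjustment model in this level of detail, but a rough sketch of the general idea, discounting raw requests by assumed realization and timing factors, might look like the following. The factors shown are hypothetical placeholders chosen only to illustrate the mechanics, not ERCOT’s actual inputs.

```python
# Hypothetical discounting of raw data center interconnection requests,
# loosely in the spirit of adjusting for project realization and delay.
# All factors below are illustrative assumptions, not ERCOT parameters.

raw_requests_gw = 86.0    # 2031 data center load identified by TSPs

realization_rate = 0.40   # assumed share of requested MW that is ever built
online_by_2031 = 0.70     # assumed share of realized projects in service by 2031
peak_coincidence = 1.00   # assumed demand at system peak vs. contracted size

expected_gw = raw_requests_gw * realization_rate * online_by_2031 * peak_coincidence
print(f"Discounted 2031 estimate: {expected_gw:.1f} GW of {raw_requests_gw:.0f} GW requested")
```

With these placeholder factors, the raw 86 GW shrinks to roughly 24 GW, in the same ballpark as ERCOT’s adjusted forecast; the real value of such a model lies in forcing explicit, reviewable assumptions rather than in any particular output.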

PJM’s Independent Market Monitor recently commented on data loads and their potential impacts on markets, transmission and reliability, suggesting the grid operator should create a formal interconnection process — including milestones — similar to the one for supply. “Every new generator and every large load addition should go through this process,” the Market Monitor commented, adding, “There are no short cuts.”

Utilities also need to dramatically improve their interconnection processes. They need to better understand all aspects of this rapidly expanding and evolving industry — function, purpose, key value propositions, technologies and business models — and the attendant risks and opportunities for utilities and ratepayers.

The data and utility industries come from completely different cultures, technologies and ecosystems. They now suddenly are being thrust together to create what eventually will be a central nervous system that will affect the entire planet. As such, they need to do a lot more work to better understand each other, optimize their approaches and de-risk the outcomes.

Around the Corner columnist Peter Kelly-Detwiler of NorthBridge Energy Partners is an industry expert in the complex interaction between power markets and evolving technologies on both sides of the meter.
