Emerald AI, InfraPartners Team up to Deploy Flexible Data Centers

Digital infrastructure firm InfraPartners and Emerald AI announced a new partnership to construct data centers designed to be flexible grid assets.

The Flex-Ready Data Centers combine Emerald’s energy management software solutions with InfraPartners’ off-site manufacturing approach to constructing and upgrading data centers, the companies announced March 10.

“The innovation here is to put together the data center design with the needs of the software from the beginning, so that the data center is delivered as a flex-ready data center, so there is no retrofitting later,” Emerald’s chief scientist, Ayse Coskun, said. “There are no additional components needed later.”

The software needs telemetry from all aspects of data centers, which includes computing, cooling, any behind-the-meter generation or storage, and other uses of electricity at the facility, she said.

The main attraction for data centers to become flexible grid assets is speed-to-power, but flexibility offers clear benefits to the operation of the grid, InfraPartners Chief Technology Officer Harqs Singh said.

The Electric Power Research Institute has “a data center flex program with all the utilities in it, and so they’re very interested in being able to have data centers become assets, rather than just consumers,” Singh said. (See EPRI Launches DCFlex Initiative to Help Integrate Data Centers on the Grid.)

With Emerald’s management services, data centers can respond to energy availability, match up with intermittent renewables or just respond to prices, Coskun said.

“So, this interface enables not just speed to power, but more broadly a more amicable relationship between the large data center loads and the grid,” she added.

Compared to “traditional” data centers — those used for cloud computing and data storage — AI data centers have a very high “power density,” which is why they have made headlines with massive loads ranging from hundreds of megawatts to gigawatts.

“The power density of a rack — a cabinet of servers — is increasing like 10 times compared to a typical cloud rack,” Coskun said. “The AI data centers are running a mix of training, inference and other AI loads, and there are differences. For instance, training loads tend to be more spiky, changing the power up and down more rapidly compared to cloud loads.”

Cloud computing data centers must respond to consumer requests, such as when someone accesses a database or streams video, while AI data centers do more batch processing, run long training jobs and make heavier use of their computing hardware, she added. Energy management techniques can help smooth out their highly variable demand.

“I consider this a welcome side effect of controlling power that the spikes are reduced,” Coskun said. “Because essentially, it’s not only necessarily just reducing the power during a high demand time, but also you can set up overall power limits to gently curb the power without adversely impacting performance, at least beyond the performance constraints, and then reduce these spikes.”

The grid does not respond well to major, fast fluctuations in demand or supply, so flexibility can make AI data centers much easier to handle on the bulk power system, she said. Energy management can also smooth out ramps, both those caused by power spikes during training and those created as data centers respond to signals from the grid itself.

“In our work so far with power grid operators and utilities, we received both requests — ‘can you reduce the power over a gradual window of 10 to 15 minutes? We don’t want to see this sharp drop,’” Coskun said, and “‘can you respond within seconds in an emergency, if needed?’ And we demonstrated both. So, there’s flexibility on how quickly we can tune power as needed, depending on the needs of the grid.”
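The two grid requests Coskun describes differ mainly in ramp rate: the same power reduction delivered over 15 minutes or over a few seconds. A minimal sketch of a linear power-ramp schedule illustrates the difference (the function and all numbers are hypothetical illustrations, not Emerald’s actual control logic):

```python
def ramp_schedule(start_mw, target_mw, duration_s, step_s):
    """Linear power ramp from start_mw to target_mw over duration_s,
    emitting one setpoint every step_s seconds."""
    steps = int(duration_s / step_s)
    return [start_mw + (target_mw - start_mw) * i / steps for i in range(steps + 1)]

# Gradual 15-minute reduction, as one grid operator requested...
gradual = ramp_schedule(500, 300, duration_s=900, step_s=60)

# ...versus an emergency response within seconds.
emergency = ramp_schedule(500, 300, duration_s=10, step_s=1)

print(len(gradual), gradual[0], gradual[-1])  # 16 setpoints, from 500 MW down to 300 MW
```

Either schedule ends at the same setpoint; only the slope changes, which is why a single control system can serve both the “no sharp drop” and the “respond in seconds” requests.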

InfraPartners can build that flexibility in from the start with its approach of constructing data center infrastructure at a central manufacturing site and then deploying it where needed, Singh said. That helps with initial construction, but also later, as new chips constantly improve and existing chips wear out and need to be replaced.

“We are going to have to be a lot more agile,” Singh said. “We’re going to have to adapt a lot more.”

The biggest constraint the industry faces now is power supply, and one way of handling that will be to install more efficient chips as they become available, he said.

“That means that the data center needs to evolve to deploy the latest chips all the time,” Singh said, “and being really good grid partners, working with the grid, showcasing to them how are the loads changing. How do we manage our assets on the data center side with grid assets, such that we’re good partners and be able to power the performance improvements that are coming? … It’s what we call ‘the upgradeable data center’: having a data center that upgrades with different chip technologies that are coming.”

A lot of the contracts for chips last about five years, but how often the chips are going to be swapped out is somewhat uncertain at this point, he added.

PJM Plans to Release Reliability Backstop Design in April

VALLEY FORGE, Pa. — PJM has updated its thinking on the design of its reliability backstop procurement to meet rising data center load, gravitating toward a model in which the RTO would determine the amount of capacity to be purchased and act as the administrator and counterparty to the resulting agreements.

Rebecca Carroll, executive director of market design and economics, repeatedly stressed during a workshop March 4 that PJM does not have a proposal yet and will be working on its final design through at least April 10. The RTO aims to file a proposal with FERC by late May.

PJM is considering allowing data centers that buy capacity through the backstop to avoid enrollment in its proposed Connect and Manage system, which would require them to curtail ahead of demand response resources during strained system conditions. While the amount of capacity purchased in the backstop would be determined by PJM, Carroll said buyers may be able to submit their own preferred purchase amounts.

The procurement is intended to be a one-time measure that awards 15-year capacity commitments, possibly starting in 2030.

Core questions Carroll said PJM’s package must answer include how the RTO should balance reliability and over-procurement risks; whether changes to credit and collateral rules would be required to account for the greater risks associated with 15-year commitments; and when resources would need to be capable of coming online to participate in the backstop — with 2029 or 2030 being possible requirements.

A model in which PJM is the counterparty could pose substantial risks if either the data center or the generator defaults, which could force the RTO to pick up the remainder of the multiyear commitment or to suddenly procure capacity for the data center. PJM presented examples of how securitization could be used to shift the risk to investors, in a model similar to the bonds issued in the wake of February 2021’s Winter Storm Uri.

PJM Chief Risk Officer Carl Coscia said such a high collateral requirement would likely make any project unviable; however, the threshold should be high enough to prevent backstop participants from walking away from their commitments. The RTO’s risk provisions were designed around a three-year advance capacity auction awarding one-year commitments and are not well positioned to account for the uncertainty with 15-year obligations, he said.

Gwen Kelly, PJM senior director of credit risk and collateral management, said if the current credit policy were applied to a 15-year, 1-GW unforced capacity commitment at $400/MW-day, there would be a $662 million pre-auction credit requirement, $224 million of which would be returnable. Coscia said this accounts for deficiency charges over the 15 years.

PJM CFO Lisa Drauschak repeated that staff still do not have a proposal, and the presentation only illustrates possible pathways and outcomes.

Several stakeholders have presented their own perspectives and proposals during a series of workshop meetings held over the past month. (See PJM Stakeholders Begin Discussions on Reliability Backstop Design.) The workshops will be on hiatus over the next month until PJM has a complete package to present.

Many of the same sticking points dominated the discussions: how to define the amount of capacity to be procured; whether the procurement should be one-time; which resources are eligible to offer; and whether PJM, utilities or the data centers should be the counterparties to backstop commitments.

Independent Market Monitor Proposal

The Independent Market Monitor proposed a backstop procurement awarding 15-year commitments to new resources seeking to serve data centers in the same locational deliverability area (LDA).

The design would be based on the Base Residual Auction (BRA), modeling capacity transfer capability and limits between LDAs and providing a single clearing price up to a maximum based on the net cost of new entry for the reference resource. Unlike the BRA, the backstop maximum price would be based on an assumed 15-year lifespan for the reference resource to match the commitment term. The contracts would cover the full cost of energy, ancillary services and capacity.

The Monitor’s backstop would not be a one-time measure and would be run after each BRA to procure capacity for data center load, which would be excluded from the standard capacity auctions.

Seller eligibility would be limited to new generation, with no allowance for uprates, DR or resources that canceled deactivations or did not clear in the capacity market. Consumers could submit varying bids into the auction, with the highest bids winning if insufficient supply is offered.

Data centers larger than 5 MW would be required to participate in the backstop or be subject to curtailment, similar to PJM’s Connect and Manage proposal. The RTO would work with electric distribution companies to identify the data center customers behind large load additions (LLAs) that the utilities submit for inclusion in the load forecast.

GQS New Energy Strategies Principal Pamela Quinlan, representing the Data Center Coalition, said it would be a difficult task to tie LLAs to specific customers, and allocating costs to a class of consumers based on how the electricity is used would be undue discrimination.

She argued the Monitor’s analysis assumes available capacity would remain the same in the absence of data center load growth, ignoring the likelihood of resources deactivating without that growth.

Quinlan said using a 15-year amortization period to set the maximum price, on the grounds that the commitment term should establish the resource’s useful life, is inconsistent with the Reliability Pricing Model, which uses a 20-year amortization period for a one-year commitment. As with RPM resources, backstop resources could continue to participate after their commitments expire.

Data Center Coalition

Quinlan presented a set of priorities the Data Center Coalition believes should be incorporated into PJM’s design, centered on the position that the RTO should not make substantial changes to the capacity market while designing a one-time procurement structure.

The coalition recommended a backstop design in which PJM would be the counterparty and participation would be limited to resources that could be in service for the 2028/29 delivery year, with some allowance for the following year. The RTO’s design should not seek to determine resource adequacy for specific load-serving entities or use “uncertain” long-term forecasts to determine the need for capacity.

Concurrent with the procurement, the RTO should initiate a comprehensive review of the capacity market design, including improvements to load forecasting and consideration of “LSE-based frameworks,” Quinlan said.

Responding to questions on how the risk of a data center default could be managed, Quinlan said risk allocation is an important question to consider, but one that should be part of a long-term discussion. The ideal way to manage the risk associated with multiyear commitments is to ensure that the backstop is a short-term measure that buys time for more substantial market changes, she said.

Quinlan said the coalition considered ways of allocating costs that did not fall to LSEs, but there are practical questions on implementation and whether that can be accomplished in time for a May filing.

Google

Google recommended PJM adopt a backstop in which it procures capacity on behalf of load and allocates the costs across the region, leaving it to states to develop end-user rates. While the company shared several design components it prefers, it stated it does not have a complete proposal.

The company expressed support for a one-time solution targeting a specific delivery year with well-defined needs, leaving long-term capacity commitments as a separate issue. The backstop should focus on a fuel-neutral framework for incentivizing high-accreditation resources, with the capacity to be purchased defined by the deficiency in a particular auction — rather than targeting individual customers or a class of end users.

Joint Stakeholders

A cohort of generation owners presented a backstop focused on meeting the shortfall expected for the 2028/29 BRA, scheduled to open in June 2026. The proposal was signed onto by Constellation Energy, Vistra, AlphaGen and Earthrise.

The one-time auction would be conducted in September and mirror the 2028/29 BRA clearing price for commitments up to 15 years. Resource offers would clear first based on the delivery year in which the project can come online, then by the length of the commitment term the offer requested. Procurement would be capped at the reliability requirement for the 2028/29 delivery year.

Seller eligibility includes new resources, uprates, DR, reactivated resources and existing resources that cleared above the maximum price in the 2028/29 auction.

Constellation’s Erik Heinle said the proposal is agnostic on how costs would be allocated, though it specifies that bilateral contracts would be respected. The risk of over-procurement and of large loads not coming online would be managed by restricting procurement to load already accounted for in the capacity auction, capping the amount purchased at the reliability requirement for the 2028/29 delivery year.

Voltus

Voltus advocated for PJM to allow DR to participate in the backstop, arguing behind-the-meter resources have some of the quickest development times — making them well suited to a process intended to rapidly bring on new capacity.

Senior Manager of Regulatory Affairs Kimaya Abreu said PJM should focus on procuring new capacity from resources not receiving a sufficient price signal from BRAs. That effort would be best served by a fuel-agnostic approach that allows DR to participate. Excluding DR would run afoul of FERC’s requirements in Orders 719 and 745 that behind-the-meter resources be treated comparably to generation.

Voltus argued including DR in the backstop is consistent with proposals stakeholders made throughout the 2025 Critical Issue Fast Path process focused on meeting large load growth, as well as the statement PJM’s Board of Managers released at the conclusion of the process. (See PJM Board of Managers Selects CIFP Proposal to Address Large Load Growth.)

The company also endorsed a proposal by the Natural Resources Defense Council to define new capacity, which would allow resources that have completed the third phase of the interconnection process, or are in the surplus interconnection service process, to qualify so long as they are not already subject to the capacity must-offer requirement. For DR, resources that did not offer into the capacity market between the 2025/26 and 2027/28 auctions would be permitted, as well as those seeking to increase the amount of capacity offered.

NRDC

The NRDC’s proposal included an auction design in which capacity would be procured for a pool of buyers that would share the costs and risks, while sellers would receive 10- to 15-year commitments. If participating consumers default or do not come into service, either the capacity payments would be reduced, or the remaining load would pay more. The auction would be a permanent addition to the capacity market, conducted during each queue cycle’s final agreement phase; for Transition Cycle 2, that would be December 2026 or January 2027.

Participating resources would be required to offer into BRAs during their commitment terms, with the revenues flowing to load with long-term commitments, which would also be responsible for capacity deficiency penalties. The auction would be open to large loads as well as LSEs seeking to offer long-term firm service to new customers.

The maximum procurement would be set at the amount bid into the auction, and any load that does not receive a commitment would be required to go through PJM’s proposed Connect and Manage system.

Eligibility would be limited to projects that have already cleared the interconnection queue but not yet entered service, as well as DR. The NRDC said the backstop should not be allowed to become another expedited interconnection queue like the Reliability Resource Initiative, which allowed 51 projects to be inserted into TC2. Several of those projects have dropped out of the queue after running into high network upgrade costs.

Questions Raised over Ratepayer Protection Pledge

DALLAS — Figures in the energy industry are casting doubt on the White House’s proposal to shield ratepayers from the costs of interconnecting large loads, saying it ignores the division of jurisdictional responsibility among regulatory authorities.

The Ratepayer Protection Pledge secured commitments from developers to pay for the full cost of power plants and any required delivery infrastructure upgrades, whether the data centers use the power or not. The pledge asks the data centers to strengthen the grid’s resilience by making their backup generation resources available during times of scarcity to prevent blackouts and power shortages in their communities.

Leaders of seven Big Tech companies signed the nonbinding pledge during a March 4 ceremony in Washington. (See Trump Gets Tech Execs to Sign ‘Ratepayer Protection Pledge’.)

Rob Gramlich, president of the D.C.-based consulting firm Grid Strategies, said during the Federal Reserve Bank of Dallas’ Powering AI conference March 4-5 that there was a “deal to be had between the richest corporations the world has ever known” and the power sector and its end-use customers.

He found gathering regulators in the same room with the tech companies and getting the companies to agree on a “political level” to paying their “fair share” was “quite impressive.”

“That’s an important step,” said Gramlich, who served as an economic adviser to Pat Wood during the latter’s FERC chairmanship. “I know from my eight years being in a regulatory agency that having policymakers agree to, ‘Here’s the deal. Here’s kind of what we’re trying to achieve,’ and then go work out the details … that’s an important step.

“Those first two components are important and check two boxes,” he said. The third phase, implementation, is “really complicated … with a whole new set of complications,” Gramlich said, pointing to jurisdictional issues between the federal government and the states.

“Every market structure is different. Every state has a different arrangement of who’s responsible for transmission, generation, the planning and the cost allocation in ‘FERC land’ outside of Texas,” he said. “The retail-wholesale split is extremely complicated. FERC can’t right now just go and say, ‘Oh, data centers, you pay for this thing.’ Those are retail customers. FERC can’t tell what one retail customer versus another retail customer can do without the state saying that’s the way.”

Gramlich said FERC could assert jurisdiction over the states and go through seven years of litigation. “But there’s not seven years to go through that process,” he warned.

Andrew Schaap, CEO of developer Aligned Data Centers, said he is a firm believer in the necessity of the U.S. winning the artificial intelligence race.

“A lot of our adversaries are not having [these discussions]. They’re just doing it. They’re just going to go as fast as possible,” he said, noting China is building 1.5 TW of solar power a year.

“The fair rate pledge is a way to give latitude to operators like ourselves and hyperscalers to do behind-the-meter generation. I build my own power plant; I build my own systems,” Schaap said. “Is that the most efficient way to do it? Probably not. The most efficient way is to do it with the grid.”

Schaap did allow that the pledge is a “good incentive to get there faster.”

“One of the things that we’re all struggling with is there’s just not enough capacity fast enough,” he said.

Nick Elliot, who recently left the Department of Energy’s Grid Deployment Office to join the White House’s National Energy Dominance Council as a senior policy adviser, was in the room where the pledge was signed before he took a red-eye flight through thunderstorms to Dallas.

The transmission piece of the pledge was the hardest component “to get right,” he said, holding a cup of coffee. A large load will have to cover 100% of the direct cost of a tie line, he explained, but connecting large generation to the facility will affect upgrade requirements across the entire system.

“They will benefit you, but they’re going to benefit everyone else,” Elliot said. The hyperscalers are “willing to engage” in innovative regulatory structures, paying “full freight” or over time.

“We’ll build the highway,” Elliot said, speaking for the large loads. “And to the extent you end up building a whole bunch of other hotels on the highway, allocated to those other people later, we will backstop, and we’ll take the risk.”

“I certainly understand why Nick says that transmission is harder, because it’s harder,” said Stu Bresler, PJM’s executive vice president of market services. “The benefits of transmission do flow to many customers once it’s built. I think the challenges with allocating the cost of new generation on the system are equally difficult to transmission when there’s so much uncertainty about how much load you’re actually building for and who the customers will actually be. We have to get through all these cost allocation issues. They’re extremely foreign.”

Texas Addresses Rising Costs

Google was one of the seven companies that signed the pledge. Doug Lewin, who recently left his consultancy to join the company as the Texas lead for energy market development, made it clear that Google wants to be connected to the grid.

“You have several advantages to that, both from the data center side and from the public side,” he said. “We just have to have a historical perspective here and remember that for the entire life of the grid, over 100 years back to the earliest days, system use matters.

“It’s a simple division problem, right? Whatever your fixed costs are, if you can spread them across as many users as possible, that lowers the unit cost,” Lewin added. “That’s the basic economics of the grid as it has existed since the 1910s, and that principle still holds. So, we think it’s not only good for us to be connected, and this would go for any data center, but also for all customers.”
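The “simple division problem” Lewin describes can be made concrete with a quick sketch; all figures below are hypothetical illustrations, not actual grid costs:

```python
# Hypothetical illustration of Lewin's point: the same fixed grid cost,
# spread over more delivered energy, yields a lower unit cost per MWh.
fixed_cost_dollars = 2_000_000_000  # hypothetical annual fixed cost of the grid

for annual_mwh in (100_000_000, 200_000_000, 400_000_000):
    unit_cost = fixed_cost_dollars / annual_mwh  # $/MWh borne by each unit of load
    print(f"{annual_mwh:>11,} MWh -> ${unit_cost:.2f}/MWh")
```

Doubling the energy over which the fixed cost is recovered halves the unit cost, which is the argument for connecting large new loads to the shared grid rather than islanding them.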

Google and Lancium, an energy technology and infrastructure firm, have filed joint comments on the Texas Public Utility Commission’s proposal to set interconnection standards for large loads (58481). They argued the proposal requires “large, upfront and nonrefundable financial commitments without providing clear study outcomes, defined interconnection timelines or a predictable path to energization.”

“This sequencing shifts significant risk onto customers before system feasibility and deliverability are known,” the companies said, referencing a flat $100,000/MW nonrefundable interconnection fee they said may result in overcollection beyond true costs.


Google’s Doug Lewin and Emerald AI’s Arushi Sharma Frank share a laugh during their panel discussion. | © RTO Insider 

Instead, they have suggested a five-year, 50% minimum demand charge to fund infrastructure builds and share costs. Lewin said that is a “very tangible way” large loads can shift costs under ERCOT’s Four Coincident Peak (4CP) program. Under the program, industrial customers’ transmission charges are based on the amount of electricity they consumed during the four intervals in the previous year when demand on the grid was at its highest.
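The 4CP mechanics described above can be sketched roughly as follows; this is a simplified illustration with hypothetical numbers and function names, not ERCOT’s actual rate calculation:

```python
# Simplified illustration of 4CP-style transmission cost allocation.
# All numbers are hypothetical examples, not actual ERCOT rates or loads.

def four_cp_share(customer_mw_at_peaks, system_mw_at_peaks):
    """Average the customer's share of system demand across the four
    coincident peak intervals."""
    shares = [c / s for c, s in zip(customer_mw_at_peaks, system_mw_at_peaks)]
    return sum(shares) / len(shares)

def transmission_charge(customer_mw_at_peaks, system_mw_at_peaks,
                        total_transmission_cost):
    """Allocate the total transmission cost by the customer's 4CP ratio."""
    return four_cp_share(customer_mw_at_peaks, system_mw_at_peaks) * total_transmission_cost

# A load that curtails to near zero during the four peak intervals
# largely avoids the charge -- the cost shifting Lewin describes.
flexible = transmission_charge([5, 5, 5, 5], [80_000] * 4, 1_000_000_000)
inflexible = transmission_charge([500, 500, 500, 500], [80_000] * 4, 1_000_000_000)
print(round(flexible), round(inflexible))  # 62500 6250000
```

The gap between the two results shows why a minimum demand charge appeals to grid planners: it guarantees a contribution to fixed costs even from loads that can dodge the peak intervals entirely.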

“Large loads can get away from paying a transmission charge,” Lewin said. “We have come forward with other partners and said, ‘We want a minimum transmission charge.’”

Emerald AI’s Arushi Sharma Frank, Lewin’s partner on the panel, applauded the Google-Lancium proposal.

“As the load comes in, it pays for the transmission upgrades, and if the load comes before that, great,” she said. “But if upgrades come first, then the loads need to still be there to foot the bill because they are going to eventually use it.”

ERCOT General Counsel Chad Seely said these conversations reflect broader policy issues under debate in Texas.

“We’re trying to figure out what the best process is to study [large loads] reliably and make sure that we’re making the best decisions as far as building out the transmission infrastructure and making sure that they have enough skin in the game,” he said. “This is really a pivotal year for ERCOT and our stakeholders to kind of put forward these policy frameworks that will have long-lasting implications as we move forward to manage this tremendous amount [of load].”

Grid Strategies Calls NERC LTRA Too Pessimistic

NERC’s 2025 Long-Term Reliability Assessment paints too gloomy a picture of the electric grid’s risks over the next 10 years, according to a report from Grid Strategies that purports to present “a more comprehensive picture of adequacy risks … given different … underlying assumptions.”

The LTRA, released Jan. 29, warned that the resource adequacy outlook of the North American grid was “worsening,” with 13 of 23 assessment areas potentially facing high or elevated risk between 2026 and 2030.

High risk means the resources planned as of July 2025 would lead to energy shortfalls exceeding RA targets or baseline criteria for unserved energy or loss of load, while elevated-risk areas meet RA targets but are likely to experience shortfalls in extreme weather conditions. (See NERC Warns of ‘Worsening’ Resource Adequacy Through 2035.)

MISO, PJM, Texas RE-ERCOT, WECC-Basin and WECC-Northwest fell into the former category, and the latter included MRO-Manitoba, MRO-SaskPower, MRO-SPP, NPCC-Maritimes, NPCC-New England, NPCC-New York, NPCC-Quebec and SERC-East.

NERC’s risk projections for most areas were driven by concerns about “demand growth … outpacing planned resource additions.” The ERO observed that new large loads such as data centers and industrial centers were projected to come online in nearly every assessment area, with the electrification of transportation and the spread of heat pumps further raising demand.

At the same time, many of the resources planned to replace existing thermal generation facilities are weather-dependent assets like wind and solar plants, or natural gas generation that will require investment in gas infrastructure to ensure the availability of fuel.

The LTRA did not arrive without skepticism: Members of the Organization of MISO States objected to the ERO labeling MISO as a high-risk area. State regulators said NERC should have counted resources in MISO’s fast-track interconnection queue among planned resources, which would have neutralized the 7-GW shortfall NERC expects to materialize by 2028. (See MISO States Dispute ‘High Risk’ Designation from NERC.)

Grid Strategies’ report similarly described the LTRA as “too pessimistic” because of overly strict assumptions about generation resource additions. The authors also claimed NERC underestimated regions’ ability to import power and may have exaggerated data centers’ potential demand, further tightening projected reserve margins. However, they acknowledged the LTRA “highlights important reliability concerns” grid planners will need to address.

Some Tier 2 Exclusions Unwarranted

The authors’ first criticism of the LTRA concerned NERC’s practice of counting only generators under construction or with a signed interconnection agreement as “Tier 1” resources that qualify as planned capacity additions. Tier 2 resources are those in earlier stages of development, such as an interconnection planning study, which NERC considers to “have more uncertainty in being realized.”

This understanding of Tier 1 resources is too limited, Grid Strategies argued, drawing on data from Lawrence Berkeley National Laboratory to show that more projects exist with signed IAs than NERC acknowledged. The authors suggested the “mismatch in data could be due to different processing methods, collection periods and … resource accreditation methods.”

Using the LBNL data, the authors claimed a potential shortfall NERC missed in the SERC-Central subregion, and two near-misses in WECC-Basin and MRO-SPP, could be resolved, though the projected shortfalls in MISO and PJM still would remain and another near-miss could develop in New York.

The authors also questioned the decision to exclude all Tier 2 resources from the LTRA. They acknowledged these additions might not complete the interconnection process but suggested treating them as nonexistent for planning purposes was too extreme. Applying what they called “historic region-specific and phase-specific withdrawal rates to the LBNL data,” they suggested that already-planned Tier 1 and Tier 2 resources could be enough to avert shortfalls for all regions except PJM.

The biggest concern about resource additions is “delays in the study, permitting and/or construction phases” that push resources already in the queues, including planned wind and solar, past their projected start dates. Adjusting the previous projection to account for likely withdrawals introduced shortfalls in WECC-Basin, SERC-Central, MISO and PJM, the authors wrote. They recommended permitting reforms and more efficient interconnection queues for all resources to reduce this risk.

Imports and Demand

Energy imports also have a role to play in addressing energy shortfalls, but the LTRA did not fully include this option either, Grid Strategies observed. NERC counted only firm interregional capacity transfers, but considering “non-firm imports over interregional transmission lines that have been available during historical grid stress events” could alleviate additional strain, the authors wrote.

Going further, the authors suggested that implementing interregional transmission additions proposed in NERC’s Interregional Transfer Capability Study could provide “even more security to avoid shortfalls.” While they acknowledged this would be a harder lift, they also claimed that “legislative and regulatory action taken now could accelerate the construction of these prudent additions.”

Finally, Grid Strategies asserted that utility projections of load growth on which NERC relied to create the LTRA “are based on overstated assumptions.” Data centers, which were a significant part of many utilities’ projections, were a key example in Grid Strategies’ report; citing data from TD Cowen, the authors suggested that limits in the supply of key computer chips could restrict data center growth in the U.S. by as much as one-third.

In addition, the authors observed that data centers could manage their demand to reduce their need for on-peak grid power, such as by using local generation and storage resources, or by curtailing demand when energy prices are high. Such activities “are unlikely to appear in utility forecasts of expected load,” the authors claimed.

To reduce uncertainty, Grid Strategies suggested updating data center load forecasts to take the chip industry and other outside data sources into consideration. Reducing data center load might eliminate all shortfalls in the LTRA, even without factoring in the generation adjustments, the authors wrote.

Searchlight Report Calls for Infrastructure Fund for Data Center Development

The Searchlight Institute released a report arguing that the data center buildout should be leveraged to pay for the expansion of the grid.

The think tank, established in 2025 by a group of Democrats seeking policies with broad public support, is examining the growth of data centers as their impact becomes a top political issue. (See EPSA Summit Held with ISO/RTOs in the Middle of the Political Debate.)

“Seizing the Data Center Buildout for Grid Modernization” is written by Searchlight Senior Fellow Jane Flegal and was released March 9. It notes that the grid is aging and the clean, firm capacity needed for a reliable system is nowhere near built.

“Meeting national goals, from powering economic growth to enhancing our industrial competitiveness to advancing our national security, requires building a dramatically larger, more capable electricity system,” the report said. “Fixing this problem was always going to be expensive and politically difficult.”

The U.S. is competing with China on artificial intelligence, and a key constraint is the grid’s ability to serve the data centers needed to train and deploy AI.

“Data center demand could be an opportunity to fix the underlying problem,” the report said. “Data center operators want fast access to reliable power, certainty and fair treatment from policymakers. Policymakers and grid advocates can benefit from what those data centers can provide: capital, load growth that justifies long-needed grid investments and tax revenue.”

The report warns that policymakers have only a narrow window in which to steer the data center buildout toward grid modernization, and that the window will close soon.

“The response to data center demand thus far has been ad hoc and inadequate,” the report said. “The structural failures underlying this dynamic, from interregional planning deadlock to permitting barriers to fights over cost allocation, require major policy change.”

The report suggests setting up an “American Grid Infrastructure Fund” to ensure spending associated with data center growth also enables grid modernization and maximizes benefits such as increased local tax revenues and construction employment.

“Participation agreements that require true cost causation commitments would deliver ratepayer cost savings that no voluntary commitment currently produces,” the report said. “An insurance pool backstopping stranded cost risk would unlock proactive transmission investment that can’t get built today without exposing ratepayers to downside. Procurement aggregation would convert hyperscaler equipment purchasing into a domestic manufacturing demand signal that no company negotiating alone can generate.”

Such a fund could be set up to be voluntary at first, but the report calls for a new federal law eventually.

“A fund can convert data center capital and political weight into an organized force for grid reform,” the report said. “Even if a fund failed to solve the political economy problem, it would generate more public benefit than the current, ad hoc approach.”

The fund would offer data centers cheaper financing, a standard participation agreement to accelerate interconnection, procurement aggregation to address grid bottlenecks and access to clean firm power at scale.

“The fund would not solve all of the grid’s problems on its own, but it could serve as part of a framework in which regulatory reform at the federal level, financing through the fund, and incentives for state action reinforce each other,” the report said. “The fund’s participation agreements, governance architecture and deployment strategy would aim to maximize the public benefit of data center demand growth while helping developers secure the certainty and speed they require.”

‘With the Skill to Survive,’ SPP Faces ‘Massive Challenges’

DALLAS — SPP CEO Lanny Nickell took to the stage to Survivor’s “Eye of the Tiger” as he opened the grid operator’s Energy Synergy Summit.

“‘Eye of the Tiger’? That’s what you chose?” he asked the event’s organizers as the music faded into the background.

The 1982 rock anthem highlights perseverance, determination and regaining one’s competitive edge, traits that will come in handy for the “massive challenges that are ahead of us.”

“Massive change, massive challenges, massive opportunities,” Nickell said in kicking off the March 2-3 event.

He harkened back to last year’s summit, SPP’s first, when the conversation centered on resource adequacy, increasing extreme weather events and other challenges. The days of excess capacity and unlikely load sheds were numbered.

“Now, we are scrambling, doing everything we can just to maintain a one-day-in-10-year probability of having an event,” Nickell said. “We were having tremendous load growth. Even that’s changed over a year.”

He said SPP was projecting 50% load growth at last year’s summit; that projection has since doubled to 100% over the next 10 years. The RTO has responded, Nickell said, listing the Expedited Resource Adequacy Study process and the “industry leading” High-impact Large Load (HILL) study process, both approved by FERC in the past year.

“If you are willing to bring generation with you, either co-located or no more than two buses away, you can get that generator interconnection studied along with the high-impact large load in 90 days or less,” he said. “That’s fantastic speed.”

SPP has also proposed a conditional HILL process for interruptible loads that want to interconnect quickly and a Consolidated Planning Process (CPP) that gives generators more certainty of their interconnection costs and yields affordable solutions through the traditional planning process. It expects commission approval of both in the next few weeks.

“That’s a win, but we’ve got a lot of other things that we want to work on,” Nickell said. “What’s next for us? What’s our next project?”

Whatever the next projects are, Nickell said they will require the same creative, outside-the-box thinking that produced the HILL study in 85 days, from start to finish and through the stakeholder process. They will also require collaboration with and support from members, regulators and market participants.

“We need you to work with us to figure out what it is that’s most important, what it is that we need to solve right now. If you can bring your ideas to the table, I’m convinced that we will come up with the best solutions,” he said. “We have to work together to economically and reliably keep the lights on. That includes solving problems together. SPP [and] staff can’t solve these problems alone. We need your help. That’s why you’re here.”

LaCerte: 765-kV Backbone Necessary

Two days before his nomination for a full five-year term advanced in the U.S. Senate, FERC Commissioner David LaCerte said in a fireside chat with Nickell that the industry’s long-term planning still needs to improve. (See related story, FERC’s LaCerte Clears Committee Vote on Nomination for a Full Term.)

FERC Commissioner David LaCerte | © RTO Insider 

“What that long-term planning looks like now is very much different from 2024 long-term planning,” he said. “It’s difficult. It’s tough because you want to project, but those projections have such a large standard deviation that it’s almost impossible to get it right.”

Picking up on SPP’s approval of four 765-kV transmission projects in its 2025 transmission plan, LaCerte said any future transmission plans should include extra-high-voltage facilities.

“We can’t live without 765s or you’re going to be an invertebrate, right? You don’t want to live your life as an invertebrate. You want to have a backbone,” LaCerte said. “It’s really important that we do these things properly because they have the potential to drive up costs on the consumers even more than they” already are.

He said a “big plus in [his] book” was having the White House come to the table with a bipartisan group of governors and PJM to propose a reliability backstop procurement for the RTO’s capacity auction and begin identifying universal parameters to protect customers from rate increases related to large loads and data centers. President Donald Trump also gathered the leaders of seven large tech firms March 4 to sign a “ratepayer protection pledge.” (See related story, Trump Gets Tech Execs to Sign ‘Ratepayer Protection Pledge’.)

“I think that was a great first step because it brought all those people to the table … together to talk about the problems and identify what [is] acceptable, what’s not acceptable and then just identifying the costs,” LaCerte said. “Even at FERC in our building, we even struggle with identifying which costs we are catching in these tariffs and which costs are we not. … If it’s a struggle for the career FERC staff, it’s a struggle for everyone because these are issues which are novel. We are moving so quickly that it’s imperative that we catch as many of those costs as possible so that there’s not a bunch of hidden costs that are passed along to consumers.”

Shielding Consumers from Costs

Members of a panel discussing pricing reform in these high-growth times agreed those costs need to be transparent.

“The public is now, especially in the post-inflation environment, very conscientious of cost, and I think SPP is rightly [placing] affordability as sort of a central tenet,” said Chris Matos, Google’s energy market development strategic negotiator. “The question is more on the commitment side, and with these load forecasts and the infrastructure expectations, if you plan correctly, costs can go down.”

Chris Matos, Google | © RTO Insider 

He said ERCOT’s 765-kV plan, if the expected load materializes, will reduce system transmission costs because essentially, “We’re leveraging a greater scale of megawatt-miles of transmission.”

A bill introduced in the Ohio legislature would require large data centers, before any construction, to enter contracts with utilities detailing their minimum billing demand, long-term service agreements, exit fees or liquidated damages for canceled projects, and any collateral or guarantees. It would also bar utilities from recovering costs incurred to serve data centers by shifting them onto other customers.

“Google’s answer to this has been in the form of the capacity commitment framework that we’ve instituted in Ohio,” Matos said, “where we’ve agreed to minimum terms and minimum charges that ensure there is equity for the existing system and [customers] are not left constrained in the cost of infrastructure.”

Mark Ahlstrom, vice president of renewable energy policy for NextEra Energy Resources, said the company’s approach is to partner with the developers on multi-gigawatt sites that have the land, infrastructure and accessibility to power.

“We think it has to be a close partnership between large infrastructure investors like NextEra and the hyperscalers to put together something like that and make sure it all works within the community under the right tariffs, working hand in hand with the utilities and co-ops and so forth,” he said. “You have to develop certainty that that project is not going to just go away; that we have the commitments, we have the contracts, and we would find a purpose for that.”

BTM Gen ‘Suboptimal’

Longtime regulator Andrew French, chair of the Kansas Corporation Commission, shared a topic that he said has been top of mind in recent weeks: the growing concern about underinvestment in the transmission system.

“And yes, it can have a bill impact,” he said. “If we don’t move fast enough to make the grid ready or have processes to allow load to get on, folks will talk about doing things like behind-the-meter generation or just totally going off-grid. In my mind, that is a very suboptimal use of capital. It’s something that the customers are going to pursue just because they’re looking for the speed.”

KCC Chair Andrew French (left) shares his concerns as ITC Great Plains President Patrick Woods listens. | © RTO Insider

French said he has heard recent discussion of a “ghost grid” being developed with BTM generation and microgrids.

“That really concerns me that you’re going to have this sort of shadow set … of resources that’s probably not sitting in optimal locations, but it was just pursued for expedience,” he added. “It’s not what we want. It’s another reason why I think we need to move quickly. We need to provide pathways. I think there probably are a lot of these loads that would make sense to integrate into the wider grid. Let them find resources that can contribute to the wider grid.”

PPL CEO Vince Sorgi echoed French as he offered his thoughts and said he doesn’t mind the BTM approach “for a period of time.”

“If a grid is not ready and a hyperscaler can contract with a generator to build generation and serve that data center until that grid is ready, have at it,” he said. “But when the grid is ready, you should connect all generation to the grid for a number of reasons, right? One, the hyperscalers don’t want behind-the-meter generation. Two, just having that generation connected to the grid makes the grid more reliable and more resilient. It will ultimately benefit all customers.

“If we just built a bunch of behind-the-meter generation, it would be the most suboptimized solution to this problem that we could have come up with,” Sorgi added.

NYISO Yields to Stakeholder Requests on Transmission Planning Changes

RENSSELAER, N.Y. — After receiving pan-sector feedback from stakeholders asking for more time to review NYISO’s proposed changes to the reliability planning process, the ISO told the Transmission Planning Advisory Subcommittee it would delay proposing tariff language.

“The last number of meetings have been productive. They have been long. They’ve been painful at times,” Zach Smith, NYISO vice president of system and resource planning, said March 3. “I believe given the volume of feedback we’ve gotten from stakeholders, it’s appropriate that we spend more time with [the Electric System Planning Working Group] to talk through some of the proposals.”

NYISO’s plan had been to introduce tariff language in March. At previous meetings, stakeholders balked at the breadth of changes and their potential system impacts. (See Stakeholders Ask for Boundaries on NYISO’s Reformed Reliability Process.) ISO staff said they will spend additional time in March and April hashing out the changes with stakeholders.

“I want to make sure that we get this right and try to come up with the best process possible,” Smith said. Even with the extra month of deliberation, the ISO should be able to get the new process in place before the next Reliability Needs Assessment, he said.

Environmental, transmission and generation interests were appreciative of the extra time. Several sectors stressed that they were concerned with how NYISO would develop planning scenarios and digest stakeholder feedback.

Tony Abate, representing the New York Power Authority, told stakeholders that transmission planners have been “super involved” with working with NYISO to figure out how best to help with the changes.

“Transmission owners have only been so-so vocal compared with other sectors,” Abate said. “I want other stakeholders to know that’s because we’ve been having intense conversations amongst transmission planners at each of the TOs to consider what we think is best, and how TOs can enhance participation.”

Anie Philip, senior director of planning for PSEG Long Island, commented that NYISO had to develop a process that included the Long Island Power Authority at every step. PSEG Long Island manages LIPA’s distribution system on its behalf.

“LIPA has a direct statutory obligation as a legislatively charted public utility to plan for and maintain reliability within its transmission district,” Philip said. “LIPA must have a full and meaningful role in all aspects of the Reliability Needs Assessment as it applies to LIPA’s district.”

The conversation then shifted to the actual proposed changes, which include the development of multiple forecast scenarios for reliability planning rather than a single base case. If a reliability issue is found in multiple scenarios, the ISO would start the solicitation process for a solution.

Howard Fromer of Bayonne Energy Center asked whether NYISO had considered what it would do if New York state did not agree with a reliability finding under the new process.

“I think it would be counterproductive to have us have a process where you go through with this and the state disagrees with the foundational assumptions that drove you to reach your conclusion,” Fromer said. He recommended that the ISO develop a formal process to work with state agencies to get buy-in upfront.

Adam Evans, chief of wholesale and clean energy markets at the New York Department of Public Service, said he shared Fromer’s concerns and that he would be happy to work with NYISO to come up with a process that “ensures there’s more potential alignment.”

MISO, SPP Draft New Joint Portfolio that Could Run $3.6B

MISO and SPP put forth two potential 500-kV joint transmission portfolios valued at $1.3 billion and $3.6 billion to beef up their transfer capability.

The grid operators dubbed the two transmission options the “Core Combination” and “Core + Combination.” The more expensive, “plus” version features two additional 500-kV segments to connect neighboring transmission facilities.

MISO and SPP debuted a first look at the potential projects, which span northwest Louisiana, western Arkansas and east-central Oklahoma, during an interregional planning meeting March 6.

The Core + Combination’s five 500-kV segments would:

    • increase import capability by an average of 3,427 MW in MISO and 1,102 MW in SPP;
    • resolve 94 thermal reliability violations in MISO, 75 in SPP and 32 across the tie lines; and
    • offer nearly $300,000 in annual economic congestion relief in MISO, $1.5 billion in SPP and $336,000 across tie lines in a 2034 case.

The Core Combination’s three 500-kV segments, on the other hand, would:

    • increase import capability by an average of 2,578 MW in MISO and 1,529 MW in SPP;
    • resolve 53 thermal reliability issues in MISO, 89 in SPP and six across the tie lines; and
    • extend nearly $83,000 in economic congestion relief in MISO, $895,000 in SPP and nearly $304,000 across tie lines in 2034 alone.
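The reported figures for the two portfolios can be lined up side by side. The following sketch simply organizes the numbers from the meeting presentation for comparison; it is illustrative only, since the RTOs have not yet run a benefit-cost analysis, and the structure and totals are this article's arithmetic, not the RTOs'.

```python
# Figures reported at the March 6 MISO/SPP interregional planning meeting.
# This is an illustrative tabulation, not an RTO analysis.
portfolios = {
    "Core Combination ($1.3B)": {
        "segments": 3,
        "import_gain_mw": {"MISO": 2578, "SPP": 1529},
        # 53 in MISO + 89 in SPP + 6 across tie lines
        "thermal_violations_resolved": 53 + 89 + 6,
    },
    "Core + Combination ($3.6B)": {
        "segments": 5,
        "import_gain_mw": {"MISO": 3427, "SPP": 1102},
        # 94 in MISO + 75 in SPP + 32 across tie lines
        "thermal_violations_resolved": 94 + 75 + 32,
    },
}

for name, p in portfolios.items():
    total_import = sum(p["import_gain_mw"].values())
    print(f"{name}: {p['segments']} segments, "
          f"{total_import} MW combined avg import gain, "
          f"{p['thermal_violations_resolved']} thermal violations resolved")
```

One detail the tabulation surfaces: the cheaper Core Combination actually posts the larger average import gain on the SPP side, while the plus version's advantage is concentrated in MISO imports and in the number of reliability violations resolved.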

Ashleigh Moore, of MISO’s interregional planning division, said the larger upfront investment from the Core + Combination would establish a “broader” transmission solution set that would address more reliability and economic issues immediately, while the Core Combination would create “foundational upgrades” with the flexibility to add on more projects later.

Moore said the RTOs would use stakeholder feedback to decide which configuration to pursue and how to refine it.

MISO planner Jon George said the portfolio suggestions home in on the “hottest spots in the footprint for load expansion.”

The RTOs have not conducted a benefit-cost analysis on either option.

Benefits Pending

Missouri Public Service Commission economist Adam McKinnie asked if the RTOs have settled on what benefit metrics they would use to justify investment in the lines.

George said those are “not completely” worked out.

Southern Renewable Energy Association Transmission Director Andy Kowalczyk asked if the RTOs would use the seven transmission benefits established in FERC Order 1920 to gauge project usefulness.

“We have some more thinking to do on that,” George said. He added that though “both regions are headed” toward adopting Order 1920 benefit metrics, MISO and SPP are for now focused on “what are the different merits of the indicators we have from the screening” and ascertaining load growth estimates.

George said the RTOs don’t want to “get hung up forever on new business case methodology if we already have a pathway.” He said they can, according to current rules, already consider adjusted production costs and reliability and public policy benefits. He said the draft projects “promise to hit on a few of those and do so impressively.”

Advanced Power Alliance’s Steve Gaw said a broader set of benefits “that support interregional transmission that we know is needed” are critically important. He said MISO and SPP’s inability to devise big-ticket, regionally cost-shared transmission projects illustrates the importance.

“I’d hate to call them failures,” Gaw said of past Coordinated System Plan studies. “I hope we can weave [benefits] in so we’re not continually deciding which to include.”

MISO and SPP’s coordinated study process has never produced a workable interregional project.

The Alliance for Affordable Energy’s Yvonne Cappel-Vickery asked MISO and SPP to consider the more expensive plus portfolio route to achieve the biggest benefits to ratepayers. But she also asked the RTOs to present business cases for both options so stakeholders don’t “miss out on hearing the full breadth” of transmission benefits.

Bill Booth, a consultant to the Mississippi PSC, said the RTOs should develop a minimum benefit threshold soon and demonstrate that projects will actually deliver savings if retail ratepayers are to pay for them over the next 40 years.

SPP engineer Spencer Magby said the RTOs combed through 46 stakeholder-originated ideas and an additional 24 alternative solutions after they opened a second proposal window. (See 30+ Projects Under Consideration in MISO-SPP Joint Tx Effort.) Magby said they focused on three key corridors and conducted three rounds of studies to come up with draft recommendations.

WEC Energy Group’s Chris Plante said he’d like to see MISO and SPP’s recent level of planning and coordination applied to the MISO-PJM seam.

MISO and SPP initiated the joint study in 2024. While they evaluate the two project options, the RTOs are launching another CSP study to take place over the remainder of 2026 and beyond. Their joint operating agreement requires them to perform a CSP study at least every two years.

Some stakeholders suggested MISO and SPP use the upcoming joint study to go a step further than the 500-kV connections and consider linking up their planned 765-kV backbone systems.

2025/26 Most Expensive Winter in History of ISO-NE Markets

The winter of 2025/26 was the most expensive winter in the history of ISO-NE’s wholesale markets, driven by the lowest average temperatures in 20 years.

Energy market values totaled about $6 billion in December, January and February, more than twice the total value of the past two winters combined, according to ISO-NE data. Energy costs hit monthly records in both December and January, and the RTO experienced its second-highest energy market costs for February.

Total winter energy use reached its highest level since 2014, while the winter peak load hit its highest level since 2018.

The RTO’s announcement of record winter prices comes amid significant uncertainty about potential future price spikes triggered by the war on Iran.

Asked at the NEPOOL Participants Committee on March 5 about potential impacts of the war, ISO-NE CEO Vamsi Chadalavada said, “The markets are not expecting there to be a big disruption to the New England markets over the next 18 months, but that could change as events unfold.”

The war has spurred global concerns about oil and natural gas prices, as about a fifth of all LNG is shipped through the Strait of Hormuz, largely to meet demand in Asia.

In New England, wholesale electricity prices are highly correlated with gas prices, and much of the Massachusetts gas system relies on LNG imports.

In 2022, Russia’s invasion of Ukraine was a “large factor” in a major spike in New England gas prices, according to the ISO-NE Internal Market Monitor. Annual average natural gas prices in the region more than doubled in 2022 relative to the prior year, the IMM reported.

Regarding cybersecurity, Chadalavada said ISO-NE has seen a “sharp uptick in attempts to penetrate infrastructure” since the war began. He said the RTO has not been able to pinpoint from where these cyber threats are originating.

He said ISO-NE is working to be “as vigilant as we can” and hopes that “all the preparation that we’ve done is sufficient.”

Also at the meeting, Stephen George, vice president of system and market operations at ISO-NE, provided additional details on the extended cold snap the region faced in late January and early February.

Between Jan. 23 and Feb. 10, temperatures in New England averaged 11.3 degrees Fahrenheit below normal, he said. Over this period, gas generation accounted for 34% of energy, followed by oil at 22%, nuclear at 19%, net imports at 13%, renewables at 8% and hydro at 4%.
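As a quick arithmetic check, the reported fuel shares for the cold stretch account for the full supply:

```python
# Reported ISO-NE energy mix for the Jan. 23-Feb. 10 cold stretch
# (shares in percent, as stated at the NEPOOL Participants Committee).
mix = {
    "gas": 34, "oil": 22, "nuclear": 19,
    "net imports": 13, "renewables": 8, "hydro": 4,
}
print(f"Reported shares sum to {sum(mix.values())}%")  # → 100%
```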

To allow generators to operate at their maximum capabilities, ISO-NE obtained a waiver from the Department of Energy enabling specified units to exceed emissions limits. Twenty-six units reported exceeding limits while the waiver was in place, he said.

The elevated reliance on oil-fired generation was driven by record gas prices during this period, causing dual-fuel units to switch to burning oil.

He noted that the region’s generators burned about 111 million gallons of fuel oil during the cold stretch, more than the total consumption for any entire winter since ISO-NE started tracking in the winter of 2015/16. This significantly depleted fuel oil inventories, which dropped to about 20% of total regional storage capacity, the lowest level recorded by the RTO.

Inventory levels have risen following the event and are on track to rebound to pre-winter levels by mid-March, he said.

Wind power accounted for 54% of renewable generation, while solar accounted for just 5%. Solar output was significantly inhibited by snowfall and sustained cold weather during this period.

ISO-NE has noted that behind-the-meter solar in the region produced just 41% of its forecast potential during this period because of the impact of snow cover.

Kentucky Lawmakers: PSC Makeover Necessary to Bring Down Rates

Kentucky lawmakers are working to overhaul the state’s Public Service Commission in what they say is an effort to combat rising rates, while Gov. Andy Beshear (D) has characterized it as political maneuvering.

The Kentucky Senate on March 6 voted 30-5 to pass Senate Bill 8, which would add two members to the three-member PSC and impose professional qualifications on appointees.

Republican lawmakers say the bill would help address high utility bills by pulling in more experienced candidates.

The bill would also change the appointment process: Two of the commission’s members would be appointed by the state’s auditor of public accounts, currently Allison Ball (R). As it stands, Beshear holds authority to appoint all three commissioners, subject to Senate confirmation.

Sen. Brandon Smith (R), a sponsor of the bill, told local news outlets that he believes some past commissioners were appointed as political favors, with “very few” having any experience in the energy sector.

“I think we could all agree that a lot of people got parked over there,” Smith said.

But Beshear said the potential reshaping of the PSC is a partisan attempt at a power grab.

“They never did that while there was a Republican governor. … They’ve done these shenanigans for six straight years,” Beshear said during a press conference March 5. “I’ve never seen them try to move something from a Republican officeholder to a Democratic officeholder, but I’ve seen them try to move a whole lot in the other direction.”

Beshear added that the state auditor’s office has no history of working with the PSC.

At the end of February, the PSC granted a rate increase for American Electric Power’s Kentucky Power, raising electricity rates 5.87% in 2026 and 6.63% in 2027, over Attorney General Russell Coleman’s (R) objection. While the hike was less than the 14.6% increase Kentucky Power requested, residents called it excessive, saying they already struggle to pay utility bills.

Prior to the vote, the bill shed some unpopular provisions that would have effectively barred consumer and environmental advocates from arguing against rate increases or against maintaining the thermal generation status quo.

A draft version of the bill stipulated that the office of the attorney general would be “the sole advocate for residential consumers” in cases in which it intervenes. It also would have disallowed individuals from intervening in a case “unless the person can demonstrate, by clear and convincing evidence, that the person has a special and unique interest in the specific rates or service of the utility that are at issue in the case.”

Smith said the intent was to keep out-of-state groups funded by special interests from delaying projects.

Now the bill specifies that individuals who intervene must disclose their interest in the case and attest they are not doing so for the sole purpose of delaying projects. The PSC would be able to restrict or remove parties that cause disruption or delays to proceedings.

The Sierra Club called the draft of the bill “dangerous” and said it would have “kneecapped” organizations’ efforts to “defend local people from increasingly high energy bills and corporate interests.” The nonprofit said the attorney general intervenes in nearly every case.

“Intervention by Sierra Club’s legal team has successfully mitigated bill increases for millions of Kentucky ratepayers and recently secured a tariff that guarantees data centers pay their fair share of costs and have the opportunity to secure clean energy that may help draw businesses to the state,” Sierra Club said in a statement.

In neighboring Indiana, state regulators have opened an inquiry into climbing energy bills and summoned the state’s top five utilities to provide answers. (See Indiana Commission Opens Affordability Inquiry into Utilities.)

AES Indiana said it canceled community open houses that would have helped explain high utility bills because of violent threats it received on social media. The open houses would have occurred March 3, 10 and 11 around Indianapolis. AES announced in early March that it would be acquired by an investor group including BlackRock, Swedish private equity firm EQT AB, California Public Employees’ Retirement System ⁠and ​the Qatar Investment Authority. (See BlackRock and Others to Take AES Corp. Private for $33B.)