CISA: Iranian Hackers Targeting U.S. Energy Sector

The U.S. and Israeli attack on Iran is drawing cyber retaliation against U.S. critical infrastructure in the energy sector, according to an advisory issued by the Cybersecurity and Infrastructure Security Agency and other federal agencies.

CISA joined the FBI, Department of Energy, Environmental Protection Agency, National Security Agency and U.S. Cyber Command to warn that “Iran-affiliated” hackers have targeted programmable logic controllers (PLCs) used by organizations in multiple critical infrastructure sectors, including government services and facilities, water and energy.

The agencies identified similarities with a previous campaign by a pro-Iran hacking group that cybersecurity firms have given various names, such as CyberAv3ngers, Shahid Kaveh and Bauxite, and said a recently observed escalation of Iran-linked campaigns against the U.S. was “likely in response” to the conflict begun by the U.S. and Israel on Feb. 28. (See Dragos: Attacks on ICS Increased in 2024.)

PLCs are computer systems that constantly monitor the state of input devices and control the state of output devices. Controllers manufactured by Rockwell Automation under the Allen Bradley brand are known to have been targeted by the attackers, specifically the CompactLogix and Micro850 device lines. Other brands and manufacturers may have been targeted as well, according to the agencies, based on malicious traffic directed at network connection points used by companies other than Rockwell.

Intruders were observed accessing the PLCs through “overseas-based IP addresses [using] leased, third-party hosted infrastructure” with Rockwell’s configuration software, which allowed them to create accepted connections to the targeted equipment. The advisory includes a list of IP addresses used by the threat actors and when they were observed.

According to the FBI, attackers used their access to extract the devices’ project files and manipulate data on human machine interface and supervisory control and data acquisition displays, causing “operational disruption and financial loss.”

The agencies provided a list of recommended mitigations to reduce the impact of intrusion attempts, corresponding with CISA’s recently updated cybersecurity performance goals. (See CISA Updates Critical Infrastructure Cyber Goals.)

Advice for defending organizations focused partly on responses to suspected attacks, including:

    • Disconnecting the affected PLC from the public-facing internet.
    • Switching the controller to “run” mode rather than “program” or “remote” to prevent modification, if possible.
    • Enabling programming protection to limit remote modification permissions, if available.
    • Backing up PLC logic and configurations offline in a secure location.

The authors also mentioned steps to strengthen the general security posture, such as implementing multifactor authentication for external access to the organization’s operational technology network, and tools like network proxies, gateways and virtual private networks to control access to the PLCs. Additional measures include keeping PLCs updated with the manufacturers’ latest software patches, disabling unused authentication measures and monitoring network traffic for suspicious content.

Despite the advice to network defenders, CISA and the other agencies emphasized “it is ultimately the responsibility of the device manufacturer to build products that are secure by design and default.” To accomplish this goal, they urged manufacturers to follow the principles in CISA’s Secure by Demand guidance, including changing default settings to prevent inadvertent exposure to the public internet, supporting phishing-resistant MFA methods and providing basic security features without additional fees.

The authors also recommended organizations test their security programs against threat behaviors identified in the ATT&CK matrix developed by the nonprofit MITRE Corp. They suggested testing security programs “at scale in a production environment to ensure optimal performance.”

Coal-fired Generation Retirements Slow Under Trump

Coal generation retirements dropped to a 15-year low in 2025 as the energy industry tried to maintain existing capacity and the Trump administration sought to halt coal’s decline.

The U.S. Energy Information Administration (EIA) reported April 13 that coal plant operators began 2025 with plans to retire 8.5 GW of capacity but then retired only 2.6 GW — or about 1.5% of the U.S. fleet:

    • Indian River Generating Station Unit 4 in Delaware (410 MW);
    • Cholla Units 1 and 3 in Arizona (383 MW);
    • Intermountain Power Project Units 1 and 2 in Utah (1,800 MW); and
    • Prairie Creek Unit 1 in Iowa (15 MW).

It was the lowest annual total since 2010, EIA said, and a small fraction of the 2022 total, when 13.7 GW of coal capacity was retired (about 6.5% of the U.S. fleet).

In 2025, operators canceled plans to shut down 1.1 GW of coal-fired capacity and deferred plans to retire 4.8 GW.

Some of those changes were driven by the U.S. Department of Energy (DOE), acting in response to President Donald Trump’s Day 1 declaration of a national energy emergency, his vision of U.S. energy dominance and his initiative to reinvigorate “beautiful clean coal.”

DOE has kept several generating units from being retired through temporary orders under Section 202(c) of the Federal Power Act:

    • J.H. Campbell Units 1, 2 and 3 in Michigan (1,331 MW);
    • Transalta Centralia Unit 2 in Washington (670 MW);
    • R.M. Schahfer Units 17 and 18 in Indiana (722 MW);
    • F.B. Culley Unit 2 in Indiana (90 MW); and
    • Craig Unit 1 in Colorado (427 MW).

Environmental and ratepayer advocates have criticized the 202(c) orders because of the financial and environmental impacts of continuing the operation of these plants. Regulatory debates and litigation continue. (See States, Environmentalists Argue DOE is Usurping Authority via 202(c).)

Meanwhile, some operators decided to delay retirements of 2.2 GW of capacity that had been scheduled for 2025:

    • Brandon Shores in Maryland;
    • South Oak in Wisconsin; and
    • Comanche in Colorado.

The U.S. Energy Information Administration maps coal-fired power plant retirements deferred in 2025. | EIA

The EIA reports the energy industry plans to retire 6.4 GW of coal generation in 2026, or nearly 4% of the U.S. fleet, but notes that regulatory actions and economic factors could cause those plans to change. (See Coal’s Decline Slows Amid Demand Growth in 2026, Trump’s Support.)

Coal has been in sharp, sustained decline in the U.S. power sector since the mid-2000s because of the advent of cheaper natural gas and the imposition of stricter environmental regulations. Statistics previously produced by the EIA quantify the slide:

    • U.S. coal production has dropped from 1.17 billion short tons in 2008 to 513 million in 2024.
    • From 2015 through 2024, U.S. coal-fired generation dropped from 1,352 TWh to 652 TWh per year, with every year but one lower than the year before. (The total jumped to 737 TWh in 2025 amid higher gas prices.)
    • The number of U.S. coal-fired plants dropped from 491 in 2014 to 219 in 2024.
    • From 2015 through 2024, the time-adjusted capacity of the U.S. coal fleet dropped from 286 GW to 176 GW, and its capacity factor fell from 54.3% to 42.6%. (The fleet capacity factor also saw a rebound in 2025, jumping to 48.7%.)

IESO Clarifies Roles in Toronto Transmission Line Procurement

IESO has refined how it will work with the Ontario Energy Board (OEB) on the construction of a third transmission line into Toronto and broadened rules for prospective bidders to demonstrate their experience.

ISO officials disclosed the changes in an engagement session April 9, saying they were developed in response to stakeholder feedback on the estimated $1.5 billion HVDC line under Lake Ontario, the first transmission project IESO will award under competition.

Officials say the 65-kilometer, 900-MW Toronto Third Line (TTL) is needed to meet a potential doubling of Toronto’s electricity demand by 2050.

Andrew Lee, an advisor for IESO’s resource acquisition, transmission development and procurement unit, announced version two of the Transmission Selection Framework registry rules, published March 18. The revised rules are intended to provide “greater flexibility and additional clarity for prospective applicants,” while ensuring those seeking to participate “have demonstrated a baseline level of experience and competency,” Lee said.

The registry window will be open until about 60 days prior to the anticipated launch of the procurement, currently set for Q1 2027.

The new rules will allow an applicant to cite more than one designated affiliate to demonstrate its organizational experience.

“That change was made in response to feedback and reflection on how corporate groups are often structured in practice,” Lee said. “In some cases, relevant experience may sit across multiple affiliate entities rather than neatly within a single one. Allowing reliance on more than one designated affiliate introduces flexibility and better reflects those realities.”

Limited HVDC Experience

Stephen Lachan, a supervisor for resource acquisition, transmission development and procurement, said the ISO will be flexible in evaluating the experience of prospective bidders, noting there are “only a handful of [HVDC] projects in North America.”

“The types of qualifying projects that we could consider include … an underwater HVDC transmission, underwater transmission in general [including AC lines], underwater linear infrastructure [such as] telephone cables or pipelines, or overland HVDC cables,” he said.

“The ISO is still considering the balance between stringent qualifying project requirements and increased market participation, and this is where we’re seeking feedback from you all today,” he added.

Patten Energy’s Frank Davis questioned whether the ISO’s requirement that bidders demonstrate their capacity by having at least two “directors/officers with [qualifying project] experience” will be “definitive.”

“I believe in prior procurements … there was a bit of a wider net cast, like language like ‘managerial authority,’ as opposed to specific roles,” Davis said.

Past procurements required that the director/officer be able to bind the proponent organization, Lachan said.

“Much [of] what we’re putting out today is … our initial proposal,” he said. “So, we do welcome feedback, and if there [are] distinctions that you think we should take into consideration as it relates to how we characterize a director officer within a proponent entity, I think we’re happy to take that on.”

IESO, OEB Roles

Nicole Kosonen, supervisor of resource acquisition, transmission development and procurement, said the ISO has dropped its initial model, in which the winning bidder would have received a contract covering all costs of financing, designing, building, operating and maintaining the line for the first 10 years of commercial operation. In year 11, the contract would have transitioned to traditional rate regulation under the OEB, which would approve annual revenue requirements for continued operation. (See IESO Removes Credit Requirement for Transmission Registry.)

Instead, Kosonen said IESO will select the bidder and approve its capital cost requirements, then turn the rate regulation over to the OEB when the design and build phase is complete. OEB will oversee the operational phase and approve costs related to operation, maintenance and sustaining capital.

Capital cost components excluded from and eligible for risk sharing. | IESO

“The benefit of this approach is that it enables the ISO to have oversight over the entirety of the capital costs, reducing the risk of shifting costs between the [ISO] contract and the OEB. The boundaries of the ISO contract and the OEB oversight are relatively straightforward, which will minimize implementation challenges,” Kosonen said.

“The actual determination of rates and payments to the transmitter will be determined by the OEB through their rate setting,” she added. “No contractual payments will be made from the ISO to the transmitter, and the transmitter will not start receiving payments until the line is in operation.”

Ceiling Price vs. Target Price 

To control costs, IESO is considering allowing bidders to propose a ceiling price, with any costs above the ceiling borne by the transmission company. It is also considering allowing developers to propose a target price for certain capital cost components, such as permitting and licensing, HVDC materials and equipment, and connection costs. “If the transmitter is able to perform the scope of the activity governed by that target price for less than the contracted amount, then the difference is shared” between the developer and ratepayers, Kosonen explained.
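The two mechanisms can be sketched numerically. This is an illustrative sketch only: the dollar figures and the 50/50 savings split are assumptions, not terms IESO has specified.

```python
def ceiling_price_settlement(actual_cost: float, ceiling: float) -> dict:
    """Ceiling price: ratepayers pay at most the ceiling; the
    transmitter absorbs any overrun (illustrative sketch)."""
    overrun = max(0.0, actual_cost - ceiling)
    return {"ratepayers_pay": min(actual_cost, ceiling),
            "transmitter_absorbs": overrun}


def target_price_settlement(actual_cost: float, target: float,
                            developer_share: float = 0.5) -> dict:
    """Target price: if the work comes in under target, the savings are
    shared between developer and ratepayers. The 50/50 split is an
    assumption; IESO has not specified a sharing ratio."""
    savings = max(0.0, target - actual_cost)
    return {"ratepayers_pay": actual_cost + developer_share * savings,
            "developer_keeps": developer_share * savings}


# Illustrative figures in $ millions:
# a $110M ceiling with a $120M outturn, and a $100M target met for $90M
over_budget = ceiling_price_settlement(120.0, 110.0)   # transmitter eats $10M
under_target = target_price_settlement(90.0, 100.0)    # $10M savings split
```

Under these assumed numbers, ratepayers would pay $110 million in the ceiling case and $95 million in the target case, with the developer keeping $5 million of the savings.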

‘Bankability’ Question

Adam Butterfield, Mott MacDonald’s practice leader for energy infrastructure in Canada, questioned whether the TTL project would be “bankable.”

“The linear infrastructure procurement world has kind of moved away from all-in fixed price [contracts] and is kind of looking more toward alliance or progressive design builds, where bidders put in their costs for the known amounts … and then once the development phase is completed and costs are a little bit more known, then there’s a negotiation on the actual construction cost.”

“I just worry that at the early stage you’re going to market with this, that bids are going to be based off wildly different assumptions,” he continued. “For example, no one knows what the subsurface conditions are along the route. So, you don’t even know how long your cable is going to be. We don’t even know where it’s going to tie in to the converter stations on either end. So, unless you provide a strawman for bidders to bid on, it’s going to be garbage in, garbage out. Everyone will put in a total bid price, but they’ll be vastly different, because [there] are just too many unknowns at this stage.”

Lachan said although the ISO is proposing to have all the capital costs subject to the IESO contract, “that does not mean that we are suggesting that all of the capital costs … would be fixed. We are seeking to get a better understanding of which of the cost categories for this type of project need to have some sort of reasonable change provisions.”

Leave to Construct, ‘Engagement Fatigue’

Lachan said the ISO is considering whether the OEB’s “leave to construct” process — in which the regulator determines whether a project is in the public interest — is appropriate for the TTL project, “understanding that the ISO will be doing a competitive procurement where … a lot of the aspects of the regulatory review associated with the LTC will be captured.”

Denise Zhong, senior manager of transmission development and procurement, said the ISO has developed a digital guide to help municipal officials in the path of the TTL learn the basics of transmission and procurement concepts.

“I want to acknowledge …  that ‘engagement fatigue’ is real,” she said. “Many communities and stakeholders are being asked to participate on multiple initiatives at the same time, often on complex technical topics. On top of that, the electricity sector itself can be difficult to navigate at times, particularly for those that are newer to transmission or competitive procurement processes. The intent is to support meaningful participation … without requiring people to become subject matter experts overnight.”

Next Steps

The IESO asked for the next round of written feedback by April 23 at engagement@ieso.ca. It said it will hold the next engagement session in May or June on refined procurement and contract concepts and documents.

A New Metric Related to Data Centers and Electricity that May Matter

As the astonishing and unanticipated tsunami of AI-driven data centers washes across the electric utility landscape, the conversation continues to evolve — especially as it relates to the rest of us who share that power grid. It may soon be that data centers will rule the electric world, while the rest of us only live in it. And a proposed new metric — the “compute heat rate” — soon may make that abundantly clear.

There has been a great deal of discussion related to data center loads and their impact on other ratepayers, with a recent and growing focus on the flexibility of operations. The argument for flexibility is that data centers wishing to interconnect to a capacity-constrained grid could reduce additional grid stress if they were able to interact more flexibly with it.

In theory, this could be done in two ways — through the addition of energy storage or through flexible compute operations. A recent paper on flexible operation of data centers notes that most grid scarcity events are relatively short, so adopting such approaches could significantly reduce grid stresses.

To that end, leading chipmaker NVIDIA and software company Emerald AI are pushing an initiative to “power and advance a new class of AI factories” that could connect to the grid faster and “support the grid” through flexible operation.

NVIDIA says it will employ a new reference design to help modulate demand and coordinate flexible load, while Emerald AI’s software platform will orchestrate the required computational flexibility. With this power-flexible approach, NVIDIA claims up to 100 GW of capacity could be freed up in the U.S. grid.

Flexible Operation Might Just be the Cost of Admission

A 2025 study from Duke University’s Nicholas Institute generally supports that 100-GW figure, calculating that with an average annual load curtailment rate of 0.5%, 98 GW of new load could be added to the grid. A 2026 follow-on study stated that flexible operation can feasibly reduce system costs by tens of billions of dollars over the next decade, lowering electricity prices for all participants, by flowing more megawatt-hours across the system without putting additional pressure on system peaks.

To date, evidence that flexible operation is possible is limited but growing. In July 2025, as part of the Electric Power Research Institute’s (EPRI) DCFlex program, an Emerald AI-guided data center cut power use by 25% during three hours of peak grid demand while maintaining AI compute quality. As of late March, Emerald AI claimed to have proven similar flexible operations at five commercial data centers. It’s also collaborating with PJM, Digital Realty, NVIDIA and EPRI on a flexibility initiative in NVIDIA’s 96-MW Aurora data center in Virginia, expected to be online by mid-2026.

Peter Kelly-Detwiler

During the recent CERAWeek, EPRI unveiled Flex MOSAIC — a common framework shared by 30 institutions for classifying large load flexibility to help develop shared expectations across the industry as to how flexibility will be used.

The approach would focus on performance characteristics, such as notification time, duration, frequency of use, depth of load adjustment, ramp behavior and availability — in other words, pretty much what professionals in the demand response world have been asking from enrolled DR assets for the past two decades. Put another way, the animal must perform pretty much the same old trick, but it matters more when it’s done by an elephant than a mouse.
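A flexibility profile along those performance axes could be represented as a simple record. The field names and example values below are hypothetical illustrations, not EPRI’s actual Flex MOSAIC schema.

```python
from dataclasses import dataclass


@dataclass
class FlexibilityProfile:
    """Hypothetical record of the performance characteristics the
    article lists; not EPRI's actual Flex MOSAIC taxonomy."""
    notification_minutes: int   # lead time before a load adjustment
    max_duration_hours: float   # how long the adjustment can be held
    max_events_per_year: int    # frequency of use
    depth_mw: float             # depth of load adjustment
    ramp_mw_per_min: float      # ramp behavior
    availability: str           # when the flexibility is offered


# Illustrative profile loosely based on the DCFlex demonstration, in which
# a data center cut load ~25% for three hours (25% of 96 MW is 24 MW).
aurora_like = FlexibilityProfile(
    notification_minutes=60,
    max_duration_hours=3.0,
    max_events_per_year=20,
    depth_mw=24.0,
    ramp_mw_per_min=2.0,
    availability="peak afternoon hours",
)
```

A grid operator could then screen enrolled loads against an event’s requirements (needed depth, duration, notice) the same way demand response assets are screened today.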

It makes perfect sense for data centers to adopt this approach, especially if that’s the price they must pay for being invited to the grid. That invitation is pretty valuable these days, with “speed to power” being the critical imperative.

Astrid Atkinson, the CEO of Camus Energy — a company devoted to promoting flexible grid interconnections — recently said that “our number for the opportunity cost or corresponding value of getting a data center online a year earlier is about $7 billion for a gigawatt of capacity per year.”  (Yes, you read that correctly: $7 billion in value for the ability to access power 365 days earlier — a little over $19 million/day.)

Just Don’t Expect Data Loads to Curtail Voluntarily

In theory, some of these new AI-focused data centers can operate at least part of their load flexibly. However, we don’t know what this means for the massive 100-plus-MW data centers that host large language model training, versus those smaller data centers serving AI inference loads that put those models to work in the real world.

There is reason to suspect, though, that their willingness to operate flexibly within the power grid might not be as high as one might hope, and they may have to be contractually bound to do so.

That’s because once they are running and performing their alchemy — employing silicon to weave data and electricity into valuable information — these AI factories have extremely high opportunity costs. The value of their output may in fact be so high that they are willing to shut down only at astronomical prices.

This concept of AI data center price elasticity is relatively new. We generally know what it is for price-sensitive cryptocurrency miners. They are estimated by the Energy Information Administration to curtail electricity use at a price around $100/MWh, depending on the prevailing price for the currency. But that price sensitivity has only recently begun to be explored for other data loads.

These Aggregated Loads are Too Big to Ignore

If these loads were insignificant, such price elasticities might not matter much. But since those loads are enormous and coming quickly, they do matter — a lot. As of April 1, for example, ERCOT is “tracking approximately 410 GW of large loads seeking interconnection, of which about 87% are data centers.” To put that in perspective, ERCOT’s record peak demand to date was 85,500 MW.


We’ve already seen the destabilizing impact of forecast data loads in PJM, where the past three capacity markets have been completely upended by their addition. An estimated $23 billion in additional capacity revenues is attributable to existing and projected data loads. That’s capacity, where suppliers play in the market.

But what about energy markets, where buyers step in? How much are AI data centers willing to pay for power before they shut down? And what impact might that have on everybody else in the market?

A New Metric Defining Opportunity Costs and Willingness to Pay for Power

Some fascinating work is being done in this area by Hans Royal, an industry veteran who recently published a paper introducing the concept of a compute heat rate (CHR). The CHR is a metric that attempts to quantify “the maximum electricity price a data center operator can rationally pay before the computation running on that electricity becomes uneconomic.” It’s a way to measure price sensitivities.

For decades, the energy industry has used the heat rate as a measure of a power plant’s thermal efficiency in converting a fuel’s heat content into electricity. The heat rate is defined as the Btu of fuel required to generate 1 kWh of electric energy.

Traders and plant operators commonly use heat rates to calculate the “spark spread” — the estimated profitability of a plant based on the prevailing prices of gas and electricity. The lower the heat rate, the more efficient the power plant is.
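As a sketch, the spark spread arithmetic looks like this. The 7,500 Btu/kWh heat rate and the gas and power prices are illustrative, not market data.

```python
def spark_spread(power_price_mwh: float, gas_price_mmbtu: float,
                 heat_rate_btu_per_kwh: float) -> float:
    """Spark spread in $/MWh: electricity revenue minus fuel cost.
    A heat rate of X Btu/kWh equals X/1000 MMBtu per MWh."""
    fuel_cost_mwh = (heat_rate_btu_per_kwh / 1000.0) * gas_price_mmbtu
    return power_price_mwh - fuel_cost_mwh


# Illustrative: a 7,500 Btu/kWh plant, gas at $3.50/MMBtu, power at $45/MWh
spread = spark_spread(45.0, 3.50, 7500.0)  # 45 - 7.5 * 3.5 = $18.75/MWh
```

A more efficient plant (say 6,500 Btu/kWh) would show a wider spread at the same prices, which is why the heat rate is the operative measure of competitiveness.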

Royal notes that in addition to crypto miners, other traditional large loads — such as aluminum smelters, steel producers and chemical manufacturers — also exhibit price sensitivities. Their economics won’t justify higher prices; depending on the industry, they typically shut down at “relatively low price thresholds” that he quantifies as between $40 and $120/MWh, in effect creating a “demand-side brake” that works as a self-correcting price mechanism.

However, that doesn’t remotely apply to data centers. If accelerating 1 GW of data center capacity for one year is worth $7 billion, then clearly AI data centers, and their related energy economics, are like nothing the industry has ever seen.

Calculating the CHR

The CHR is aimed at illuminating AI data center price elasticities to gain a better sense of how such loads could affect power markets. It’s calculated as follows:

CHR_w = (R_w − C_non-elec) / (1 + m)

R_w is defined as the gross revenue per megawatt-hour of electricity consumed by workload type w. The value is “derived from API pricing, cloud compute rates or enterprise contract values,” then converted to per-megawatt-hour revenue using GPU power consumption and throughput data. So, the more valuable the computing task, the greater the willingness to pay a higher price. In other words, the load becomes less elastic.

C_non-elec is the operating cost per megawatt-hour not specifically related to power, including amortization of GPUs, facility overhead, cooling infrastructure, networking and maintenance. These figures come from published infrastructure cost data and industry benchmarks.

m is the required return margin, representing the minimum required profit margin. Here, the baseline assumption is 0.30 (30%).
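Plugging the formula’s terms into code makes the mechanics concrete. The revenue and cost inputs below are illustrative assumptions chosen for round numbers, not figures from Royal’s paper.

```python
def compute_heat_rate(revenue_per_mwh: float,
                      non_elec_cost_per_mwh: float,
                      margin: float = 0.30) -> float:
    """CHR_w = (R_w - C_non-elec) / (1 + m): the maximum electricity
    price ($/MWh) a workload can rationally pay before the computation
    becomes uneconomic."""
    return (revenue_per_mwh - non_elec_cost_per_mwh) / (1.0 + margin)


# Illustrative (assumed) inputs: a workload grossing $9,000/MWh with
# $745/MWh of non-electricity costs and the 30% baseline margin
chr_w = compute_heat_rate(9000.0, 745.0)  # (9000 - 745) / 1.3 = $6,350/MWh
```

Note the lever arms: raising the required margin or the non-electricity costs lowers the tolerable power price, while more valuable compute raises it.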

The astonishing finding from Royal’s analysis is that “the blended CHR of approximately $6,350/MWh implies that AI data center demand will not curtail at electricity prices below roughly 127 times the current wholesale average.”

Royal notes that different AI workloads may have vastly different price tolerances. His initial estimates have them ranging from about $500/MWh for training of a frontier large language model to over $53,000/MWh for frontier inference services.

Remember, this would apply to tens of gigawatts of future loads that are just now being forecast and built. He concludes that with their high opportunity costs, these massive loads “will not curtail at any price level observed in U.S. wholesale markets” (my emphasis).

Once Certain Penetration Levels are Achieved, Watch out

These price tolerances are not an issue when supply is unconstrained, but as the demand-supply imbalance increases, the CHR dynamic increasingly applies, especially in localized and transmission-constrained areas. Royal postulates that these kinds of extreme prices could be seen once critical penetration thresholds are reached, in much the same way the California energy market shifted with the duck curve. Once sufficient solar was added, price formation was structurally affected.

With data load, he suggests, the CHR effect would emerge “suddenly once sufficient data center infrastructure reaches critical mass at specific grid nodes.”

That finding has significant implications for grid operators, utility planners and regulators, as well as other electricity consumers who may find themselves priced out in various markets. The CHR metric may require additional vetting — something that is beyond my limited background and ability. But the concept fascinates me, and it’s worth further exploration, as this rapid and massive new AI demand is something we simply have never seen before.

This metric may offer great value in tracking costs by location and over time as the use of AI accelerates and the technology continues to evolve. It might even offer future economic value as a hedging instrument. Only time will tell.

Grid Largely Immune from Oil Price Shocks, but We Can’t Ignore Them

Some Americans still think of “energy prices” as a single rising tide. When oil prices top $100/barrel, it feels like everything else follows: heating, groceries and electricity.

For Boomers who filled their cars’ voluminous tanks during the 1970s energy crisis, that intuition is grounded in experience: They lived through a time when oil shocks rippled across the entire energy system, driving inflation and straining household budgets.

Today, for Americans in the Lower 48, correlation is not causation. Yes, electricity prices are painfully high. No, they’re not related to oil prices, at least for customers of the RTOs.

Oil prices can surge, and most of the U.S. power system barely notices. Yet that doesn’t mean oil no longer matters. It means the places where it still does — from Hawaii and Alaska to island grids around the world — tell us something important about how electricity prices work and why so many Americans still feel like they’re losing control of their energy bills.

The Persistence of the ‘Energy Bucket’

For many households, energy is experienced as a single category.

Consumers are feeling squeezed by rising costs: April 10 inflation data showed a threefold spike in prices driven by oil. Although U.S. retail electricity prices largely are unrelated to the oil price, they have been rising faster than inflation in recent years and are expected to continue increasing. Just two months ago, I wrote about the political risk inherent in households’ rising electric bills.

Gasoline prices are posted on street corners and updated daily. Electricity and gas bills arrive monthly. Heating oil tanks are filled periodically. And since the U.S. and Israel attacked Iran at the end of February, there’s been increased upward pressure. When all rise — even for different reasons — the result is a powerful perception: Energy is getting more expensive, everywhere, all at once.

Fortunately for the industry, many consumers have mentally decoupled oil prices in the news from their electricity bill. Unfortunately, they largely blame utility profits, infrastructure upgrades and, in third place, data centers’ impact on demand, according to a recent Consumer Reports survey.

How the U.S. Grid Broke its Link to Oil

Oil played a meaningful role in U.S. electricity generation 50 years ago. Between 1963 and 1973, oil’s share of electricity generation almost tripled, rising from 5.73% to 17.09%. More importantly, it played a central role in the broader energy economy. When oil prices moved, everything else tended to move with them.

That is no longer the case. The 1973/74 energy crisis was a wakeup call, and though oil use ticked a little higher in 1977, the hunt was on for alternatives. By 1984, it was back below 1963 levels and has since sunk to a negligible amount: less than 1.0% since 2011.

The generation mix has been fundamentally reshaped since the mid-1970s, and more than once. After the energy crisis, nuclear and coal output rose. Nuclear — slow and expensive to build and facing community acceptance headwinds following the 1979 Three Mile Island accident — stalled at about 20% of electricity generation. Coal kept climbing, with half of all electricity produced by coal-burning plants in 2005.

Globally, coal and gas are the leading sources of electricity generation. | Our World in Data

Then the shale revolution challenged coal’s dominance, and coal’s use declined sharply as natural gas surged. The rise of renewables reshaped the market again in the past decade or so, with wind, solar and hydro now supplying more than 20% of U.S. electricity.

What Actually Drives Electricity Prices Now

If oil no longer drives electricity prices, what does?

Today, more than 40% of U.S. electricity comes from gas, which frequently sets the marginal price in wholesale markets. Renewables continue to grow if you look at trends longer than a political administration, adding capacity but not displacing gas as the price-setting resource in most regions — though a sunny day in California or a windy night in Texas can push the wholesale price of electricity negative.

But retail electricity prices are shaped by far more than fuel. It’s no longer what we burn (or capture, when it comes to renewables) but what we build — and rebuild after disasters — that drives the retail cost of delivered electricity. In 2023, the U.S. Energy Information Administration said transmission accounted for 12% and distribution accounted for 26% of the cost of retail electricity.

This shift helps explain a paradox that frustrates consumers: Electricity prices can rise steadily even when fuel prices are stable or falling.

Oil Still Matters, Just Not Everywhere

In some parts of the world, oil still plays a significant role in electricity generation — particularly in island grids, remote systems and regions with limited access to natural gas infrastructure. Diesel and fuel oil remain common fuels for power generation in parts of the Caribbean, Africa and South Asia.

Even where oil is not the primary fuel, it often plays a critical backup role. In systems with unreliable grids or constrained infrastructure, diesel generators provide essential capacity, tying electricity costs more directly to oil prices.

And oil still matters indirectly everywhere. As the Brookings Institution has noted, oil shocks continue to ripple through the global economy, affecting inflation, supply chains and household budgets.

The key distinction is not between oil and non-oil systems; it is between grids that are structurally insulated from fuel volatility and those that are not.

No Person Is an Island … but Some U.S. Grids Are

That distinction exists within the United States as well. Not all U.S. grids look like PJM or ERCOT. Some look a lot more like the Caribbean.

Oil makes up a significant part of the electricity generation mix in two states — Hawaii (65%) and Alaska (15%) — and most territories, such as American Samoa (97%) and Puerto Rico (62%).

Hawaii has consistently had some of the highest retail electricity prices among the 50 states, and its utilities know their customers will feel the pain of rising oil prices. Hawaiian Electric has warned customers that higher bills are on their way, forecasting residential bills may rise 20% to 30% over the next several months.

The sensitivity to global oil prices is just one more reason for the islands to continue their push for resilience by expanding the use of renewables. “Hawaiian Electric has reduced its use of oil by 55 million gallons annually since 2008 and is bringing more than a dozen fixed-price renewable energy projects online in the coming years,” the recent announcement said.

Alaska presents a different version of the same challenge. Many remote communities rely on diesel-powered microgrids, where fuel costs are a primary driver of electricity prices. These systems behave less like the interconnected grids of the Lower 48 and more like the island and remote systems found elsewhere in the world.

New Headlines, Same Conclusion

Oil no longer sets the price of electricity in most of the United States, but oil shocks shape how much end users think about the cost of all their energy. Because of that, it’s critical that policymakers keep electricity prices and the perceived value of the services customers receive top of mind.

While those inside the industry recognize electricity prices are the outcome of infrastructure, investment and system design as much as fuel costs, customers care less about the why and more about the how much. And even if the price of oil isn't driving most customers' bills higher, it is making them increasingly price sensitive.

It’s self-referential to quote my conclusion from a couple of months ago, but the conclusion of this column is the same as the last one on prices, only now bolded and underlined thanks to oil prices: Utilities, grid operators and regulators have to focus on affordability. This means balancing necessary long-term capital investments with measures that keep electricity costs manageable.

Oil price shocks may settle soon or may continue if global oil supply can’t reach markets, but there’s no sign that customer concerns about the price of energy will settle any time soon.

MISO Plans to Change Accounting Practices as Record Queue Exits Could Raise Rates

A record number of project withdrawals in MISO’s generator interconnection queue has become so consequential that the RTO filed with FERC to alter its accounting practices (ER26-1972).

MISO proposed a tariff change to FERC that would allow it to record the interest generated from its interconnection deposits and fees that it needs to return as a deferred expense rather than an immediate one.

The RTO has experienced a record number of withdrawals from its generator interconnection queue in 2025 and so far in 2026, reducing the waiting line from a nearly 300-GW high in 2025 to 192 GW in early 2026. MISO said it expects the exodus to continue over 2026. It said more than 800 projects dropped out of the queue in January 2026 alone.

Consequently, MISO said it is sitting on more than $2 billion in deposits that must be returned with interest to projects exiting its queue. If the grid operator paid out all interest and recorded it as an expense all at once, it could cause the rates it charges to its members to spike.

MISO asked FERC for permission to instead spread that cost out over time by recording the interest in a holding account and making steady, $7 million/month payments until the obligation is paid off, smoothing the potential impact on rates.
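
The arithmetic of the deferral can be sketched with assumed figures. Instead of recognizing the full interest obligation as an expense in one month, the balance sits in a holding account and a fixed amount is recognized each month until it is exhausted (the $60 million figure is MISO's 2026 interest estimate; the schedule below is a simplified illustration, not MISO's actual accounting):

```python
# Simplified sketch of deferred expense recognition: a large one-time
# obligation is recognized in fixed monthly increments rather than all
# at once, flattening the month-to-month impact on rates.

def deferral_schedule(total_obligation, monthly_amount):
    """Return the amount recognized each month until the balance clears."""
    schedule, balance = [], total_obligation
    while balance > 0:
        recognized = min(monthly_amount, balance)
        balance -= recognized
        schedule.append(recognized)
    return schedule

# ~$60 million of estimated 2026 interest at $7 million/month:
months = deferral_schedule(60_000_000, 7_000_000)
print(len(months))   # 9 months to clear the balance
print(months[-1])    # final, smaller payment: $4 million
```

At that pace, the estimated 2026 interest alone would take roughly nine months to amortize, which is consistent with MISO's plan to continue the payments for a year after its 2026 queue cycle wraps up.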

“The proposed accounting treatment will stabilize the impact of the interest expense on rates paid by MISO’s customers,” the RTO told FERC in its March 30 filing.

MISO said it needs the change in place to “protect customers from unnecessary rate volatility.” It intends to continue the $7 million/month payments for one year following the completion of its 2026 queue cycle.

The RTO emphasized that its plan would involve only an internal accounting change. It said withdrawing interconnection customers still would get refunds with interest.

“To be clear, payments of both principal and interest will be made to interconnection customers consistent with MISO’s standard practices and procedures, as required, with only the accounting treatment being deferred,” it wrote.

MISO confirmed to RTO Insider that it’s holding a cash balance on the 800 projects that dropped out in January. It estimates that it will have to refund $60 million in interest over 2026.

The RTO said that although it is working “as quickly as possible” to complete the refunds, “timing will vary” on when interconnection customers are refunded. It did not offer an average length of time for interconnection customers to receive funds.

MISO also said it produces “monthly statements that show the expenses against the interconnection customer’s study deposits.” However, it said, “interest is calculated and communicated to the interconnection customer when the refund is processed.”

In a statement to RTO Insider, MISO stressed that its FERC filing has no bearing on its processing of refunds to interconnection customers.

The grid operator wrote to FERC that it hopes it can secure approval for the accounting deferral in time to finalize its first-quarter 2026 financial statements by May 30 with Deloitte, its auditing firm. It asked FERC for an effective date of March 31.

MISO collects initial study deposits and per-megawatt milestone fees from project developers for their projects to enter the queue and remain in line as they are studied. It holds the money in interest-bearing accounts, which earned substantial interest during the recent years of high rates.

The RTO must refund unused deposits and payments, with interest, to interconnection customers when they withdraw projects from the queue. It records deposits and fees as a short-term liability on its balance sheet and debits a dedicated account to refund developers. Ordinarily, when MISO pays out principal and interest, it is recorded as an immediate expense.

MISO said it uses earned interest income “for the benefit of members,” offsetting the rate it charges to cover its operating costs. Because its operating costs are passed on to members, a massive interest obligation paid out instantaneously would shock rates, it said.

In the past three years of financial presentations delivered to its Board of Directors, MISO leadership underscored the high interest it was accumulating and did not publicly discuss potential rate liabilities associated with returning interest.

In its FERC filing, MISO said its queue began ballooning in 2020, when 52 GW of new projects lined up. The pattern continued in 2021, when 77 GW signed up, and again in 2022, when 171 GW worth of projects entered the submission window.

The volumes created study logjams, prompting MISO to roll out stricter rules and higher fees to discourage speculative generation proposals. The rules, combined with the White House’s discontinuation of renewable energy tax credits, have shrunk the queue.

MISO said a “marked increase in project withdrawals” led to “much larger than anticipated refunds of deposits and payments plus interest.”

Historically, no more than about 20% of the projects in MISO's queue reach the interconnection agreement stage.

The RTO said FERC has allowed it to defer accounting before: in 2011, when the failure of a cooling fan system necessitated extensive repairs to its Carmel, Ind., control room, and again in 2013, when Entergy joined to create MISO South.

Calif. Bill Would Allow Hourly RA Trading for Slice-of-Day Requirements

A California Senate committee has advanced a bill that would allow load-serving entities to trade capacity on an hourly basis to meet the state’s slice-of-day resource adequacy requirements.

The Senate Energy, Utilities and Communications Committee voted April 7 to pass Senate Bill 1138 by Sen. Steve Padilla (D). The bill now heads to the Senate Appropriations Committee.

SB 1138 would require the California Public Utilities Commission to allow load-serving entities (LSEs) to satisfy 25% of their resource adequacy requirements by trading energy capacity with other LSEs “in the same unit of time used to denominate resource adequacy compliance requirements.”

In the case of the CPUC’s slice-of-day RA requirements, that unit of time is hourly. The slice-of-day framework requires LSEs to demonstrate sufficient capacity to meet the peak forecast demand in each hour of the peak day in each month.
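
A stylized example (made-up loads and capacities, abbreviated to a four-hour "day") shows why hourly trading matters under an hourly compliance obligation: an LSE short in one hour must otherwise procure more capacity for the whole month, even if a neighboring LSE has surplus in exactly that hour:

```python
# Toy slice-of-day illustration: each LSE must cover its forecast load in
# every hour of the peak day. With hourly trading, one LSE's surplus can
# cover another's deficit hour by hour. All numbers are hypothetical.

HOURS = range(4)  # abbreviated day; the actual framework covers 24 hours

# capacity shown and hourly load per LSE, in MW
lse_a = {"cap": [100, 100, 100, 100], "load": [80, 120, 90, 70]}
lse_b = {"cap": [130, 130, 130, 130], "load": [120, 100, 130, 110]}

def standalone_shortfall(lse):
    """MW an LSE is short across the day, compliant on its own."""
    return sum(max(lse["load"][h] - lse["cap"][h], 0) for h in HOURS)

def pooled_shortfall(lses):
    """Shortfall if surpluses can be traded to deficits each hour."""
    return sum(
        max(sum(l["load"][h] for l in lses) - sum(l["cap"][h] for l in lses), 0)
        for h in HOURS
    )

print(standalone_shortfall(lse_a))       # A is 20 MW short in hour 1
print(pooled_shortfall([lse_a, lse_b]))  # B's hour-1 surplus covers A: 0
```

In this sketch, LSE A would have to procure 20 MW of additional monthly RA to cover a single deficient hour, while hourly trading lets LSE B's surplus in that hour close the gap at no net cost to the pool, which is the efficiency CalCCA's study points to.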

To submit a commentary on this topic, email forum@rtoinsider.com.

Load-serving entities covered by the bill include investor-owned utilities, community choice aggregators and electric service providers.

Padilla accepted a committee amendment that would allow the CPUC to adjust or eliminate hourly trading if it turns out to be hurting reliability.

CPUC implemented the slice-of-day framework in 2025 as part of an RA reform process. It was the first program of its kind in the U.S.

Although the new framework better aligns the RA program with system reliability needs, “RA products must transact monthly even though the obligations are unique to each hour,” said Lauren Carr, senior manager of regulatory affairs for the California Community Choice Association, the bill’s sponsor. CalCCA in 2025 released a study outlining the benefits of hourly trading of RA. (See CalCCA Study Touts Benefits of RA Trading at Hourly Level.)

As a result, LSEs must buy more RA than they need, and RA prices are driven up, Carr told the Senate committee. Those added costs fall on ratepayers, who could have saved $180 million in 2025 had hourly trading been in place, Carr said.

For the Clean Power Alliance, a Southern California CCA, resource adequacy accounted for about a quarter of energy costs in 2025 — or around $287 million, said CEO Ted Bardacke. CPA could have saved $10 million in 2025 through hourly trading, an amount roughly equal to the CCA’s budget for customer programs.

“This is real money that could be invested if we could just be more efficient,” Bardacke said. “Please allow us to be more efficient.”

A CalCCA analysis found that through hourly load obligation trading, LSEs could have avoided $105 million in excess RA purchases for summer 2025. And the reduced demand could reduce RA costs by an additional $77 million per year through downward pressure on prices, CalCCA said.

In February 2026, CPUC addressed the hourly trading issue through a Report on Transactability within the Slice of Day Resource Adequacy Framework.

The report acknowledged potential benefits of hourly trading but said “several factors limit the extent to which these theoretical savings could be realized in practice.”

“Portfolio outcomes are highly dependent on individual LSE load profiles, resource characteristics, and contracting strategies,” the CPUC report said.

And from a feasibility perspective, hourly trading would complicate the RA compliance process, the report said.

“Given the limited evidence of need, uncertain magnitude of benefits, and heightened implementation risks, staff concludes that the potential gains do not outweigh the added complexity and risk of unintended consequences,” the report concluded. It recommended continued monitoring of market performance as the slice-of-day framework matures.

CCAs responded by saying the report underestimated the potential cost benefits of hourly trading.

FERC Approves CAISO EDAM ‘Transitional’ Intertie Proposal

FERC approved a series of revisions related to the design of CAISO’s Extended Day-Ahead Market to support market implementation and avoid disruptions to existing contracts.

A primary issue in FERC’s April 8 order concerned intertie modeling and scheduling under EDAM (ER26-1294).

When EDAM launches on May 1 with PacifiCorp as the first participant, what were formerly two transfer points between the CAISO and PacifiCorp balancing authority areas will become internal EDAM interties, which will no longer function as scheduling points for the CAISO BAA.

This raised concerns among stakeholders, who argued that the elimination of the scheduling points could “disrupt forward contracts for resource adequacy supply or resources supporting renewable portfolio standard transactions, as well as commercial transactions to hedge the risk of congestion,” according to the order.

CAISO proposed a “transitional measure” to continue to allow import bidding limited to RA contracts and RPS transactions at those internal EDAM transfer locations.

“CAISO states that it intends to extend this transitional measure until at least the end of 2026, and that it plans to hold stakeholder discussions this year to further discuss intertie scheduling modeling to explore the mechanics of phasing out this transitional measure,” FERC wrote in approving the proposal.

CAISO also proposed, on a transitional basis, continuing to model non-resource-specific system resources under current practices.

Intertie resources in CAISO are modeled at specific scheduling points, but the ISO in 2025 proposed to model intertie resources under EDAM at generation aggregation points (GAPs). A GAP is the collection of supply resources in a BAA or group of BAAs.

The GAP approach would have significantly improved market accuracy and alignment with actual power flows by reducing phantom congestion, and would have lessened the need for operators to manually conform transmission limits in real time.

In November 2025, stakeholders raised concerns about the GAP approach, saying it could lead to a market with multiple prices for the same intertie. Based on those concerns, CAISO proposed retaining pre-EDAM practices for CAISO intertie transactions for a time while it continues discussions with stakeholders. (See EDAM Intertie Scheduling Processes Raise Stakeholder Concerns.)

FERC approved the intertie proposals, saying, “We find that extending a limited option for bidding at CAISO interties that are EDAM transfer locations would give market participants time to gain familiarity with the EDAM design and adjust commercial arrangements that may rely on current scheduling approaches for intertie resources.”

“We also find that this proposal would help avoid adverse market or reliability outcomes that could result from disrupting existing contractual arrangements while CAISO continues its discussions with stakeholders about intertie scheduling modeling,” the order stated. “Accordingly, we find that the continued intertie bidding proposal is just and reasonable and not unduly discriminatory or preferential.”

FERC approved clarifications on how the 15-minute market and real-time dispatch would treat schedules cleared in the hour-ahead scheduling process at EDAM and WEIM transfer locations.

The commission also approved a slate of other revisions, including clarifications on congestion revenue rights, greenhouse gas accounting, the day-ahead contingency analysis tool, market information sharing, transmission charges and timing of price corrections.

TeraWulf’s Data Center Plans Draw Protests in FERC Review of Power Plant Purchase

A FERC proceeding on the purchase of an old oil-fired power plant in southern Maryland has drawn multiple protests because the buyer wants to co-locate a data center at the site (EC26-58).

Data center development company TeraWulf announced plans to buy the Morgantown Generating Station, a 216-MW plant made up of four units, in February and asked FERC for approval by April 2. But the commission can take several more months before acting, and numerous filings opposing the deal have given it an ample record to review.

The company’s business model is to turn old industrial sites into data centers. According to a news release on the deal, the company planned to initially add 500 MW at the site to support a data center with an ultimate demand of 1 GW.

In its protest — the first to be filed — PJM’s Independent Market Monitor argued that the Pepco zone, where the plant is located, is constrained and needs to preserve existing generators while adding new capacity.

“TeraWulf’s plans for new data center load at Morgantown fail to address whether the Morgantown units will be removed from the PJM capacity market to serve data center load,” the Monitor said.

The company is not new to the PJM market: It had planned to build a Bitcoin mine at Talen Energy’s Susquehanna nuclear plant, which set off a major dispute on co-location during the Biden administration. (See FERC Rejects Expansion of Co-located Data Center at Susquehanna Nuclear Power.) TeraWulf has since sold its share in a related joint venture to Talen.

The Monitor argued that TeraWulf should be required to disclose any plans for the Morgantown units and commit to provide a notice of material changes if the deal goes through, given that PJM is going to be short of its reserve margin in June 2027.

“There has been significant interest from owners of existing capacity and from data center operators in acquiring existing PJM capacity resources and diverting them from wholesale market participation to instead serve onsite load under co-location or power purchase agreement structures, rather than offering that capacity into the PJM market,” the IMM said. “This strategy shifts the reliability risk from data centers to all other PJM customers.”

Because TeraWulf said it plans to operate the existing units as net positive energy suppliers, its ability to allocate output between data center load and the market could affect prices, the Monitor said.

FERC received many other protests on the application, including from individual citizens, the Maryland Office of People’s Counsel and the Sierra Club.

The OPC’s protest echoed concerns from the Monitor about market power and noted that the filing lacks any guarantees about actually building new supply at the Morgantown plant.

“A bring-your-own-new-generation (BYONG) approach can be pro-competitive if done correctly,” the office wrote. “But the PJM market rules on BYONG are currently in flux, and execution of BYONG will require nuance to ensure that the supply and demand balance within PJM is not disrupted. The proposed transaction’s lack of detail regarding applicant’s BYONG approach does not inspire confidence that its approach will be procompetitive. It is unclear how or if the proposed 500 MW would participate in PJM markets.”

Morgantown also is home to two coal units totaling 1,299 MW that were retired earlier this decade after running for 50 years. The OPC said it is worried those units might be restarted.

The Sierra Club, which has long been campaigning to close coal plants, also echoed that sentiment.

“The transaction encompasses the site where two large coal-fired units were retired in 2022, and TeraWulf’s CEO has publicly discussed plans to ‘repower’ the coal units, which would have severe implications for the health of an already-overburdened community,” the organization said. “The application says nothing about these plans, about environmental obligations at the site, about the impact of intensified generation operations on the surrounding community, or about the impacts to Maryland’s clean energy goals.”

Even running the oil plants more often to provide power to a co-located data center will increase pollution in what is “among the most environmentally burdened” communities in Maryland, it added. Restarting the coal units would reimpose additional health burdens on that community.

TeraWulf pushed back on the protests by arguing that the only issue in front of FERC is whether to approve its purchase of the plant.

“The transaction will have no adverse effect on competition, rates or regulation or result in any cross-subsidization concerns and is therefore consistent with the public interest,” it said.

The proper venue for debating whether to site a data center at the power plant is at the state and local level, not in front of FERC, TeraWulf said. But any changes to the plant’s interconnection rights or capacity rights under a co-location arrangement would be subject to another case at FERC.

“This proceeding is not the time nor the place to raise such arguments, and the commission’s entertaining of such arguments in this proceeding that are generally applicable to the industry as a whole would be unduly discriminatory to the applicant,” TeraWulf said.

The Monitor responded to TeraWulf by saying the development plans at the site directly implicate market power questions that FERC needs to address in the proceeding.

“The current market power mitigation rules in the PJM tariff do not explicitly address the removal of capacity resources from the capacity market to serve data center load,” it argued. “Unless and until the market rules change, market power issues must be addressed case by case.”

After the initial back-and-forth, Public Citizen, the NAACP and the Port Tobacco River Conservancy filed a motion to dismiss April 1, claiming TeraWulf failed to disclose an equity stake Google has in the company. The tech giant signed a deal involving two other data center projects TeraWulf owns in New York and Texas.

“Both projects involve a three-party framework in which Fluidstack, a private AI cloud company, serves as the primary tenant of TeraWulf’s Lake Mariner and Abernathy projects while Google backstops Fluidstack’s lease obligations and certain loan commitments,” the groups said. “In exchange for taking on those obligations, Google obtained warrants controlling 73.5 million TeraWulf voting shares — equal to 14% of TeraWulf’s equity. Google received those shares at a strike price of 1 cent/share.”

The groups said the complex financial arrangement is designed to keep liabilities off Google’s books and Google itself out of regulatory reach.

“Google obtains strategic control over AI infrastructure capacity without directly owning real estate, building data centers or appearing as a regulated utility — while TeraWulf carries construction and operational risk,” they argued. “Google backs Fluidstack’s lease obligations but does not have to recognize them as a liability on its books — while Google obtains nearly zero-cost control over 14% of TeraWulf’s equity.”

TeraWulf responded to the motion to dismiss April 8, saying the arrangement does not make Google a part owner. Two deals with Google were publicly disclosed in a filing with the Securities and Exchange Commission, under which TeraWulf issued “warrants” giving Google the right to buy shares of its common stock, it said.

“Google holds warrants that provide a contingent right to purchase shares of TeraWulf common stock in the future, subject to specified terms and conditions,” it told FERC. “The Google warrants do not confer any present ownership interest, voting rights or control over” the company.

Google is not a stockholder unless it exercises its rights under those warrants, meaning it does not qualify as an affiliate of TeraWulf. FERC defines voting securities for its affiliate rules as “any security presently entitling the owner or holder thereof to vote in the direction or management of the affairs of a company,” the data center developer noted.

ISO-NE Wholesale Costs Subside in March After Costly Winter

After a winter of record prices, ISO-NE wholesale market values fell back to more typical levels in March amid milder temperatures and lower natural gas prices.

Energy market value totaled $533 million in March, $7 million higher than the total value in March 2025. Ancillary service market value totaled $9.6 million compared to $7.1 million in March 2025.

The monthly peak load reached 17,861 MW on March 3, the highest March peak since 2019.

“March was relatively boring relative to the winter we just experienced,” said Stephen George, vice president of system and market operations at ISO-NE, at the NEPOOL Participants Committee on April 9. (See 2025/26 Most Expensive Winter in History of ISO-NE Markets.)

He noted that ISO-NE experienced some difficulty accurately forecasting solar production, which can significantly affect daytime demand due to the increasing amounts of behind-the-meter (BTM) solar in the region.

“The rate of growth of PV in the region continues to outpace our ability to forecast it as well as we would like,” he said. Relative to larger regions, the concentration of solar resources in New England makes the region “very prone to small deviations in the weather in terms of cloud cover.”

ISO-NE continues to see between 700 and 1,000 MW of solar nameplate capacity added annually. It currently has about 8.5 GW of BTM solar and about 1 GW of front-of-meter solar.

George said predicting cloud cover and solar output is a complicated and “evolving science.” He added that ISO-NE is communicating with other RTOs and working with vendors to improve the accuracy of its forecasts.

Capacity Auction Reforms Update

Also at the meeting, ISO-NE CEO Vamsi Chadalavada fielded questions from NEPOOL members about key projects, with multiple participants expressing concern about whether the RTO will be able to build adequate support for the second phase of its Capacity Auction Reforms (CAR) project.

FERC recently approved the first phase of the CAR project, which centers on implementing a prompt capacity market and resource deactivation reforms. (See FERC Approves ISO-NE Prompt Capacity Market.)

While the first phase received strong support from stakeholders, the second phase, focused on accreditation reforms and seasonal market changes, likely will be more controversial. Initial impact analysis results presented by ISO-NE in March elicited strong reactions. (See ISO-NE Details Initial Forecast of Capacity Auction Reforms’ Effects.)

Chadalavada emphasized the importance of building consensus and said ISO-NE aims to be open to different approaches while remaining committed to the core design concepts and planned timeline.

“There will be no delay … we have to keep our schedule,” he said. The RTO intends for both phases of the CAR project to take effect for the 2028/29 capacity commitment period, though this will depend on obtaining timely approval from FERC of the second phase.

Changes to GIS Accounts

Lastly, the Participants Committee voted to support changes proposed by Vistra to the NEPOOL Generation Information System (GIS) to allow one GIS login to access multiple GIS accounts. The GIS administrator estimated the changes would cost $186,660.

Some stakeholders expressed concern that the changes would benefit only a small number of participants. The proposal passed despite opposition from the transmission sector and some end users.