The California Public Utilities Commission wants CAISO to come up with a way to pause settlements of certain congestion revenue allocations in the ISO’s upcoming Extended Day-Ahead Market if participants begin to game the market through extensive self-scheduling.
If such “rampant pervasive behavior” appears, CAISO should consider reverting to using the ISO’s prior settlement methodology, the CPUC’s Energy Division said in January comments submitted to an ISO EDAM working group.
The congestion revenue allocation issue, specifically in situations of parallel flow on the electric system, was CAISO’s top priority last year. CAISO approved a new methodology to address the concern in June 2025, and FERC approved the methodology two months later. (See CAISO’s EDAM Scores Simultaneous Wins at FERC.)
Under the new methodology, certain congestion revenues stemming from parallel flows will be allocated to the BAA where the energy is scheduled rather than where the constraint is located — the previous methodology. Those revenues will be allocated based on a transmission customer’s eligible firm Open Access Transmission Tariff transmission rights submitted and cleared as day-ahead balanced self-schedules.
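To make the mechanics concrete, here is a minimal sketch of how a pro rata allocation of that kind could work, with invented BAA names, megawatt quantities and revenue figures. It illustrates the general approach, not CAISO’s actual settlement logic.

```python
# Hypothetical sketch: allocate parallel-flow congestion revenue to the BAAs
# where the energy is scheduled, pro rata to cleared day-ahead balanced
# self-schedules backed by eligible firm OATT rights. All names and numbers
# are invented for illustration; this is not CAISO's settlement code.

def allocate_parallel_flow_revenue(total_revenue, cleared_self_schedules):
    """Split congestion revenue among BAAs in proportion to cleared MW."""
    total_mw = sum(cleared_self_schedules.values())
    if total_mw == 0:
        return {baa: 0.0 for baa in cleared_self_schedules}
    return {
        baa: total_revenue * mw / total_mw
        for baa, mw in cleared_self_schedules.items()
    }

# Example with made-up values: $10,000 of congestion revenue on one constraint.
cleared = {"BAA_A": 300.0, "BAA_B": 100.0}  # MW of eligible cleared self-schedules
print(allocate_parallel_flow_revenue(10_000.0, cleared))
# {'BAA_A': 7500.0, 'BAA_B': 2500.0}
```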
However, the new methodology will maintain a “suspected underfunding problem for the immediate future if other BAAs decide to self-schedule their bids in order to receive this congestion revenue,” the CPUC said.
“The expansion of the day-ahead market should not come at the expense of California ratepayers, who have invested millions, if not billions of dollars, into building a reliable grid,” the CPUC said. “Therefore, rather than allocating away congestion revenue tied to parallel flows to the BAA causing the parallel flow, any long-term CRA methodology should return that congestion revenue to the BAA in which the constraint occurred.”
Although the new allocation methodology has flaws, it was necessary to implement “as a stopgap measure for EDAM go-live to occur on time,” the CPUC added.
The CPUC recommended CAISO build out a braking mechanism, such as a pause in settlements, that could be deployed if the new methodology starts to show signs of gaming.
The CPUC also asked CAISO to confirm the ISO’s settlements system will be able to break out the congestion revenue tied to a parallel flow that crosses multiple BAAs. This potential issue will not be a concern when EDAM launches with PacifiCorp as its first participant in May 2026, but the ability to break apart congestion revenues will become more important when Portland General Electric and other entities join the market later in the year and in 2027, the CPUC said.
The agency also asked CAISO to clarify how it is treating congestion revenue tied to parallel flows caused by flows from another market, such as Markets+.
“Is this congestion revenue assigned to the BAA in which the constraint occurs, or is this congestion revenue returned to the market participant in the other market?” the CPUC said. “If the latter, has CAISO initiated these conversations with Markets+? Or does EDAM simply keep these congestion revenues?”
EDAM Benefits Approach Drafted
Separately, CAISO on Jan. 20 published its draft methodology for how the ISO and EDAM participants will estimate EDAM’s gross economic benefits.
The draft proposes to calculate EDAM benefits based on production cost savings in the electric system with EDAM versus the cost of the system without EDAM.
For hydroelectric resources, the EDAM benefit methodology will use an adjusted bid value to calculate production costs of the resource. This is because hydroelectric market bids might include both the value of water and certain external limits, CAISO says. Some of these external limits include FERC minimum flow requirements, recreational reservoir levels and forecast reservoir level targets, the draft says.
Estimating EDAM benefits does not require additional market tools or external data sources, and EDAM participants are not required to submit more data than what they already submit in a market run, the report says.
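Conceptually, the draft’s benefit metric reduces to the difference in total production cost between a market run with EDAM and a counterfactual without it, with hydro output costed at an adjusted bid value. A minimal sketch under that reading, using invented resource costs and dispatch quantities:

```python
# Illustrative production-cost-savings calculation:
# benefit = system production cost without EDAM - cost with EDAM.
# Hydro is costed at an adjusted bid value rather than its raw market bid,
# reflecting water value and external limits. All figures are invented.

def production_cost(dispatch_mwh, cost_per_mwh):
    """Total production cost given per-resource dispatch and unit costs."""
    return sum(dispatch_mwh[r] * cost_per_mwh[r] for r in dispatch_mwh)

costs = {"gas_ct": 85.0, "gas_cc": 55.0, "hydro": 12.0}  # $/MWh; hydro = adjusted bid

# Hypothetical dispatch for one interval, with and without EDAM (MWh).
dispatch_without = {"gas_ct": 200.0, "gas_cc": 600.0, "hydro": 400.0}
dispatch_with = {"gas_ct": 50.0, "gas_cc": 650.0, "hydro": 500.0}

benefit = production_cost(dispatch_without, costs) - production_cost(dispatch_with, costs)
print(f"Gross EDAM benefit for the interval: ${benefit:,.0f}")  # $8,800
```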
CAISO plans to finalize the benefits calculation methodology in the first quarter of 2026 before EDAM opens in May.
Members of NERC’s Standards Committee agreed to post two closely related standard authorization requests for stakeholder comment at their first meeting of 2026, where they also took up several other standards items.
Meeting via teleconference Jan. 21, the SC also approved the members of its Executive Committee, which by default includes the SC’s chair and vice chair — Todd Bennett of Associated Electric Cooperative Inc. and Troy Brumfield of American Transmission Co., respectively — along with three other SC members, each representing an industry segment different from one another’s and from the officers’. The EC’s role is to help set the agenda for the monthly meetings and to conduct urgent committee business as needed.
To fill out the EC for 2026, members elected Patti Metro of the National Rural Electric Cooperative Association, Terri Pyle of Oklahoma Gas & Electric and Venona Greaff of OxyChem Power.
Following the EC election, members discussed the two SARs under consideration for posting. Both propose to revise reliability standard PRC-006-5 (Automatic underfrequency load shedding) to account for the effect of distributed energy resources on UFLS programs.
One SAR, developed by NERC’s System Planning Impacts from DERs Working Group and endorsed by the Reliability and Security Technical Committee, aims to “mitigate the risk posed by various interpretations of the imbalance equation in PRC-006-5 [and] by DER tripping as a cause of adjacent UFLS relay action on armed feeders.” (See NERC RSTC Tackles Priority Projects in Quarterly Meeting.) The other is intended to “improve visibility of current [UFLS] effectiveness” and was reviewed but not endorsed by the RSTC.
After NERC Manager of Standards Development Sandhya Madan presented both SARs, Steve Rueckert, director of standards at WECC, asked whether the ERO was proposing for them to be handled by the same standard drafting team or separate teams. Madan explained that NERC would wait to see “what kind of comments we get from the industry” before deciding that; in response to a question from Exelon’s Claudine Fritz, Madan confirmed that the SC would have the final decision on whether to assign the SARs to the same team.
Following the discussion, members voted to approve both SARs for posting. Madan confirmed they will be posted for a simultaneous 30-day informal comment period; the start date has not been determined. Both SARs will also be considered low-priority projects.
Members next took up two proposals to appoint a chair, vice chair and team members to separate drafting teams. The first item concerned Project 2025-05 (Ride-through revisions), meant to modify PRC-029-1 (Frequency and voltage ride-through requirements for inverter-based resources) to account for IBRs equipped with choppers — equipment that protects offshore wind projects during grid faults.
NERC Manager of Standards Development Alison Oswald reminded attendees that the project is the subject of a FERC deadline mandating that the ERO file a new standard by Aug. 28. (See FERC Approves IBR Ride-through Standards.) The SC voted unanimously to approve the slate of candidates as proposed by the ERO.
However, committee members hit a snag on the next item, a proposal to appoint five supplemental candidates to the drafting team for Project 2022-05 (Modifications to CIP-008 reporting threshold). The item was held over from the committee’s previous meeting in December after members expressed confusion because the background material they were given about multiple candidates did not match the oral descriptions given in the meeting. Rather than try to sort out the confusion during that meeting, members had agreed to bring the item back in January.
This time, Fritz and Brandon Weese of American Electric Power observed that two separate candidates had mismatched information in the background material provided by NERC — specifically, the regions given for both varied in different sections of their biographies. To ensure “a well-balanced drafting team that’s transparent,” Bennett suggested postponing action for another month, observing that this project is also low priority. NERC Manager of Standards Development Jordan Mallory, who was presenting the proposal, agreed.
The final standards action, also presented by Mallory, involved updating BAL-007-1 to capitalize the first letters of the term “Near-Term Energy Reliability Assessments.” Because this change “would have no material impact on the end users of the” standard, it does not require the full standards drafting process. Members voted unanimously to approve the update.
The average U.S. consumer would have spent $6,000 more on utility bills over the past decade without national efficiency standards for appliances, according to a report from the Appliance Standards Awareness Project (ASAP).
The group released the report Jan. 21 as the U.S. House of Representatives was due to vote on H.R. 4626, which would let the Department of Energy unwind existing standards more easily while making it harder to adopt new ones in the future. Appliance standards have become part of the culture war in recent years, with new ones much more likely to be issued when a Democrat is in the White House, while Republicans largely oppose them.
“These standards have kept utility bills far lower than they would have been,” ASAP Deputy Director Joanna Mauer said in a statement. “If the efforts from Congress and the administration to weaken the standards succeed, families and businesses could see significant increases in costs. Rollbacks are completely misguided, especially at a time when bills are already unaffordable for many people.”
H.R. 4626 — dubbed the Don’t Mess with My Home Appliances Act — was introduced by Rep. Rick Allen (R-Ga.) and cleared the House Energy and Commerce Committee in December.
“Over the last several years, DOE has gone beyond its scope of statutory authority by finalizing rules that do not meet the specific statutory criteria,” Allen said in a statement. “These egregious appliance standards have caused homeowners to spend 34% more on appliances than they did 15 years ago, while having to replace them at a faster rate.”
According to Allen’s office, the bill would eliminate the requirement for DOE to review and update standards every six years, establish a new process that would allow them to be reviewed and amend the criteria for determining whether a new standard is justified.
“The utility bill savings from more efficient appliances and equipment significantly outweigh any increase in purchase price,” the report said. “For all existing standards finalized by DOE since 2008, we estimate that the savings outweigh the costs by more than a factor of three.”
Since the first standards were established, refrigerators have used 50% less electricity, air conditioners 40% and lightbulbs 85%. Additional savings are possible, with ASAP’s report saying air conditioners could be 10 to 15% more efficient, and heat pumps are available that beat 2024 efficiency standards by 20 to 40%.
As part of a broader regulatory rollback, DOE announced plans last May to eliminate or reduce 47 “burdensome and costly” standards.
“Any actions that roll back existing standards or threaten DOE’s ability to set improved ones would raise costs for consumers and businesses and increase electricity demand,” ASAP says in its report.
The report found that businesses across the country would have spent $330 billion more on utility bills over the last decade, while households would have spent $780 billion more.
“In many parts of the country, electricity prices have risen faster than inflation in recent years due to factors including utility investments in transmission and distribution infrastructure, extreme weather and wildfires, and natural gas price fluctuations,” the report says. “A recent analysis found that rising energy bills are driving more households deeper into debt, with the average overdue balance on utility bills increasing by 32% between 2022 and 2025.”
While load growth has returned as a major focus for the industry, the report notes that electricity demand was relatively flat between the mid-2000s and 2021, due in part to federal efficiency standards for appliances.
“Absent existing efficiency standards, total U.S. electricity consumption would have been 14% higher in 2025,” the report says. “Summer peak electricity demand would have been 115 GW higher — roughly double the power demand of all data centers in the United States in 2025.”
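A quick back-of-the-envelope check puts those counterfactual figures in context. The 4,000-TWh consumption total below is an assumed round number for illustration, not a figure from the report; the 14% and 115-GW figures are the report’s.

```python
# Rough check of the report's counterfactual. The 2025 consumption total is an
# assumed round number, not a figure from the ASAP report.
actual_consumption_twh = 4_000                      # assumed 2025 U.S. total, TWh
counterfactual_twh = actual_consumption_twh * 1.14  # "14% higher" without standards
print(f"Consumption avoided by standards: ~{counterfactual_twh - actual_consumption_twh:.0f} TWh")

# "Summer peak ... 115 GW higher -- roughly double the power demand of all
# data centers" implies 2025 data center demand on the order of:
print(f"Implied 2025 data center demand: ~{115 / 2:.0f} GW")
```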
Power consumption is forecast to grow 25% by 2030 compared to 2023, while peak demand could rise 14%. Any weakening of efficiency standards would increase electricity consumption and peak demand when the grid is already stressed, ASAP said.
Data centers bring new regional planning challenges for the Northwest Power and Conservation Council’s upcoming power plan, the organization said during a recent meeting.
For NWPCC, data centers will guide part of the resource recommendations in the council’s upcoming Ninth Power Plan. The council is required to develop a plan for the region and the Bonneville Power Administration under the Northwest Power Act of 1980 “to ensure an adequate, efficient, economical and reliable power supply for the region.”
“Data centers are part of our regional load growth,” Jennifer Light, director of power planning at NWPCC, said during the meeting. “We have to grapple with it. It is part of the regional plan that we’re going to need to plan for. And so we will be needing to address this large load in our plan in some way, in terms of resource recommendations, as well as other supporting recommendations.”
A challenge in planning for data centers is uncertainty over how much of the projected load will materialize, Light noted.
NWPCC addresses that forecasting uncertainty in the upcoming plan in part by considering three trajectories of data center load growth: low, mid and high.
The low forecast assumes a slowing of current trends, the mid forecast a continuation of those trends, and the high forecast reflects utilities’ and BPA’s growth expectations, according to NWPCC’s presentation slides.
Under the high forecast, energy use is expected to reach more than 6,000 aMW by 2045, according to the slides.
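The council expresses loads in average megawatts (aMW), meaning energy averaged over a year, so the high-case figure converts to annual energy with simple arithmetic (the 6,000 aMW comes from the slides; the conversion itself is standard).

```python
# Convert the high-forecast data center load from average megawatts (aMW)
# to annual energy. 1 aMW equals 1 MW sustained for a full year.
HOURS_PER_YEAR = 8_760
high_case_amw = 6_000  # "more than 6,000 aMW by 2045," per NWPCC's slides
annual_energy_twh = high_case_amw * HOURS_PER_YEAR / 1_000_000  # MWh -> TWh
print(f"High case: ~{annual_energy_twh:.0f} TWh/year of data center load by 2045")
```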
“We want to make sure we have a robust strategy for the region and for Bonneville regardless of where this data center load comes,” Light said.
Under the Power Act, data centers are considered new large single loads and will receive different treatment by BPA. Data center loads are not eligible for BPA’s preference rates, such as Tier 1 or Tier 2 rates, according to NWPCC’s presentation slides.
Tier 1 “non-slice” contracts represent most of BPA’s power sales. “Non-slice” refers to a type of contract in which the customer is guaranteed a specified volume of energy regardless of conditions on the hydro system; in contrast, total volumes delivered to “slice” customers can vary based on availability. (See BPA Triggers $40M Surcharge Following Low Water Years.)
If requested by BPA’s utility customers, the agency can sell power to data centers under its New Resource Rate, but the timeline from request to delivery includes a study, which can take multiple years, to ensure BPA can acquire and transmit power to serve those large loads.
BPA’s current utility customers have long-term contracts that guarantee their access to BPA’s existing system. Any power BPA provides under the New Resource Rate would be priced at the much higher cost of new acquisitions, power that would also be available from other providers, likely with more flexible terms than BPA can offer under its statutes.
“I do not expect much, if any, of this data center load growth to go onto Bonneville’s obligation,” Light said.
“This doesn’t mean Bonneville might not have actions they can do in support of regional efforts,” Light said. “But for the obligation piece, the data center load is probably not going to Bonneville, at least the vast majority of it.”
Looming Shortfalls
However, NWPCC will need to address data centers in the broader regional strategy, focusing on cost-effective resources to meet load growth along with other supporting recommendations, such as siting considerations, resource sharing and transmission constraints, Light noted.
The discussion follows a September 2025 study on Northwest resource adequacy by Energy and Environmental Economics (E3) that found “accelerated load growth and continued retirements create a resource gap beginning in 2026 and growing to 9 GW by 2030” and that “load growth and retirements mean the region faces a power supply shortfall in 2026.” (See 9-GW Power Gap Looms over Northwest, Co-op Warns.)
In an effort to address the costs associated with data centers, Oregon lawmakers passed House Bill 3546 to create a separate customer category for large energy users, such as data centers, and require those users to pay a proportionate share of their infrastructure and energy costs. Governor Tina Kotek signed the bill into law in June 2025. (See Oregon House Passes Bill to Shift Energy Costs onto Data Centers.)
The law defines a large energy use facility as one that uses more than 20 MW. It applies only to Oregon’s investor-owned utilities.
On Jan. 20, Kotek launched a statewide Data Center Advisory Committee, tasked with developing policy recommendations to address challenges associated with the growth of data centers across Oregon. (See Oregon Gov. Appoints Group to Address Data Center Growth.)
The latest in a series of Union of Concerned Scientists (UCS) reports on the costs of the AI boom asserts that powering U.S. data centers with clean energy would avert trillions of dollars in health and environmental costs.
The report, issued Jan. 21, warns about the public potentially paying twice for the massive data center buildout many observers expect — first to cover the cost of new grid infrastructure, then a second time for the negative impacts of that infrastructure if it relies heavily on fossil fuels.
“Overall, our modeling demonstrates that clean and renewable energy can meet the challenge of load growth from data centers, but policymakers must be proactive to protect our health, environment and financial interests,” Director of Energy Research and Analysis Steve Clemmer said in a news release announcing the study.
“Data Center Power Play: How Clean Energy Can Meet Rising Electricity Demand While Delivering Climate and Health Benefits” does not present green power generation as a direct cost savings — decarbonizing the power sector would instead increase U.S. wholesale electricity costs $412 billion or 7%, it says.
Restoring federal clean energy tax credits to the levels in the Inflation Reduction Act would shift the cost off ratepayers and lower electricity costs by $248 billion or 4%, the authors write.
Both figures would be dwarfed by the $8 trillion to $13 trillion in cumulative global climate benefits and the $120 billion to $220 billion in cumulative health benefits expected to result from decarbonization by 2050, the authors write.
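Lining up the report’s headline figures makes the scale difference plain. The simple netting below is arithmetic on the reported ranges only; it ignores discounting and timing and mixes global climate benefits with U.S. costs.

```python
# Compare cumulative cost and benefit ranges through 2050, in billions of
# dollars, using figures from the UCS report. The netting ignores discounting
# and who bears each cost.
decarbonization_cost = 412           # added wholesale electricity cost
tax_credit_savings = 248             # savings if IRA tax credits are restored
climate_benefits = (8_000, 13_000)   # $8T-$13T global climate benefits
health_benefits = (120, 220)

low_net = climate_benefits[0] + health_benefits[0] - decarbonization_cost
high_net = climate_benefits[1] + health_benefits[1] - decarbonization_cost
print(f"Net benefit range: ~${low_net:,}B to ~${high_net:,}B")
print(f"Restoring IRA credits would instead lower wholesale costs by ~${tax_credit_savings}B")
```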
Data centers’ impact on U.S. health care costs is projected for 2035 and 2050 under different growth and policy scenarios. | Union of Concerned Scientists
That is a tradeoff the current crop of federal policymakers appears unlikely to make, but the report also calls on state policymakers to act to avoid the financial, health and environmental harms associated with unchecked growth of data centers.
The analysis looks at three scenarios: the current policy landscape created by the Trump administration and its allies in Congress; restoration of electricity tax credit provisions of the Inflation Reduction Act; and a national effort to reduce the power sector’s carbon-dioxide emissions 95% by 2050. It uses a midlevel assumption for data center demand growth but adds a no-demand growth comparison to isolate the impacts of data centers.
The study estimates that:
Data center demand could increase from 31 GW in 2023 to 78 GW in 2030 and 140 GW in 2050.
More than 90 GW of new gas-fired capacity would be added by 2035 and 335 GW by 2050 under Trump administration policies.
Coal-fired generation as a percentage of the whole would decrease by 2035 and 2050 under all three scenarios.
Illinois, Michigan, Wisconsin
The report is accompanied by a technical appendix, as well as breakouts drilling down on projected impacts in three Upper Midwest states.
“Looking collectively at our findings for Illinois, Michigan and Wisconsin, it’s clear that strong, foundational state clean energy policies are helpful for confronting the large — yet highly uncertain — data center-driven growth in electricity demand,” said James Gignac, UCS Midwest policy director. “But without careful, specific attention by state policymakers and regulators to data centers, the rapid rise in the need for power leads to increased costs and pollution.”
UCS noted Illinois has strong power sector decarbonization policies but said more is needed, because without further policy protections, data center-driven load growth could put Illinois at risk of $24 billion to $37 billion in additional electricity system costs.
It said data centers could account for up to 72% of electricity demand growth in the state by 2030. But current policies will increase the use of in-state fossil fuel plants and increase reliance on out-of-state generation, UCS said.
Michigan too has enacted significant clean energy legislation in the 2020s, UCS said, but lawmakers left a loophole: The restrictions apply only to electricity sold within the state, not to energy generated in-state but exported to other states.
UCS recommended that Michigan close that loophole.
It also said Michigan’s electricity demand could nearly double by 2050, with data centers accounting for up to 38% of the increase, but said those estimates were highly speculative. It called for greater transparency from data center developers and flexible utility planning.
UCS said Wisconsin does not have comprehensive clean energy policies in place but is experiencing a data center development boom. Current policies would prompt large increases in natural gas generation with accompanying increases in carbon emissions.
The generation technology mix expected to power U.S. data centers is shown under three policy scenarios. | Union of Concerned Scientists
UCS recommended that Wisconsin commit to clean energy policies now; adopt ratepayer protections that require data centers to bring their own clean energy generation; be transparent about decision making; and impose integrated resource planning requirements on utilities.
It did note movement in Wisconsin’s statehouse in 2025 on measures intended to reduce economywide carbon emissions and protect consumers from the costs and carbon emissions of new data centers.
As of 2023, coal- and natural gas-fired power plants provided 75% of Wisconsin’s in-state generation, UCS said.
Methodology
The UCS analysis uses the Regional Energy Deployment System, a least-cost electricity planning and dispatch model developed by what was then known as the National Renewable Energy Laboratory, and it relies on electricity demand projections developed by Evolved Energy Research for its Annual Decarbonization Perspective 2024 report.
The concept of flexible data center demand — recently gaining much attention as a means of reducing peak load and the need for new infrastructure — was not included in the analysis, nor were the impacts of recent market and policy changes, such as new tariffs, rising gas turbine costs, natural gas price volatility and federal roadblocks to renewable power development.
New nuclear capacity — another priority of the Trump administration — was not modeled because of its high cost. Also not modeled was Big Tech’s interest in paying above-market prices to restart or uprate existing nuclear plants, or in building new reactors to power data centers.
Finally, the authors note that the number of data centers to be built and the amount of electricity they will consume are both highly uncertain. Also unknown are technology advances that may change the energy profile of future data centers.
Oregon Gov. Tina Kotek has appointed a new committee intended to help address the effects of the rapid growth of new data centers in the state — with a particular focus on the electricity system.
The new Data Center Advisory Committee will be “tasked with developing a set of policy recommendations and actions to address issues of statewide significance associated” with that growth, according to a Jan. 20 press release from the governor’s office.
“Oregonians have made their concerns about rising utility bills clear. As our state faces rapid growth of data facilities, we must have frank conversations about the challenges and opportunities ahead,” Kotek said in a statement. “I expect the Data Center Advisory Committee to help ensure economic growth while protecting affordable power and Oregon’s critical water resources.”
The committee’s overarching goal: to come up with recommendations that help Oregon “take strategic advantage of the economic development opportunity” created by new data centers and other large industrial consumers of electricity “while striving to keep utility costs, infrastructure upgrades and environmental impacts sustainable for all Oregonians.”
The recommendations are due to the governor by October 2026.
The release reiterates Kotek’s phrasing, saying the committee will engage in a public process to understand the “challenges and opportunities” related to siting data centers and will develop recommendations that “support responsible economic development, create jobs and increase long-term revenue that will strengthen our rural communities,” which are the site of many large data center operations in Oregon.
The committee is also tasked with exploring how data center development affects — and can help — Oregon’s efforts to meet its climate, clean energy and natural resource management goals, including those related to water use. The press release points to a potential competition for water between agricultural users and data centers in rural areas.
The recommendations should also help the state “ensure data centers have reliable energy without burdening Oregon’s ratepayers,” the release said.
Kotek appointed as committee co-chairs Margaret Hoffmann, a member of the Northwest Power and Conservation Council, and Michael Jung, executive director of the ICF Climate Center.
“The challenges we currently face are complex,” Hoffmann said in a statement. “I look forward to working with my fellow committee members to understand how we can co-create a vision for Oregon that supports healthy economic development, affordable energy, natural resource abundance and a future in which all Oregonians can thrive.”
“To have been tapped by Oregon Gov. Tina Kotek to serve as a co-chair of the Data Center Advisory Committee is an honor that I humbly accept as a citizen volunteer,” Jung said. “The governor has assembled an experienced committee to recommend priorities and actions to chart a path that balances existing priorities and new opportunities.”
Other committee members include:
Dan Dorran, Umatilla County Commission chair;
Greg Dotson, associate professor and leader of the Energy Law and Policy Project at the University of Oregon;
Bill Edmonds, adjunct professor at the University of Portland;
Tim Miller, director of Oregon Business for Climate; and
Jean Wilson, operating partner at Sandbrook Capital.
Oregon lawmakers in 2025 passed a law directing the state’s Public Utility Commission to create a new retail rate class for large electricity consumers such as data centers in an attempt to shield residential ratepayers from the costs incurred to integrate those loads. (See Oregon Governor Signs Bill to Create Data Center Rate Class.)
Less than five years ago, California’s Lake Oroville was so empty its 644-MW Edward Hyatt Power Plant, a pumped-storage hydroelectric plant, was shut off for the first time since it came online in 1967. The state was in a drought so long and severe that many areas had water restrictions, most reservoirs were at or near record lows, nearly 400,000 acres of farmland were idled, and close to $1 billion worth of crops were lost.
Almond trees in general, and the billionaire couple who own The Wonderful Company specifically, found themselves on the public enemies list again after a 2016 Mother Jones exposé detailed their substantial water use. But neither almonds nor pomegranate juice, nor even billionaires, were mostly to blame. La Niña deserved most of the blame for the drought that gripped the West, and climate change exacerbated it.
The hydropower shutoff was just one of the energy-related impacts of the drought. Throughout this extreme dry spell, the water-energy nexus was laid bare.
Today, we need to think of drought as more than an agricultural or wildfire-risk problem; it’s a systemic threat to the electric grid. Drought, like other weather extremes, undermines supply, drives up costs and exposes weaknesses in our infrastructure planning.
When it Rains, it Pours (And When it Doesn’t…)
The irony of researching this article the same month California was declared drought-free for the first time in 25 years is not lost on me. When I returned to Northern California from my holiday break, it was raining. Hard. With the 101 freeway closed in both directions thanks to the storm coinciding with a king tide, my last piece on sea level rise seemed relevant. But drought? It was far from top of mind.
Compared to other climate extremes covered in this series, I assumed drought’s impact on the grid would be both obvious and tangential. More drought means less hydropower generated. Hardly a story. But digging deeper, it’s clear that drought should be thought of as a problem multiplier when it comes to our energy system.
As climate change progresses, we have to build a grid that can handle heat waves, wildfires and extreme drought, as well as extreme precipitation and sea level rise. It’s a complex challenge that will get more urgent as climate swings become more extreme.
The Tangled Water-energy Nexus
The intersection of water and energy is, to put it mildly, complex.
Water is used to generate electricity directly (hydropower), is heated into the steam that spins turbines (thermoelectric plants), cools that steam as it leaves the turbine and is pumped underground to capture geothermal energy. Water also acts as a battery in pumped-storage hydroelectric plants, which pump it uphill during off-peak hours for later release.
As renewables have produced a larger portion of the electricity supply, the dependence on water-cooled thermoelectric plants has declined. | EIA
Water also consumes energy when it’s pumped up from aquifers, conveyed from one location to another, treated so it’s potable and heated for those long showers we love. In California, those uses consume at least 12% (possibly as much as 19%) of all electricity on the grid. Nationally, it’s lower, with 4% of total electricity generated used for drinking and wastewater services.
Drought affects the physical grid, too. When aquifers are pumped out, the ground above can subside as the pockets of ground that had held water collapse. In California’s San Joaquin Valley, decades of drought-driven aquifer pumping have caused land to sink by a foot a year, damaging roads, pipelines and overhead utility infrastructure.
Then there are the less obvious intersections. Water moves inputs for the energy system, such as the coal barges on the Mississippi: 11% of all coal used by power plants is delivered by barge, and under this administration, coal-fired plants are being revived. And in the booming data center sector, rising electricity demand is closely correlated with rising water demand, creating stress on both systems.
Drought as a Problem Multiplier
With water woven so tightly into the energy system, drought becomes an electricity problem too.
The most obvious impact of drought is a decline in hydroelectric output. Conventional hydroelectric plants in the U.S. contribute about 6% (240,000 GWh) of total U.S. utility-scale electricity generation. Pumped-storage hydro adds 23 GW of storage capacity.
For hydro asset owners, drought has a real cost: A 2024 study found the sector lost roughly 300 TWh of production and $28 billion of revenue over the 2003-2020 period. Droughts have worsened since that study window closed. In the 2022-23 water year, Western U.S. hydropower output was the lowest since 2001.
Drought also has cross-border trade implications. The U.S. typically imports electricity from Canada, where hydroelectric plants generate more than 60% of the electricity. In September 2023, drought in Canada reversed that flow, making Canada a net importer for five of the following nine months. While Canadian imports account for less than 1% of U.S. electricity, the exchange plays an important role in grid balancing.
The Drought-demand Spiral
Drought increases energy demand. In the agricultural sector, more energy is used to pump water from aquifers to irrigate crops. Despite a water shortage, some end users, such as golf courses, will increase their water use, meaning more water is treated and pumped before it reaches the green. And droughts often coincide with hotter weather, when demand for air conditioning rises.
Drought affects U.S.-Canada energy trade. Normally, Canada is a net exporter thanks to its large hydro sector, but drought reversed the direction of flow as Canada’s generation fell. | EIA
Droughts also increase the risk of wildfires, with bone-dry vegetation more vulnerable to any spark from the grid. And even if the grid’s not the cause of the fire, fighting those fires draws on water supply. The devastating Palisades fires a year ago were exacerbated when electric utilities de-energized lines, leaving water utilities unable to pump enough water to keep the fire hydrants at full pressure. Even without the outage, the unprecedented demand on the hydrant system would have been almost impossible to meet, exposing the need to provide battery backup to critical infrastructure, including water pumps.
Water as a Power Plant Input
It’s misleading to say energy is one of the largest users of water, in much the same way it’s misleading to say wind farms take up massive amounts of farmland. Water used in energy production largely continues its sea-bound journey after use, though warmer than before, just as farmland keeps producing while cattle graze under the wind towers.
Power plants that use water to cool steam are affected by drought only when there’s not enough water to withdraw. Thermoelectric power plants, including coal, natural gas, nuclear, oil and biomass, are becoming both a smaller part of the nation’s electricity supply and more water-efficient. Wind and solar generators have grown from 4% of total utility-scale generating capacity in 2010 to 18% by 2021, so the portion of our power system dependent on water has fallen.
Still, thermoelectric power plants, almost all of which depend on water, provide about three-quarters of the electricity on the grid. They cool the steam from their turbines in one of three ways: once-through, where water is taken from rivers, lakes and aquifers and released back hotter; closed-loop or wet-recirculating, which reuses the water once it’s literally let off some steam in cooling towers; or dry-cooling, which uses air to cool the steam.
Most of the water withdrawn by once-through systems is discharged back where it came from, so it does not contribute to the drought. These plants can still be affected, however, if the river or reservoir they draw from is depleted, or is too low to permit the volume of warmed water to re-enter the natural water system. In 2022, the Jim Bridger coal-fired power plant in Wyoming was at risk of being shut off as the Green River it draws from ran low.
Closed-loop thermoelectric plants are less affected by drought but draw more water from the system to replenish the amount that evaporates. Dry-cooling plants, which are relatively rare, use the least water, but at the cost of power plant efficiency.
The EIA reports that thermoelectric plants are becoming more efficient in their water use. “The sector’s water-withdrawal intensity — the amount of water withdrawn per unit of electricity generated — continued to fall, declining 2.1% from 11,849 gallons/MWh in 2020 to 11,595 gal/MWh in 2021.” Pushing against that trend, the rise in data center energy demand may increase the power sector’s total water demand even if the gallons/MWh declines.
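The reported intensity decline checks out arithmetically; the snippet below just recomputes the percentage from the two EIA figures.

```python
# Verify the EIA's reported decline in water-withdrawal intensity.
intensity_2020 = 11_849  # gallons withdrawn per MWh generated
intensity_2021 = 11_595
decline_pct = (intensity_2020 - intensity_2021) / intensity_2020 * 100
print(f"Decline: {decline_pct:.1f}%")  # ~2.1%, matching the EIA figure
```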
Moving Toward a Lower-water Grid
Of course, the easiest way to reduce the grid’s reliance on water is to adopt generation technologies that require little or none. Solar and wind are obvious candidates, as are some types of geothermal and combined heat and power (CHP), which need little to no water.
Geothermal technologies’ water needs vary; binary-cycle plants’ closed-loop systems require none aside from the water used in the initial drilling process. Some technologies that do use water, such as Fervo Energy’s enhanced geothermal systems, can run on degraded or brackish water that could not be used for agriculture or other purposes.
CHP systems create efficiencies by capturing the energy from the power plant’s steam to provide heating, hot water or chilled water for facilities. They are highly water-efficient, though they’re generally limited to smaller power plants co-located with facilities or areas with district heating.
Policy that Prepares for Drought
Regulators, operators and asset owners need to prepare for a drier (and hotter, and wetter, and stormier) future. And that begins with assessing risk.
After the past few years of real-life experience, the West understands the impact of drought well. In other areas, even those that haven’t experienced drought in the past, modeling its potential impact on reliability and reserve margins is important if the industry is to prepare for the future.
One example: A study of the PJM and SERC regions’ generation capacity found that if the area suffered a drought equal to “the 2007 Southeastern summer drought … the usable capacity of all at-risk power plants may experience a substantial decrease compared to a typical summer, falling within the range of 71 to 81%.”
The energy-water nexus must be considered in energy policy and pricing: low electricity tariffs have made it affordable for farmers to pump scarce groundwater from aquifers that do not recharge as quickly as they are being drawn down, especially during periods of drought. And there’s untapped potential, pun intended, for incentivizing farmers to pump irrigation and well water at off-peak times or connect automated irrigation pumps to demand-response programs.
The question is whether utilities have any incentive to support policies that will cut the energy used by the water system. Until policies reward water-efficient moves by the energy industry, moving all of that water will continue to consume much of the electricity we produce, and keep the industry vulnerable to droughts.
Different Problem; Similar Solution
As with floods, fires and other extreme events exacerbated by climate change, preparing for drought requires building a more flexible and resilient grid. Islandable microgrids, more energy storage, stronger infrastructure and diversified generation sources all help stabilize the grid, whether facing a long-duration challenge like a drought or an immediate emergency like a flood.
Extreme weather events require thinking, and collaborating, outside the electric box. No single industry can prepare alone. The sooner states and regions put together integrated plans for these climate extremes, plans that include all of the energy system’s players along with those in charge of water, transportation and every other piece of critical infrastructure, the better we’ll be able to cope with the next extreme weather event.
A report from the American Clean Power Association (ACP) argues that slowing down renewable development in PJM could cost ratepayers $360 billion over the next decade.
The analysis, released Jan. 21, compared a base case, in which wind, solar and storage development follows current expectations and reaches 137 GW of nameplate capacity by 2035, with a scenario in which only projects already under construction or legally mandated are built. Given the growth expected in PJM’s 2025 Load Forecast, the report finds that without that renewable buildout, the RTO would increasingly rely on aging fossil fuel resources and imports, whose dispatch would rise by 20% and 292%, respectively.
West Virginia would see the largest residential bill increase over the next decade, at $8,500 for a typical customer, followed by Ohio and Pennsylvania at $6,500 and $6,400, respectively. Illinois and D.C. would see the smallest increases, at $3,200 and $2,900.
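Collected in one place, the state-level figures from the analysis look like this; the per-year column is a straight average over the decade, not a number from the report.

```python
# Cumulative ten-year bill increases for a typical residential customer under
# the no-new-renewables scenario, as reported by ACP. The per-year figure is a
# simple average over the decade, added here for scale.
ten_year_increase = {
    "West Virginia": 8_500,
    "Ohio": 6_500,
    "Pennsylvania": 6_400,
    "Illinois": 3_200,
    "District of Columbia": 2_900,
}
for state, total in ten_year_increase.items():
    print(f"{state}: ${total:,} over ten years (~${total / 10:,.0f}/year)")
```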
“These findings make clear that delaying clean energy deployment comes at a steep cost,” Senior Vice President of Markets and Policy Analysis John Hensley said in a statement. “Timely investment in wind, solar and energy storage is essential to maintaining reliability, reducing dependence on imports, and protecting families and businesses from sharply higher electricity bills as demand continues to grow.”
Hensley told RTO Insider the impact of a slowdown in renewable development would come in three areas: rising rates, diminished reliability, and the economic impact of data centers and manufacturing facilities siting outside of PJM as a result.
While 50 GW of new gas generation are included in the analysis, the report says that efforts to push for more resources would quickly lead to higher costs so long as turbine availability remains constrained.
“This reliance on imports and gas peaking units increases exposure to fuel price volatility, drives more high-priced hours, and heightens reliability risks during peak demand periods,” the report says.
Hensley told RTO Insider he views gas as playing a role in meeting the reliability challenges posed by rapid load growth in the coming years, but there is a timing disconnect with how long those resources take to construct. Renewables have strong supply chains allowing for rapid construction.
He said the starting point for efforts to bring more generation online should be a non-discriminatory approach that recognizes the contributions of all technologies. He pointed to PJM’s effective load-carrying capability model for determining the capacity contribution for different resource classes.
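ELCC measures a resource’s capacity value as the additional load a system can carry at the same reliability level once the resource is added. The toy calculation below illustrates that idea with an invented fleet, outage rates and profiles; PJM’s actual ELCC framework is far more detailed.

```python
# Toy ELCC-style calculation: how much extra load can the system serve, at the
# same count of shortfall hours, after a 20 MW solar plant is added? The
# system, outage rates and profiles are invented for illustration only.
import random

random.seed(0)
HOURS = 2_000
loads = [100.0 * random.uniform(0.6, 1.0) for _ in range(HOURS)]  # MW

# Thermal fleet: four 30 MW units with a 7% forced outage rate, sampled once
# per hour so every reliability count below sees the same outage draws.
fleet_available = [
    sum(30.0 for _ in range(4) if random.random() > 0.07) for _ in range(HOURS)
]
solar = [20.0 * random.uniform(0.0, 0.9) for _ in range(HOURS)]  # hourly MW output

def shortfall_hours(load_add=0.0, include_solar=False):
    """Count hours in which available supply falls short of load."""
    return sum(
        1
        for h in range(HOURS)
        if fleet_available[h] + (solar[h] if include_solar else 0.0)
        < loads[h] + load_add
    )

base = shortfall_hours()
# ELCC: the largest load addition the solar-augmented system can carry while
# staying at or below the base system's shortfall count (crude 1 MW search).
elcc = 0.0
while shortfall_hours(load_add=elcc + 1.0, include_solar=True) <= base:
    elcc += 1.0
print(f"Approximate ELCC of the 20 MW solar plant: {elcc:.0f} MW")
```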
The No Clean Power scenario assumes states end their renewable portfolio standards and no renewable energy credits are available, while the base case includes tax credits being available in full for wind and solar through 2030 and for storage through 2032. Hensley said the base case assumptions about renewable development were based on projects in PJM’s interconnection queue, an ACP database of renewable projects, Energy Information Administration data and projections from organizations such as Bloomberg and S&P Global.
PJM’s 2026 Load Forecast tamped down the expected growth over the next five years, though peaks are still expected to increase from 160 GW in 2027 to 191 GW by 2031. By 2046 the summer peak is expected to reach 253 GW. (See Pessimistic PJM Slightly Decreases Load Forecast.)
The RTO’s two transition cycle queues include 1,669 MW of wind, 7,051 MW of storage, 17,075 MW of solar, 1,503 MW of nuclear and 5,460 MW of gas, according to its planning webpage. There are 27,537 MW of solar under construction, as well as 3,876 MW of storage, 8,059 MW of wind, 5,796 MW of gas and 2,930 MW of hybrid resources.
AUSTIN, Texas — ERCOT says there is “broad agreement” from stakeholders that the grid operator’s batch-based approach for interconnecting large loads is necessary.
Jeff Billo, ERCOT vice president of interconnection and grid analysis, told the Texas Public Utility Commission during its Jan. 15 open meeting that ERCOT has only begun to engage stakeholders on the batch process but that a couple of themes have already stood out (59142).
“Everyone that we have talked to so far has been supportive of us moving to a batch study process and moving away from the current process,” Billo told commissioners. “I think one of the reasons is … that there is a lot of uncertainty in the current process. We have this issue today where loads go through the study process, and then something happens — maybe another load in their neighborhood moves forward and meets their financial commitment obligations and that load is not included in the other project study … and we’re kind of caught in this restudy loop for a lot of these projects.”
Other themes outlined by Billo included: uncertainty in the current process creating risk for developers of existing interconnection requests; transparency and consistency in the batch process; and aligning the process with ERCOT’s transmission-planning work.
ERCOT CEO Pablo Vegas unveiled the draft process in December, calling the wave of large loads looking to interconnect “fairly unprecedented.” The grid operator had 63 GW of interconnection requests from large loads at the end of 2024. That number has mushroomed to 232 GW as of January, according to staff’s latest data. (See ERCOT Again Revising Large Load Interconnection Process.)
With the batch process, ERCOT will group together large-load requests to be evaluated, rather than rely on the current individual studies that transmission service providers conduct. The batch studies will determine the amount of requested load that can be reliably served each year over a six-year period and the transmission upgrades needed to accommodate the full load requested.
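In rough terms, a batch study allocates each year’s reliably servable headroom across the grouped requests instead of studying them one at a time. The sketch below illustrates that grouping logic with invented request sizes and annual limits; ERCOT’s actual study criteria are still being designed.

```python
# Toy sketch of a batch-style evaluation: for each year of a six-year window,
# admit as much requested large load as an assumed annual "reliably servable"
# limit allows. Request sizes and limits are invented, not ERCOT data.
requests = [  # (name, MW requested, requested in-service year)
    ("DC-1", 500, 2027),
    ("DC-2", 900, 2027),
    ("Hydrogen-1", 300, 2028),
    ("DC-3", 1_200, 2029),
]
servable_limit_mw = {2027: 1_000, 2028: 800, 2029: 1_000, 2030: 1_500, 2031: 1_500, 2032: 1_500}

remaining = dict(servable_limit_mw)
schedule = {}
for name, mw, year in sorted(requests, key=lambda r: r[2]):
    # Slot each request into the first year with enough remaining headroom.
    start = next((y for y in sorted(remaining) if y >= year and remaining[y] >= mw), None)
    if start is not None:
        remaining[start] -= mw
        schedule[name] = start
    else:
        schedule[name] = "needs transmission upgrades"
print(schedule)
# {'DC-1': 2027, 'DC-2': 2029, 'Hydrogen-1': 2028, 'DC-3': 2030}
```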
The grid operator says a “Batch Zero Study” will likely be needed to transition from the current process, which was formally documented only in December through a revision to the Planning Guide. That study will set a foundation and baseline for future studies, which could happen several times a year for several years.
Billo said the first batch study will break the cycle of restudies and “get those projects out” without creating new uncertainty. It will take up projects already under ERCOT review, currently totaling about 7.4 GW.
“We are still really early in the process of designing how that batch study would work, but we hope to bring more details on that in the coming weeks,” he said.
ERCOT staff plan to use the Large Load Working Group as a stakeholder engagement forum and to update the Technical Advisory Committee and the PUC at their next regularly scheduled meetings. During a Jan. 21 discussion with TAC, Billo deferred most questions to a Feb. 3 workshop on the batch process.
“We’re going to get through everything that we need to get through that day,” Billo promised TAC members. “We will lay out as many details on that framework as we can … [understanding] that the framework will be in pencil. We want the stakeholder feedback.”
A second batch-process workshop is tentatively scheduled for Feb. 12.
“And then our homework is due to the commission,” Billo said, pointing to the PUC’s Feb. 20 open meeting.
Colorado regulators have declined to reconsider their decision finding that it would be in the public interest for Public Service Company of Colorado (PSCo) to join SPP’s Markets+.
The requests for reconsideration came from Western Resource Advocates, Advanced Energy United and Colorado Energy Consumers.
“After reviewing the [requests] filed by WRA, AEU and CEC, I still find that the majority’s initial decision is sufficiently supported within the record and based on policy considerations,” Commission Chairman Eric Blank said.
But the commission did agree to reverse its decision to direct PSCo to file an application to join an RTO or ISO, or request a waiver from doing so, by June 1, 2027 — two years earlier than the deadline set in commission rules.
State law requires electric utilities that own and control transmission facilities to join an organized wholesale market (OWM) by 2030. In its original decision, the commission had argued there was “a genuine potential for Public Service to conclude its efforts in organized wholesale market participation with SPP Markets+.” The earlier deadline would “help confirm that Public Service is moving towards its eventual participation in an OWM or is prepared to show why OWM participation is not in the public interest.”
But AEU argued in its reconsideration request that the earlier deadline would dramatically increase the odds that PSCo, an Xcel Energy subsidiary, would seek a waiver from RTO or ISO participation and that the waiver would be granted. Less data about the benefits of SPP’s RTO West, also known as the RTO Expansion, would be available by then, AEU said, and alternatives available through the West-Wide Governance Pathways Initiative would likely still be at an early stage.
Commissioners agreed and reset the deadline to June 1, 2029.
In another change to its earlier decision, the commission addressed concerns that PSCo would use the money spent on joining Markets+ as an argument against RTO participation. The commission directed PSCo to exclude sunk costs in the cost-benefit analysis of joining an RTO.
Public Interest Finding
Under commission regulations, transmission utilities that want to participate in a day-ahead market must demonstrate three things: The market must have protocols in place for greenhouse gas emissions tracking and accounting; it must have a plan to address seams issues with neighboring markets; and the expected benefits of joining the market must exceed costs, as shown by modeling and other analysis.
Parties that sought reconsideration said the GHG protocols and seams strategies for Markets+ are not fully developed. They said a Western Markets Exploratory Group study that analyzed costs and benefits was based on “flawed assumptions and outdated market footprint.”
WRA argued that the commission “should not make a public interest determination before requiring the company to evaluate participation in other markets,” including CAISO’s Extended Day-Ahead Market.
Blank pointed to a previous commission determination that utility participation in an energy imbalance market, a day-ahead market, an RTO, a power pool or a joint tariff is generally in the public interest. The determination was based on a study commissioned to meet requirements of the Colorado Transmission Coordination Act of 2019.
But Commissioner Megan Gilman sided with the groups requesting reconsideration. She said she plans to again write a dissent to the commission’s decision.
“Plain and simple, the company failed to provide the evidence that met the public interest requirement in the rules,” she said.