October 31, 2024

‘Leaning’ Evident in BPA Response to NW Senators

CAISO’s adoption of the West-Wide Governance Pathways Initiative’s “Step 1” changes won’t overcome the Bonneville Power Administration’s objections to the governance of the ISO’s Extended Day-Ahead Market (EDAM), BPA Administrator John Hairston told U.S. senators from the Pacific Northwest.

“Our specific concern is that, with only Step 1 in place, the market governance remains under the ultimate authority of California,” Hairston wrote in an Aug. 21 letter to the senators, which has not yet been posted on the agency’s website.

Hairston’s comments were part of a broader response to a series of questions posed to him in a July 25 letter signed by Democratic Sens. Jeff Merkley (Ore.), Ron Wyden (Ore.), Maria Cantwell (Wash.) and Patty Murray (Wash.).

In their letter, the senators urged the agency to “act carefully and deliberately” before selecting a day-ahead market and to delay a “draft letter to the region” relaying its decision, previously slated for Aug. 29, until more developments play out around EDAM and SPP’s competing Markets+ offering. (See NW Senators Urge BPA to Delay Day-ahead Market Decision.)

The senators’ letter signaled a view shared by many Western state officials, public interest groups and large energy users — and some utilities — that the region would benefit most from a single organized electricity market that includes CAISO.

It also expressed concern that BPA staff in April issued a “leaning” recommending the agency choose Markets+ over EDAM, citing the SPP market’s independent governance and overall design as primary factors supporting the opinion. The senators directed the agency to answer 14 detailed questions to clarify the reasons behind the leaning. (See BPA Staff Recommends Markets+ over EDAM.)

Inadvertently or not, the senators got one wish: In his Aug. 21 response, Hairston said BPA would delay its market decision until next year, an announcement it later would relay to its stakeholders on Aug. 25, saying both markets have “outstanding issues that require additional analysis.” (See BPA Postpones Day-ahead Market Decision Until 2025.)

But Hairston’s Aug. 21 response to the senators clearly — and understandably — shows the fingerprints of the staff that produced the leaning. It also evinces continued concerns among some parties in both the Northwest and Southwest about a market arrangement that could be dominated by California and its interests.

In response to the senators’ question about which market BPA expects “will provide the greatest improvement in grid reliability in the Northwest,” Hairston cites the benefit of the Markets+ requirement that its participating entities also participate in the Western Power Pool’s Western Resource Adequacy Program (WRAP).

“The EDAM proposal’s lack of a common resource adequacy metric makes it difficult to assess whether the market or its participants will be resource adequate in the planning horizon for the market,” Hairston wrote, adding that California’s “state-mandated” RA metrics don’t align with WRAP requirements and that EDAM will accept non-California participants that haven’t committed to the WRAP.

Responding to another question about which market would do more to reduce greenhouse gas emissions from the Northwest’s electricity sector, Hairston said Markets+ has made progress in developing GHG tracking and accounting procedures that would allow BPA’s customer base of publicly owned utilities to meet Washington’s cap-and-invest program obligations and Oregon’s “non-pricing” carbon requirements.

“Our continuing concern with CAISO’s EDAM design is that California is able to deem a disproportionate share of carbon-free market-traded resources as delivered to California, to the disadvantage of utilities in the Northwest and their ability to meet their state goals,” he wrote.

Addressing a question about the impact on the Northwest grid from “seams” between two different markets, Hairston cited BPA’s previous experience using the Coordinated Transmission Agreement with CAISO to enable several of the region’s utilities to use BPA’s transmission system to participate in the ISO’s Western Energy Imbalance Market before the agency itself joined that market.

“Bonneville expects to undertake a similar exercise if necessary to manage day-ahead market seams,” he said.

Governance Still Key

But the issue of CAISO’s state-run governance was front and center in Hairston’s response to the senators — just as in the staff leaning.

“Bonneville seeks to participate in a market that has a durable, effective and independent governance structure [that] provides fair representation to all market participants and stakeholders,” he wrote.

Hairston described the choice as being between Markets+, with its independent board of directors, and EDAM, which would fall under the “shared authority” of the Western Energy Markets (WEM) Governing Body and the ISO Board of Governors “appointed by the governor of California.”

Hairston acknowledged the progress made by the Pathways Initiative in forcing movement on CAISO’s governance. The ISO and WEM boards last month voted to approve the Pathways plan giving WEM officials “primary authority” over WEIM- and EDAM-related market-rule decisions. (See CAISO, WEM Boards Approve Pathways ‘Step 1’ Plan.)

But his response to the senators’ question about that effort illustrated his skepticism around whether the “Step 2” plan for advancing a California bill to grant the WEM Governing Body “sole authority” over the EDAM would get traction or meet BPA’s requirements.

“While we appreciate the Pathways sponsors’ optimism for a positive outcome, such efforts have repeatedly failed to secure legislative approval. It also remains to be determined what legislative conditions and constraints will continue to impede an independent governance structure,” he wrote.

Pathways backers expect to begin working with California lawmakers on a bill this fall after the conclusion of the current session. They hope to get the bill introduced and passed during the 2025 session, which starts in January.

That bill might not progress in alignment with BPA’s new day-ahead market decision timeline. The agency now plans to release its draft decision in March 2025, followed by a final decision in May.

AEU Webinar Highlights Potential Queue Improvements

Speeding up the interconnection queues is becoming more important as demand growth and the retirement of existing generators combine to cut into reserve margins around the U.S., experts said during a webinar Sept. 4 hosted by Advanced Energy United.

Grid Strategies President Rob Gramlich summarized a recent report his firm co-authored with the Brattle Group for United and the Solar and Storage Industries Institute called “Unlocking America’s Energy: How to Efficiently Connect New Generation to the Grid.”

The report’s recommendations would give generation developers more certainty from the interconnection process, something they lack when they enter the queues and put up deposits to save a place in line, Gramlich argued.

“The developers really need certainty, or else, if you don’t have it, you’ll continue to have this queue churn and projects coming in and out in order to get information,” Gramlich said.

After putting up their deposits, projects end up in a study process that is usually lengthy; then getting them built can be delayed because of required transmission upgrades, Gramlich said.

Supply chain issues are delaying both the construction of network upgrades and sometimes the generators themselves, with PJM reporting that nearly 40 GW of projects have made it through its queue but are not in operation yet.

“There’s no single answer to that, but there’s maybe a few common challenges,” said Hannah Muller, senior director of external and market affairs at Clearway Energy Group. “One is supply chain; that’s both for the actual project equipment, but also the transmission infrastructure. There’s just years [of] delay in getting necessary equipment; it’s just a function of the global economy at this point.”

Other issues include permitting and community opposition to new infrastructure; FERC and the RTOs can speed up the interconnection process itself, but those other barriers fall outside their purview, Muller said.

FERC made changes to the baseline for queues with Order 2023, but the report and panelists argued that additional measures are needed. The commission is holding a two-day technical conference from Sept. 10 to 11 on that subject; Gramlich said it would be a good venue for helping to share best practices from the different regions.

In addition to certainty, the Grid Strategies report argues for quicker schedules and non-discrimination that guarantees a level playing field for “similarly situated interconnection customers.” Adopting an interconnection entry fee for proactively planned capacity would provide customers with significant interconnection cost certainty and address cost allocation of the upgrades identified. The report also suggests a fast-track process to use existing and already planned interconnection capacity that would prioritize the projects most ready to go live.

The interconnection study process also needs improvements to enable the fast-track process and make studies more efficient generally, the report said.

One promising way to improve the study process that is in its early stages is the use of artificial intelligence, said Kyle Davis, the Clean Energy Buyers Association’s senior director of federal affairs.

“SPP, Amazon Web Services, Pearl Street [Technologies] and NextEra [Energy] are testing out Pearl Street’s SUGAR platform to try and help organize and bring that machine-learning process to the interconnection queue analysis,” Davis said.

Testing so far indicates the technology can whittle what has historically taken three months down to as little as an hour, he added. The report discusses the pilot effort, and Pearl Street’s CEO, David Bromberg, is a witness at FERC’s technical conference next week.

The final set of modifications the report suggests is aimed at working through the transmission construction backlog and addressing the growing constraints that hinder network upgrades. The report noted that while it did not focus on proactive transmission planning, such planning is key to speeding up the queues; several transmission providers already are implementing it, and others are developing long-term planning processes to comply with FERC Order 1920.

Speeding up construction also would mean addressing supply chain issues, with which the Department of Energy could help, Gramlich said. As for FERC jurisdictional issues, it is unclear why some utilities can get network upgrades built more quickly than others; the report suggests some independent monitoring of that stage to identify why that is happening and address issues that slow down construction, Gramlich added.

Demand is growing, but supply is out there that could address it, said R Street Institute Senior Fellow Beth Garza, who is also speaking at FERC’s technical conference next week.

“Those costs eventually are borne by consumers,” Garza said. “It’s consumers that are using the electricity. They’re the ones getting value out of having the electricity. … It’s the indirect costs that consumers absolutely bear by inefficient processes.”

DOE Set to Fund 2 Energy Storage Research Hubs

Teams led by Argonne National Laboratory and Stanford University are in line for $125 million to boost their research into next-generation energy storage.

The U.S. Department of Energy announced the funding Sept. 3. It said the two Energy Innovation Hubs will accelerate development of storage technology beyond lithium-ion batteries, with a priority on use of inexpensive and abundant materials.

The Energy Storage Research Alliance (ESRA) led by Argonne will focus on new compact batteries for heavy-duty transportation and grid-scale energy storage, while the Aqueous Battery Consortium (ABC) led by Stanford will work to establish the scientific foundation for large-scale development and deployment of aqueous batteries for long-duration grid storage technologies.

If finalized, the awards will be worth up to $62.5 million each and will extend up to five years.

In its own announcement, Argonne said:

“ESRA seeks to enable transformative discoveries in materials chemistry, gain a fundamental understanding of electrochemical phenomena at the atomic scale and lay the scientific foundations for breakthroughs in energy storage technologies.”

The goal is high-energy batteries that provide days of output, do not catch fire and have decades-long service lives. They will rely on abundant materials, which may mitigate the cost and supply chain volatility associated with present-day batteries.

ESRA Director Shirley Meng said: “To achieve this, energy storage technology must reach levels of unprecedented performance, surpassing the capabilities of current lithium-ion technology. The key to making these transformative leaps lies in a robust research and development initiative firmly grounded in basic science.”

DOE said both projects also will serve as vehicles for workforce development and for diversity and inclusion.

ESRA Deputy Director Wei Wang said: “Cultivating a diverse workforce dedicated to safeguarding America’s energy resilience is key to ESRA’s mission. Through our strategic equity and inclusion initiatives, we plan to create a robust training ground for energy storage science from the undergraduate to postdoctoral levels.”

Stanford in its announcement said the ABC seeks to overcome the limitations facing batteries that use water as their electrolyte.

ABC Director Yi Cui explained that enormous amounts of stationary energy storage will be needed to achieve net-zero greenhouse gas emissions, and said water is the only realistic solvent available at the cost and quantity needed.

He said: “How do we control charge transfer between solids and water from the molecular to the device scale and achieve reversibility with an efficiency of nearly 100%? We don’t know the solutions to those hard problems, but with the Department of Energy’s support we intend to find out.”

The aqueous battery concept is in extensive use in the starter systems of internal combustion vehicles, but those batteries are small and rely on toxic lead-acid chemistry.

Using water instead is a tall order.

“The barriers to such a new aqueous battery have stymied inventors for years,” said ABC chief scientist Linda Nazar. “In addition to stubbornly low voltage and energy density, water can corrode battery materials, become the source of undesirable side reactions, and the cells can fail after just hundreds of charge-discharge cycles under demanding practical conditions.”

Dozens of researchers working in six teams will investigate the challenge.

Stanford University and SLAC National Accelerator Laboratory lead the Aqueous Battery Consortium, and are joined by investigators from California State University, Long Beach; Florida A&M University/Florida State University’s College of Engineering; North Carolina State University; Oregon State University; San Jose State University; UCLA; UC-San Diego; UC-Santa Barbara; University of Maryland; University of Texas at Austin; and the University of Waterloo.

ESRA is led by Argonne and co-led by Lawrence Berkeley National Laboratory and Pacific Northwest National Laboratory. Partners are Columbia University; Duke University; Massachusetts Institute of Technology; Princeton University; UC San Diego; UChicago; University of Houston; University of Illinois Chicago; University of Illinois Urbana-Champaign; University of Michigan; Utah State University; and Xavier University.

DOE’s Energy Innovation Hubs are managed by the Basic Energy Sciences (BES) Program of the Office of Science. They seek to overcome key scientific barriers for major energy technologies.

ABC and ESRA build off previous BES-funded efforts, including the Joint Center for Energy Storage Research innovation hub, which ended a decade of operation in 2023 and also was led by Argonne.

MISO: 50% Peak Load Cap, Software Help Key for Crowded, Delayed Queue

MISO is adamant it should limit project proposals in future queue cycles to 50% of annual peak load to moderate its oversaturated queue of more than 300 GW.

During a Sept. 3 Interconnection Queue Process Working Group teleconference, MISO’s Ryan Westphal said an annual megawatt cap, in conjunction with a tech startup’s software for study help, will allow MISO to build study models faster without the “engineering problem” of too many hypothetical overloads, network upgrades and resources exceeding load. (See MISO Sets Sights on 50% Peak MW Cap in Annual Interconnection Queue Cycles.) 

“We think that gives us the best chance of moving faster,” Westphal said of overall queue processing.  

MISO’s current queue stands at 321 GW.  

Westphal said a cap won’t hinder MISO’s resource adequacy, either. Using the queue’s historical 21% completion rate, Westphal said MISO stands to add roughly 67 GW within a few years, even with capping entrants. 

In the long run, MISO estimates 310 GW will be able to hook up to the system through 2042. Westphal pointed out that figure exceeds the 248 GW of additions by 2042 that the RTO uses in its current transmission planning scenario. Westphal said that number should allay stakeholders’ resource adequacy concerns that a cap might keep too many projects from connecting.
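The arithmetic behind those near-term and long-run figures is straightforward. A minimal sketch, using only the numbers cited above:

```python
# Back-of-the-envelope check of the figures MISO cited, using only the numbers
# reported above (321 GW in the queue, 21% historical completion rate,
# 310 GW of long-run hookups vs. 248 GW in the planning scenario).
queue_gw = 321              # current MISO interconnection queue
completion_rate = 0.21      # historical share of queued projects that get built

expected_additions_gw = queue_gw * completion_rate
print(f"Expected near-term additions: ~{expected_additions_gw:.0f} GW")  # ~67 GW

long_run_connections_gw = 310   # MISO estimate of connections through 2042
planning_scenario_gw = 248      # 2042 additions in MISO's current planning scenario
print(f"Headroom vs. planning scenario: {long_run_connections_gw - planning_scenario_gw} GW")
```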

Clean Grid Alliance’s Rhonda Peters asked how MISO plans to factor in significant new load additions in its annual megawatt cap.  

Westphal said the cap calculation will be based on peak load using MISO’s five-year out models, which should capture definite load additions.  

“Using what’s in the models as firm makes the most sense,” he said.  

Westphal also said MISO’s attempt to conquer its unwieldy queue using Pearl Street’s study software will not negate the need for a cap.  

MISO will lean on Pearl Street’s SUGAR (Suite of Unified Grid Analyses with Renewables) software to screen projects before studies begin in earnest and to perform the first phase of studies in the queue.

MISO has delayed kickoff of studies on the 123 GW of projects that entered the queue in 2023 while Pearl Street assists with modeling. When the 2024 cycle will begin is an open question, since the RTO intends to have the cap in place before it formally accepts a new cycle. (See 2023 Queue Cycle Delayed into 2025 as MISO Seeks Software Help on Studies.)  

“We have long been proponents of technology adoption in this space,” NextEra Energy’s Erin Murphy said, thanking MISO for reaching out for third-party help. However, Murphy said that if MISO and Pearl Street can achieve faster study results with more variables, that could negate the need for a megawatt cap on annual queue cycles. 

Westphal said while SUGAR may help speed up study processing, without a queue cap, MISO still would run into the familiar problem of unrealistic dispatch models overflowing with too many projects. 

Westphal said MISO still wants to return to its usual cadence of one-year queue cycles, in which submissions are accepted in the fall, validated through the holidays and studied beginning in the new year, after MISO’s Board of Directors approves the RTO’s planning models. However, Westphal added that it doesn’t make sense to accept a new queue cycle if the previous cycle isn’t far enough along in the study process.

Last week, the Union of Concerned Scientists’ Sam Gomberg suggested MISO plan to play “catch up” on queue studies if Pearl Street’s software proves successful. He suggested MISO consider accepting multiple queue cycles in a year to get back on track.  

MISO is not entertaining using a volumetric price escalation — where developers pay fees that increase as they submit more projects for study — in lieu of a cap, as some stakeholders requested. Westphal said enacting escalating fees won’t solve MISO’s underlying “technical issue” of trying to study “load being served by too many generators.” 

“In our minds, it’s not an alternative for a cap. We still believe we need that hard cap there to get us to reasonable study parameters and dispatch [model],” he said.  

Several MISO generation developers argued that a volumetric price escalation would encourage interconnection customers to put only their best projects forward, discourage manipulation of the queue and give small developers and co-ops a level playing field for submitting projects.

Savion’s Derek Sunderman said MISO could police the volumetric approach by requiring large corporations to sign forms attesting to their subsidiaries. Sunderman requested that the RTO conduct a stakeholder vote to gauge whether stakeholders prefer the cap or a volumetric price escalation.  

Some stakeholders have asked MISO to consider giving developers estimated network upgrade costs at the screening stage of the queue, if Pearl Street proves effective at anticipating results.

MISO still is drawing up a plan to reevaluate the queue cap after three annual cycles, Westphal added. A few stakeholders have asked the RTO to view the cap as a short-term measure and commit to sunsetting the cap after three years of use.  

“Unfortunately, there’s no silver bullet on the queue. It’s just constant improvement,” Westphal said. 

MISO has scheduled a special meeting Sept. 30 to discuss the queue cap again. Westphal said he hopes to present “a final go” of the queue cap by then, make a filing at the end of October and earn FERC approval by the end of the year. 

DC Circuit Strikes Down Emissions Standards for ‘New’ Pre-2020 Boilers

A three-judge panel of the D.C. Circuit Court of Appeals on Sept. 3 set aside EPA emission rules for new large boilers as they applied to boilers built prior to August 2020, ruling that applying them to those units violated the Clean Air Act (22-1271).

The rules, issued in October 2022, set National Emission Standards for Hazardous Air Pollutants (NESHAP) for major sources focused on industrial, commercial and institutional boilers. A source is considered “new” under CAA Section 112 if it is built after EPA proposes an emission standard for that source, which the agency did for boilers in August 2020. The court found that EPA had improperly classified certain industrial boilers built before then as “new.” 

In doing so, the court agreed with industry petitioners, led by U.S. Sugar, which in 2019 completed a $65 million boiler to help power its facility in Clewiston, Fla., replacing three older, higher-polluting boilers to comply with standards EPA issued in 2011.

Boilers burn materials such as coal, paper and agricultural waste to create heat, electricity and other forms of energy. That comes with emissions of pollutants like mercury, carbon monoxide and particulate matter. 

U.S. Sugar’s boiler also “surpassed” EPA’s 2022 standards for existing sources, the court said. But under EPA’s rules, it was considered a new source. “Under this regime — whose logic suggests that boilers built after June 4, 2010, are forever ‘new’ — the U.S. Sugar Corp. must spend tens of millions of dollars retrofitting” the boiler, it said. 

EPA is supposed to base its standards on the maximum achievable control technology (MACT), which can differ between new and existing sources. New sources must meet a standard at least as strict as the emission control achieved in practice by the best controlled similar source, while existing sources must meet one at least as stringent as the average emission level achieved by the best performing 12% of operating sources for which EPA has emissions data.
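To make the difference between those two floors concrete, here is a minimal sketch of the statutory logic using hypothetical emission rates; it illustrates the concept only and is not EPA’s actual rulemaking methodology or dataset.

```python
# Simplified illustration of the Section 112 "MACT floor" concept described
# above, using hypothetical emission rates (lb/MMBtu). Lower numbers are cleaner.
def mact_floors(emission_rates: list[float]) -> tuple[float, float]:
    """Return (new_source_floor, existing_source_floor).

    New sources: at least as strict as the best controlled similar source.
    Existing sources: at least as strict as the average achieved by the
    best-performing 12% of sources with emissions data.
    """
    rates = sorted(emission_rates)
    new_source_floor = rates[0]                           # single best performer
    top_12_pct = rates[: max(1, round(0.12 * len(rates)))]
    existing_source_floor = sum(top_12_pct) / len(top_12_pct)
    return new_source_floor, existing_source_floor

# Hypothetical dataset of 25 boilers
data = [0.8, 1.1, 1.3, 1.5, 1.6, 1.8, 2.0, 2.1, 2.2, 2.4, 2.5, 2.6,
        2.8, 3.0, 3.1, 3.3, 3.5, 3.6, 3.8, 4.0, 4.1, 4.3, 4.5, 4.7, 5.0]
print(mact_floors(data))  # new-source floor is tighter than the existing-source floor
```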

The agency argued that because it was using the same dataset as when it proposed the 2011 standards, the cutoff date for whether a source is “new” is June 4, 2010, when the proposal was first published.  

But the court found that “when Section 112 references the date ‘an emission standard’ is ‘first propose[d],’ it means the first proposal of each consecutive standard.” It noted that while existing sources are given three years to comply with new standards, new sources are expected to be in compliance upon their effective date. The court pointed to other cases challenging EPA rulemakings for other types of new sources under Section 112 in which the agency also noted this in its arguments. 

“EPA itself has explained that retrofitting older sources to comply with increasingly stringent modern standards may be ‘draconian’ if not ‘impossible,’” the court said. “And we should not lightly assume that a statute is ‘draconian’ or ‘demands the impossible.’” 

Environmentalists including Sierra Club also challenged the rules, arguing EPA relied on old data despite more recent data being available. But the court found that did not violate the CAA.

In other cases, the court has generally acknowledged that EPA may exercise discretion and use its expertise to calculate standards. The environmentalists’ view would offer no discretion to EPA when choosing its data, which could force the agency to use faulty data if that was all it had, the court said. 

“Because that interpretation of Section 112(d) would substantially hamper EPA’s ability to effectively promulgate standards, we reject environmental petitioners’ interpretation and hold that EPA’s decision to rely on its original dataset was not unlawful,” the court said. 

Heat Pump Tech Could Help Decarbonize Dairy Sector, CEC Says

The California Energy Commission (CEC) is exploring the use of heat pump technologies to accelerate decarbonization in the dairy sector, which accounts for 2.5% of the state’s energy consumption and 1.4% of greenhouse gas emissions.

Home to over 1,100 dairy farms and more than 130 dairy product processing facilities, California leads the nation in milk production and is the second-largest cheese producer. In 2020, the U.S. dairy industry announced a goal of net-zero carbon emissions by 2050, and commissioners said transitioning from thermal resources to heat pump technologies for processes like pasteurization, evaporation and cleaning could lead to significant energy savings while reducing reliance on fossil fuels.

“Knowing how important the dairy sector is to California’s economy and knowing we could bring some innovation to the sector, [we can] really work together with industry to improve the carbon footprint,” CEC Commissioner Andrew McAllister said during an Aug. 29 meeting to discuss decreasing dairy emissions.

Through the Food Production Investment Program, the CEC has awarded up to $117.8 million in grants to help food producers reduce greenhouse gas emissions, including to six California dairy facilities.

“These projects have or will improve operation efficiency and lower production costs, and in general maintained or increased the quality and quantity of production,” said Matthew Stevens, a CEC staffer representing the Food Production Investment Program. “We have done a lot of waste heat capture and storage, general system overhauls, and recently, we’re tackling to replace very inefficient, aging, high global warming refrigeration systems.”

‘Where Everybody Wins’

Several experts who focus on the decarbonization of industrial facilities presented at the meeting, all highlighting the potential for heat pumps to improve energy efficiency in the dairy sector.

Dr. Ahmad Ganji, director of San Francisco State University’s Industrial Assessment Center (IAC), said there are “significant opportunities for energy efficiency in dairy processing plants,” with efficiency increases of at least 10-15%. The IAC analyzes and informs industrial facilities about how they can decarbonize, and Ganji said it plans to recommend heat pump technologies at dairy processing plants as the technology improves.

Most of the emissions from dairy processing facilities come from natural gas-powered steam boilers used for pasteurization and other processes requiring heat, according to Arun Gupta, CEO of Skyven Technologies.

“Twenty percent of global carbon emissions are caused by industrial heat, which, for context, is about as much carbon impact as all of transportation, all of the cars, trains, planes, boats, everything combined,” Gupta said. “Half of that is steam, so steam is enormous.”

Skyven developed a new steam-generating heat pump technology, designed for use in dairy processing plants, that can generate steam for heating and cooling at lower prices than boilers that run on natural gas.

“That allows us to achieve the deep decarbonization that the industry is looking for,” he said. “Decarbonization solutions must be cost competitive with existing boilers and, better than cost competitive, they actually need to save money … Where everybody wins is where decarbonization and cost savings go hand-in-hand.”

Skyven was recently awarded a $145 million grant from the U.S. Department of Energy to deploy steam-generating heat pumps across multiple manufacturing sectors, including California dairy facilities, with the goal of making the technology an industry standard. Gupta estimated the project will cut GHG emissions by around 400,000 metric tons, create 1,000 jobs and benefit over 300,000 people through cleaner air.

Curtis Rager, product manager at Johnson Controls, provided additional background on how heat pumps could increase the efficiency of refrigeration systems at dairy facilities. Most dairy plants use ammonia as a refrigerant, which is pumped throughout the system and absorbs heat at the point of use.

For example, refrigerant sent to milk silos absorbs heat that then flows through the system and is discharged into the atmosphere via evaporative condensers. Heating milk for pasteurization and then cooling it back down requires a lot of energy, and heat pumps could help recover waste heat that otherwise would be discharged.

“[Heat pumps are] capturing that ammonia refrigeration gas stream that’s going to the evaporative condenser and it’s now taking that and it’s going through another stage of compression,” Rager said. “With a second stage of compression in the heat pump portion you can pump up that temperature … and now through the condenser, you can bring the cold water in and produce hot water all from the energy that was absorbed … from those evaporators.”
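For a rough sense of why that second stage of compression pays off, here is a minimal thermodynamic sketch with hypothetical temperatures and an assumed real-world efficiency factor; it is not a model of any system discussed at the meeting.

```python
# Illustrative thermodynamics only: an ideal (Carnot) heating COP for the kind
# of temperature lift Rager describes, with hypothetical temperatures and an
# assumed fraction of ideal performance.
def heating_cop(t_source_c: float, t_sink_c: float, carnot_fraction: float = 0.5) -> float:
    """Approximate heating COP for lifting heat from t_source_c to t_sink_c."""
    t_source_k = t_source_c + 273.15
    t_sink_k = t_sink_c + 273.15
    carnot_cop = t_sink_k / (t_sink_k - t_source_k)   # theoretical upper bound
    return carnot_fraction * carnot_cop               # rough real-world estimate

# Heat rejected by an ammonia refrigeration loop (~35 C) lifted to ~85 C hot water
cop = heating_cop(35, 85)
print(f"Estimated COP: {cop:.1f} units of heat per unit of electricity")  # ~3.6
```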

The system would allow for significant energy- and water-use savings, contributing to Gupta’s goal of simultaneous decarbonization and reduction of costs.

“We believe that steam decarbonization is crucial for the decarbonization of the industry, and this technology allows that to happen in a way that is profitable for manufacturers and allows them to achieve both the savings and the carbon reductions that they’re looking for,” Gupta said.

NPCC, NYSEG Agree to Settle Control Center Violation

FERC accepted a settlement between New York State Electric and Gas (NYSEG) and the Northeast Power Coordinating Council in which the utility admitted to violating NERC’s requirements for maintaining backup control centers.

The settlement, which carries no monetary penalty, was filed by NERC in its monthly spreadsheet Notice of Penalty on July 31; it was the only settlement in the spreadsheet and the only NOP filed that month (NP24-10). In a filing issued Aug. 30, FERC said it would not further review the settlement. Commissioner Judy Chang did not participate in the decision.

The settlement stemmed from a violation of EOP-008-2 (Loss of control center functionality), approved by FERC in 2018 in order to “ensure continued reliable operations of the [electric grid] in the event that a control center becomes inoperable.” NPCC discovered the noncompliance during an audit in 2020.

According to the settlement, NPCC found that NYSEG’s backup and primary control centers used a shared communication path with a single point of failure. This contravened requirement R6 of the standard, which mandates that reliability coordinators, balancing authorities and transmission operators ensure their primary and backup control centers maintain separate functionalities.

NPCC reported that seven communications lines terminated in a single room common to both the primary and backup control centers. In the event of a “catastrophic event” at the primary control center, the utility would lose its connection with about 150 remote terminal units (RTUs), 62 of which provide data from its substations. This represents a loss of data from more than half of its 121 grid-connected RTUs.

Further investigation revealed that NYSEG had discovered the issue during a prior audit in 2017 and labeled it an area of concern. The utility first sought to address the problem with its telecommunications vendor, but the vendor delayed implementation of the proposed solution for more than a year before telling NYSEG in 2019 that it “could no longer support the solution as designed.”

NYSEG then pursued a permanent solution, which was “in the planning stages” when NPCC conducted its 2020 audit. But the regional entity said the utility did not assign the task the necessary priority or management oversight, and thus the violation lasted longer than it would have with proper prioritization. Along with EOP-008-2, NPCC also found that NYSEG had violated the standard’s predecessor, EOP-008-1, which was in effect when NYSEG registered as a transmission operator and was required to comply with it.

NPCC assessed the violation as a moderate risk to grid reliability. It pointed out that the shared point of failure would have reduced NYSEG’s visibility into its system and compromised its ability to work remotely if the primary control center became inoperable. The RE said a catastrophic event compromising the primary center “would likely be a long-duration event,” exacerbating the risk.

At the same time, the RE acknowledged that the risk of such a catastrophic event affecting the primary control center is low. It also pointed out that even if NYSEG lost its ability to monitor the system, NYISO and neighboring TOPs and BAs could still monitor their respective systems, ensuring some visibility into the grid’s health.

NPCC determined that no monetary penalty would be required in light of NYSEG’s cooperation in the enforcement process, lack of prior relevant noncompliance and agreement to settle the matter rather than calling for a hearing. However, the RE did feel it necessary to elevate the matter to the spreadsheet NOP because of the length of the noncompliance and the fact that it became aware of the issue through a compliance audit rather than the utility reporting the problem itself.

To mitigate the problem, NYSEG removed the single point of failure by migrating the communications lines. It also created a new NERC compliance tool to monitor compliance projects and make sure schedules are maintained properly, trained relevant personnel on the tool, and updated its project management procedures to specify that leadership must review the project management plan when changes to a project’s schedule are needed.

Texas PUC Sets Reliability Standard for ERCOT

Texas’ regulatory commission has adopted a reliability standard for the ERCOT region, one of several policy parameters that will be used in upcoming analyses for the proposed performance credit mechanism (PCM) market design. 

As approved by the Public Utility Commission during its Aug. 29 open meeting, the standard sets three criteria ERCOT must meet: frequency, duration and magnitude. To comply, ERCOT outages should not occur more than once in 10 years on average, should not last more than 12 hours and should not shed more load than can be safely rotated (54584).
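Read loosely, the standard amounts to three pass/fail thresholds. A minimal sketch of that framing follows; it simplifies ERCOT’s actual probability-based assessment to a hypothetical decade of events.

```python
# Stylized reading of the three criteria; the thresholds come from the article,
# but the event data and this framing are hypothetical simplifications.
from dataclasses import dataclass

@dataclass
class OutageEvent:
    duration_hours: float   # how long firm load shed lasted
    load_shed_gw: float     # how much firm load was shed

def meets_standard(events_in_decade: list[OutageEvent],
                   max_rotatable_gw: float = 19.0) -> bool:
    """True if a decade of events stays within the frequency, duration and
    magnitude limits (19 GW is the rotation figure ERCOT cites later in this article)."""
    if len(events_in_decade) > 1:                     # no more than once in 10 years
        return False
    return all(e.duration_hours <= 12.0               # no longer than 12 hours
               and e.load_shed_gw <= max_rotatable_gw
               for e in events_in_decade)

print(meets_standard([OutageEvent(10.0, 15.0)]))                         # True
print(meets_standard([OutageEvent(10.0, 15.0), OutageEvent(2.0, 5.0)]))  # False: too frequent
```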

“Our system must continue to evolve to meet the growing demand for power in our state … it’s critical we clearly define the standard at which we expect the market and system to operate,” PUC Chair Thomas Gleeson said in a statement. “By establishing a reliability standard for the ERCOT region today, we are setting a strong expectation for the market and charting a clear path to further secure electric reliability.” 

The new rule also establishes a process to regularly assess the ERCOT grid’s reliability. The commission directed ERCOT staff to conduct a probability-based assessment every three years, beginning Jan. 1, 2026, to determine whether the system is meeting the standard and is expected to continue to do so over the next three years.  

Should that assessment indicate the system fails to meet the reliability standard, the Independent Market Monitor (IMM) must conduct an independent review and commission staff must recommend their own potential market design changes. The PUC then would review ERCOT’s assessment, the IMM’s review, commission staff’s recommendations and public comments to determine whether any market design changes are necessary. 

ERCOT and IMM staff confirmed during the meeting that they have all they need to begin their respective analyses. Draft results are due to the PUC in early November; the commission will consider the final results in December. 

The ISO said that in its cost/benefit analysis it will use 19 GW as the amount of load it can safely rotate during an outage, as it proposed in an April research paper.

The reliability standard was just one of several actions the PUC took to establish regular assessments of the grid’s ability to meet demand and help determine any necessary future improvements. 

It adopted a value of lost load (VOLL) of $35,000/MWh, using information from a survey of ERCOT consumers and a Brattle study. Staff had proposed a $30,000 VOLL, but Gleeson recommended rounding Brattle’s suggested $35,685, which he called “reasonable” after a “detailed and thorough” analysis (55837).

“We don’t need the extra numbers in there,” Gleeson said. 

ERCOT will use VOLL for cost/benefit analyses in its planning models. The PUC said it will not be used to update the operating reserve demand curve or any current market-design elements. 

The commission also accepted staff’s final recommendations for each of the PCM’s 37 base case parameters, including a firm $1 billion gross cost cap to comply with state law (55000). ERCOT instead had proposed using an energy-only market equilibrium reserve margin as the counterfactual rather than the cost cap, a “purely theoretical number,” according to Stoic Energy principal Doug Lewin.

PUC staff and ERCOT also differed on four other parameters: the metric to determine performance credit (PC) hours; a duration-based cap for consecutive PC hours; the net-cost cap compliance framework; and non-performance penalties for PCs offered but not cleared in the forward market. 

The PUC selected the PCM over five other suggested market reforms as its design of choice and approved it in 2023. That same year, the Texas Legislature passed a bill setting a $1 billion annual cap for the PCM. (See Texas PUC Submits Reliability Plan to Legislature.)

The PCM will use the reliability standard and a corresponding quantity of PCs that must be produced during the highest reliability risk hours to meet the standard. Load-serving entities can purchase PCs, awarded to resources through a retrospective settlement process based on availability during hours of highest risk, and trade them with other LSEs and generators in a forward market; generators must participate in the forward market to qualify for the settlement process. 
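For intuition only, here is a stylized sketch of a retrospective, availability-weighted credit award over the highest-risk hours; the proportional rule, resources and numbers are hypothetical and do not reflect ERCOT’s actual PCM settlement formulas.

```python
# Hypothetical illustration of the retrospective, availability-based idea
# behind performance credits, not ERCOT's tariff math.
def award_credits(availability: dict[str, list[float]], total_credits: float) -> dict[str, float]:
    """Split a fixed pool of credits among resources in proportion to their
    average availability (0-1) across the highest-risk hours."""
    scores = {name: sum(hours) / len(hours) for name, hours in availability.items()}
    total = sum(scores.values())
    return {name: total_credits * score / total for name, score in scores.items()}

# Availability of three hypothetical generators over five highest-risk hours
availability = {
    "GenA": [1.0, 1.0, 1.0, 0.9, 1.0],
    "GenB": [0.5, 0.6, 0.7, 0.5, 0.6],
    "GenC": [0.0, 0.0, 1.0, 1.0, 1.0],
}
print(award_credits(availability, total_credits=1000.0))
```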

CPS Energy MRA, RMR Update

ERCOT told the PUC it has changed course on must-run alternatives for three retiring CPS Energy coal units, postponing an inspection of the largest unit until after the winter season (55999). 

The San Antonio municipal utility told the commission this year it planned to retire the three coal units, which date back to the 1960s, in March 2025. However, ERCOT said the Braunig Power Station units, with a combined summer seasonal net maximum sustainable rating of 859 MW, were needed for reliability reasons and issued a request for reliability-must-run proposals in July. (See ERCOT Evaluating RMR, MRA Options for CPS Plant.)

The grid operator said in an update to the commission that while it continues to negotiate a potential agreement with CPS Energy to inspect the 412-MW Unit 3, it would be “more prudent” to allow the resource to operate through the winter’s peak demand period. ERCOT staff said the inspection could be held in mid-February or early March. 

“If we waited until after winter peak load, we believe we’d still have plenty of time, barring unforeseen circumstances, to have the unit inspected and repaired during another shoulder season for outages and before the summer peak load season,” ERCOT’s Davida Dwyer said. 

The ISO extended the deadline for RFP responses to Oct. 7 after receiving fewer than 10 proposals to its initial request. (See “ERCOT Extends MRA Timeline,” ERCOT Board of Directors Briefs: Aug. 19-20, 2024.) 

Chad Seely, the ISO’s general counsel, told the commission the deadline would provide an “important data point” in seeing whether the industry has responded with enough MW to provide relief for a constrained area south of San Antonio. 

“The additional time affords us a more deliberative process on these critical policy issues to see if the industry is going to respond to the must-run alternative,” Seely said, “and then continue to move forward [on] a path where we still think it’s appropriate and prudent for reliability to start to open up the unit in advance of any April 1 RMR agreement.” 

“Is it looking bleak on the MRA?” Commissioner Lori Cobos asked Seely.  

Noting that ERCOT has amended the RFP after stakeholder feedback, he said, “We’re hopeful, with the amendments that we put forward and allowing almost another month of time for people to go do their due diligence, and talk to their shops about options, that we will see a higher [number] of offers come in in October.” 

“Ultimately, I don’t want RMR to be the norm, right?” Cobos responded. 

Seely said the three units are in a “prime” location to relieve the area’s interconnection reliability operating limit (IROL) constraints, which makes the pre-RMR inspection work such an “extraordinary situation.”

“[Braunig] is one of the best assets right now in the system, until we see other solutions to help relieve the overloads of the IROL for the next couple of years,” he said. “That’s why it’s critically important to be deliberative on these critical policy issues and how we approach this.”

CPS has said it will cost about $22 million to inspect, repair and prepare Braunig Unit 3 to remain in service past March and an additional $35 million for the other two units. 

Utility and energy storage company Eolian announced Aug. 28 an agreement for two storage facilities south of San Antonio totaling 350 MW of capacity. The projects are not expected to come online until 2026, but work to upgrade the transmission infrastructure and relieve the South Texas constraint isn’t expected to be completed until the middle of 2027. 

Counterflow: Back to the Future

It seems like yesterday I started scribbling about all manner of industry subjects — against the flow, the prevailing wisdom, the latest hype, etc. 

But it’s actually been 10 years. With that passage of time, spanning 90 columns and articles all available here, I thought I’d look back at what I might have gotten right, gotten wrong or whatever. And what such might portend for the next 10 years. 

Let’s start with — who else — Elon Musk and his claims for his new home battery, the Powerwall. Including pairing it with solar panels from SolarCity, his cousins’ company. Powerwall and SolarCity didn’t live up to Musk’s early hype, as I discussed in follow-up columns (one more), but they finally became a profitable part of Tesla. Maybe I should get partial credit.


Next subject was Big Transmission (not to be confused with economic interregional ties). Back then, I summarized the prior 10 years: “It was heady stuff: Big lines and arrows sweeping across the country, depicting massive new transmission projects. But after 10 years of dramatic announcements and proposals, the reality today is that Big Transmission has fallen and it won’t be getting up. And a second reality is this: The fall of Big Transmission is not a public policy failure. Rather, Big Transmission never did make sense. Instead, the experience so far points to a continuation of what we’re doing now — to more of the incremental transmission expansions that have characterized the past 10 years — and not to count on Big Transmission as a solution to any future industry challenge.” Another 10 years and the song remains the same.✔️  

On to microgrids! I showed that microgrids are the irrational antithesis of everything we know about electric system planning and operation. A couple years later, I discussed the threat microgrids posed to national security, did a recap a couple years later, and then this year covered the microgrid boondoggle in Chicago. ✔️  

Next up were utility-scale batteries. I showed that the two claimed value propositions, capacity backup and energy arbitrage, didn’t pencil out. Battery costs have since come down significantly, but batteries remain a niche product absent subsidies and/or mandates. Hmm, maybe another partial credit. 

On to New York’s REV (“Reforming the Energy Vision”). As I said back then, it was the most hyped regulatory initiative since the California restructuring some 20 years prior. REV was mostly word salad, but one of the few specifics was subsidizing utilities to install rooftop solar. I couldn’t imagine a worse idea. ✔️  

Well, except maybe California’s artificial creation of the Duck Curve by layering one bad policy on top of another. Free storage and distribution in the form of net metering, uneconomic time-of-use rates discouraging afternoon usage, subsidies of battery storage reducing afternoon usage. Yikes. Many years later, the Duck is finally getting targeted, but not before helping drive California’s electric rates to astronomical levels. ✔️ 

Another close contender from California was the planned closure of the Diablo Canyon nuclear plant. Perhaps my hair-on-fire column helped save the plant … and perhaps helped the planet. ✔️  

Oh, and lest we forget Bernie Sanders’ promise to ban fracking during his 2016 campaign. Not only to cost consumers some $100 billion annually, but to increase carbon emissions by increasing coal-fired generation. Yikes! ✔️  

Enough reminiscing for one day! 

P.S. Except to add to prior columns’ postscripts about Peace, Love and Understanding. Here’s an audio version by the guy who wrote it, Nick Lowe. Oh, and this cover by Elvis Costello 20 years ago is epic. 

As Spinal Tap said: Turn it up to 11. 

Columnist Steve Huntoon, principal of Energy Counsel LLP and a former president of the Energy Bar Association, has been practicing energy law for more than 30 years. 

DOE Approves 1st LNG Exports Since Biden Administration’s Pause

The Department of Energy on Aug. 31 approved a five-year term for New Fortress Energy’s Fast LNG 1 project to export gas produced in the U.S. to countries without free trade agreements (FTAs).

The LNG facility recently started operations in Altamira, Mexico, and will receive U.S.-produced gas via pipeline to export. It announced its first exports in August, having already won approval to ship gas to countries with FTAs.

The authorization comes after a court stayed the Biden administration’s pause on such approvals, announced earlier this year, and while DOE works on a related study on the environmental impacts of LNG exports. (See Federal Judge Stays Biden’s LNG Export Application Pause.)

“This important authorization cements NFE’s position as a leading global vertically integrated gas to power company and enhances the marketability of our FLNG 1 asset,” CEO Wes Edens said in a statement Sept. 3. “NFE is now able to freely supply cheaper and cleaner natural gas to underserved markets across the world and further our goal of accelerating the world’s energy transition.”

DOE approved the facility to ship 145 Bcf/year of U.S.-produced LNG. The gas will flow into Mexico over the Valley Crossing Pipeline, which runs south from Texas, and potentially other cross-border pipelines that have yet to be completed.

The exports to non-FTA countries give NFE more flexibility with the facility, DOE said.

“These re-exports can diversify global LNG supplies and improve energy security for U.S. allies and trading partners,” the department said. “Based on this administrative record, DOE has determined that it has not been shown that NFE Altamira-proposed re-exports of LNG to non-FTA countries will be inconsistent with the public interest over the authorization period.”

DOE’s approval is in effect for five years, until Aug. 30, 2029, but NFE wants to keep exporting gas until 2050. The department will reevaluate its approval once the company formally asks for a new end date.

So far, DOE has approved 46.45 Bcfd of natural gas exports, which includes 6.71 Bcfd of gas shipped to Canada and Mexico before being exported overseas.

North America’s export capacity is on pace to double by 2028, from 11.4 Bcfd to 24.4 Bcfd, the Energy Information Administration said Sept. 3.

The U.S. is home to 9.7 Bcfd of projects under construction, with Canada building 2.5 Bcfd and Mexico 0.8 Bcfd. The Canadian facilities would export gas produced there, but the Mexican facilities are seeking to export gas initially produced in the U.S.
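The 2028 figure is simply today’s capacity plus what is under construction; a quick check using only the capacities cited above:

```python
# Quick check of EIA's 2028 projection, using only the capacities cited above (Bcfd).
existing_capacity = 11.4
under_construction = {"U.S.": 9.7, "Canada": 2.5, "Mexico": 0.8}

projected_2028 = existing_capacity + sum(under_construction.values())
print(f"Projected North American export capacity: {projected_2028:.1f} Bcfd")  # 24.4
```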

In approving NFE’s application, DOE said it would monitor market developments closely as the impact of successive authorizations of LNG exports continues to unfold.

“DOE also acknowledges that proposals to re-export U.S.-sourced natural gas in the form of LNG from Mexico or Canada to non-FTA countries raise public interest considerations that are not present for domestic exports of LNG,” DOE said. “In the case of re-exports, the U.S. economy does not receive a significant portion of the benefits DOE has recognized for LNG exported directly from the United States, particularly with respect to the jobs and infrastructure investment associated with construction and operation of liquefaction facilities.”

Foreign LNG export facilities are also not subject to U.S. environmental laws, which could lead to long-term issues if local laws are laxer, DOE added.

The export application was opposed by environmentalists, with Sierra Club protesting and Food & Water Watch releasing a statement blasting the approval.

“It’s ridiculous that the Department of Energy would issue this license despite the administration’s ongoing, incomplete public interest review of such exports,” said Mitch Jones, managing director of advocacy and policy. “The department is under no obligation to approve these ill-advised proposals, now or ever. As the disastrous impacts of increased fossil fuel development become more and more obvious here and around the globe, the notion of expanded LNG exports should be dismissed out of hand.”