A new report predicts the U.S. offshore wind buildout will fall short of President Biden’s 30-GW-by-2030 goal despite investment of a projected $65 billion over the next six years.
The American Clean Power Association’s 2024 Offshore Wind Market Report projects the 30-GW milestone will be reached in 2033 and that only 14 GW will be operational by 2030.
The industry ran into serious problems with costs, component availability and infrastructure just as it was gaining some momentum in the United States with the help of federal and state policymakers.
Contracts for numerous projects were canceled, injecting uncertainty and delays into their construction timelines.
But ACP sees a bright future for offshore wind in the United States:
A total of 56.3 GW of capacity is in some stage of development in 37 leases, and the U.S. Bureau of Ocean Energy Management plans to hold four auctions this year for 1.9 million acres of federal waters that hold a potential capacity of more than 20 GW.
BOEM has greenlighted 12 projects in nine lease areas and is reviewing seven other projects.
Offtake agreements are in place for 12 GW of electricity generated offshore, and active solicitations underway in the Northeast could yield 8.8 GW to 12.2 GW of additional contracts in the second half of this year.
South Fork Wind, the first utility-scale offshore wind farm in the United States, was commissioned this year, and three larger projects with a combined capacity of 4 GW — Coastal Virginia Offshore Wind, Revolution Wind and Vineyard Wind — are under construction.
Infrastructure investment announcements now exceed $9 billion, with $3 billion in 2023 alone; more than 40 new support watercraft are on order or under construction, including two types of installation vessels.
The sector is projected to support 56,000 U.S. jobs by 2030.
In a news release July 9, ACP Chief Policy Officer Frank Macchiarola said:
“After the successful startup of the 132 MW South Fork wind farm earlier this year, and with 136 MW operational at Vineyard Wind, offshore wind is gaining momentum with three projects under construction and 37 more in development. Harnessing America’s offshore wind resources will boost economic activity, create jobs, reduce pollution providing environmental and public health benefits, and strengthen America’s energy security by enhancing grid reliability and energy independence.”
ACP in its announcement did not mention the possibility of a second presidency for Donald Trump, an outspoken wind power opponent.
November election notwithstanding, there are some bright points ahead in 2024:
New Jersey’s fourth offshore wind solicitation is active.
New York plans a dual solicitation this year — one for offshore wind farms, one for supply chain investments to support offshore construction and operations.
Connecticut, Massachusetts and Rhode Island expect to announce the results next month of a joint solicitation for up to 6 GW of new projects.
BOEM plans lease auctions in the Central Atlantic region in August, the Gulf of Mexico in September, and Oregon and the Gulf of Maine in October.
Construction is expected to begin on Sunrise Wind.
FERC has approved ISO-NE’s proposal of a new process to solicit, select and allocate costs for transmission projects that address needs identified in long-term planning studies (ER24-1978).
Developed in coordination with the New England States Committee on Electricity, the new process establishes a regionalized cost-allocation method for transmission projects that are projected to bring long-term net benefits to the region. (See NEPOOL TC Approves Process for States’ Transmission Needs.)
FERC Chair Willie Phillips and Commissioner Mark Christie concurred with the July 9 order in separate statements. Phillips commended the proposal and wrote that it does not conflict with Order 1920. Christie applauded the central role of the states within the proposal and contrasted it with Order 1920, which he argued needs “major revisions.”
The approval marks the completion of Phase 2 of ISO-NE’s longer-term transmission planning project; Phase 1 created a process to evaluate long-term transmission needs associated with state policies and mandates and was approved by FERC in 2022 (ER22-727).
In the new process, NESCOE can direct ISO-NE to issue a request for proposals for solutions to long-term needs. After soliciting proposals, ISO-NE will select a preferred solution, and NESCOE will have the option to either proceed with the default regionalized cost allocation method, submit an alternative cost-allocation method or terminate the process.
For projects to be eligible for selection, ISO-NE’s analysis must indicate the quantified benefits of the project outweigh its costs.
FERC also approved a supplemental process the states can use if no proposal exceeds the cost-benefit test, allowing one or more states to cover any costs of a project that exceed this cost-benefit threshold.
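Taken together, the selection test and the supplemental state-funding option reduce to simple arithmetic. The sketch below is hypothetical (the function name, units and figures are illustrative, not ISO-NE's): a project is eligible only if its quantified benefits exceed its costs, and otherwise one or more states may voluntarily fund the shortfall.

```python
def state_top_up(project_cost_m, quantified_benefits_m):
    """Hypothetical illustration of ISO-NE's supplemental process: return the
    portion of cost (in $M) one or more states would voluntarily cover for a
    proposal that fails the benefit-cost test. Returns 0 if the project passes
    and the default regionalized cost allocation applies instead."""
    if quantified_benefits_m >= project_cost_m:
        return 0.0  # benefits outweigh costs; eligible for selection as filed
    return project_cost_m - quantified_benefits_m

print(state_top_up(1200.0, 1000.0))  # 200.0: states cover the $200M shortfall
print(state_top_up(900.0, 1000.0))   # 0.0: passes the benefit-cost test outright
```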
FERC wrote that the tariff changes “represent a just and reasonable alternative voluntary process that will not conflict with or otherwise replace ISO-NE’s Order No. 1000 regional transmission planning process.”
While the comments submitted to FERC on the proposal largely were supportive, some stakeholders argued the requirement for proposals to be complete — not reliant on any additional transmission upgrades from incumbent transmission owners not included in the proposal — equates to a de facto right of first refusal. (See Stakeholders Support ISO-NE Long-term Tx Planning Filing, with Caveats.)
FERC ultimately rejected arguments by clean energy trade groups and merchant transmission developers that the new process would give an unfair advantage to incumbent transmission owners.
Since the process is supplemental to ISO-NE’s regional transmission planning process required by Order 1000, it “need not comply with the nonincumbent transmission developer reforms established in Order No. 1000, including the requirement to eliminate any federal right of first refusal,” FERC wrote.
At the request of ISO-NE, FERC also directed the RTO to submit an additional filing to fix errors in the original submission.
Phillips wrote in his concurrence that the new process is not in conflict with Order 1920 and that it “includes many of the significant components of Order No. 1920, such as multifactor planning on at least a 20-year time horizon, an ex ante default cost allocation method, the option for states to agree on alternative cost allocation methods and the option to voluntarily pay for the portion of a project that exceeds the identified benefit-cost ratio.”
“The state role in this proposal is utterly contrary to the insufficient one allowed in Order No. 1920, which does not require that states consent to planning and selection criteria, does not require that states consent to an ex ante cost allocation formula, and does not even require that transmission providers have to file a state-agreed alternative to an ex ante formula,” Christie wrote.
Christie noted the strong state support for the proposal and argued it eventually could be undercut by the requirements of Order 1920, which is on track to “force all projects, including public policy related projects, into the same bucket with other types of projects for planning and cost allocation purposes.”
Christie concluded that the proposal “is the type of planning and cost allocation construct for public policy projects that the commission should encourage and approve,” and called for reforms to Order 1920.
Dominion Energy announced July 8 that it is positioning itself to potentially build another wind farm near its Coastal Virginia Offshore Wind.
It has agreed to buy the lease area where Avangrid’s Kitty Hawk North has been in the planning stages.
As part of the deal, Dominion would rename it CVOW-South. Avangrid would retain rights to the Kitty Hawk name, and plans to continue developing the adjacent lease area, which it calls Kitty Hawk South.
The deal requires the approval of the U.S. Bureau of Ocean Energy Management and the city of Virginia Beach. The two companies expect to close the transaction in the fourth quarter of 2024.
The deal is valued at about $160 million, which is substantially more than the original lease price but reflects development expenditures in the seven years since the auction.
In a March 2017 BOEM auction, Avangrid Renewables LLC beat out three other bidders in 17 rounds with a $9.07 million bid for the 122,405-acre lease area designated OCS-A 0508 and began planning what it called the Kitty Hawk Offshore Wind Project.
In 2022, Kitty Hawk Wind LLC divided it into two projects in two areas: Kitty Hawk North in the newly designated OCS-A 0559 and Kitty Hawk South in the remainder of OCS-A 0508.
In the construction and operations plan submitted to BOEM, Kitty Hawk North is proposed to have up to 69 wind turbine generators, one offshore substation, one onshore substation and export cables making landfall in Virginia Beach. It would stand 24 nautical miles off the northern part of the barrier island that forms the North Carolina coastline.
Kitty Hawk South and Kitty Hawk North still have a long way to go in their permitting processes, and they need to overcome local opposition.
In late 2023, the city of Virginia Beach rejected the cable landfall routing. Three months later, Avangrid appealed to the region’s pocketbook, issuing a 29-page report estimating that the two Kitty Hawk projects would bestow a $4.8 billion economic benefit on Virginia over their operational lives and that a quarter of that would flow to Virginia Beach, which would reap $274 million in tax payments alone.
Dominion said if CVOW-South were approved and built, it would have a roughly 800-MW capacity and would feed into the Dominion transmission grid. It offered no estimates of in-service date or construction budget.
Both companies acknowledged the potential roadblock in Virginia Beach.
Dominion said it’s aware of community concerns about the export cable’s proposed landing site and is committed to working closely with the community, the city and the state.
Avangrid said Kitty Hawk South could generate up to 2.4 GW of power, which could be delivered to North Carolina, other states or private companies, not just to Virginia.
The two companies already have steel in U.S. waters, and both can lay claim to the mantle of “largest.”
Avangrid owns 50% of Vineyard Wind 1 under construction off the Massachusetts coast. With 10 turbines connected to the grid, it is by a tiny margin the largest offshore wind farm by capacity in the United States.
Dominion owns 50% of CVOW, which with a nameplate rating of 2.6 GW is by far the largest offshore wind farm approved in the United States.
Dominion CEO Robert Blue said in a news release: “With electric demand in our Virginia territory projected to double in the next 13 years, Dominion Energy is securing access to power generation resources that ensure we continue to provide the reliable, affordable and increasingly clean energy that powers our customers every day.”
Avangrid CEO Pedro Azagra said in a news release: “As Avangrid continues the construction of our nation-leading Vineyard Wind 1 project and the development of our diverse portfolio of offshore and onshore renewable projects, this transaction advances our strategic priorities by providing significant capital infusion for reinvestment.”
FERC on July 5 approved LS Power’s purchase of an 810-MW natural gas plant in central Pennsylvania despite some qualms from PJM’s Independent Market Monitor (EC24-42).
The deal has LS Power setting up an affiliate, Hunterstown Gen Holdings, to buy the plant, which was owned by Kestrel Acquisition, a subsidiary of the investment firm Platinum Equity Partners.
LS Power already owns 6,865 MW of electric generating capacity, but the merger would raise the Herfindahl-Hirschman Index (HHI) of market concentration by only 12 points — an insignificant increase, Kestrel said in its application, given that a market does not even qualify as concentrated until it reaches 1,000 points on the index. Once the deal closes, LS Power will control 7.17% of installed capacity, its analysis showed.
The Monitor argued FERC should take into account distinct local markets, which shift frequently with transmission congestion and more accurately reflect the operation of PJM’s wholesale power markets.
The merger increases LS Power’s structural market power in both the aggregate energy market and the capacity markets, the Monitor said, and it argued for restrictions on LS Power’s bidding that would have addressed those market power concerns.
LS Power and Platinum argued the Monitor failed to show the deal is inconsistent with FERC policy or precedent and that its claims are based on a nonpublic dataset that is not available for evaluation.
FERC agreed the Monitor failed to offer enough evidence that it should use smaller geographic markets based on congestion. Some of the constraints are in place for just 100 hours a year, which FERC has said is too low to show the persistence the commission requires for a new submarket to be considered.
Others are well above that threshold, but FERC said the Monitor failed to provide enough information for the potential boundaries of a new submarket around the Nottingham transmission constraint in its analysis. The Monitor also used the three-pivotal-supplier test, which is important for market power in PJM, but FERC said it never has used it in a merger case.
Increasing LS Power’s market share from 6.71% to 7.17%, while increasing the HHI by just 6 points, shows the deal will have a limited impact on the aggregate energy market, FERC said.
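The HHI figure FERC cited can be reproduced with a few lines of arithmetic. The sketch below is illustrative only: it treats LS Power's pre-merger share (6.71%) and the implied Hunterstown share (7.17% − 6.71% = 0.46%) as the merging parties and ignores the rest of the market, whose shares are unchanged by the deal and therefore cancel out of the delta.

```python
def hhi(shares_pct):
    """Herfindahl-Hirschman Index: sum of squared market shares (in percent)."""
    return sum(s * s for s in shares_pct)

# Shares taken from the order (EC24-42); the acquired plant's implied share
# is the post-merger share minus LS Power's pre-merger share.
ls_power_pre = 6.71          # % of PJM installed capacity before the deal
hunterstown = 7.17 - 6.71    # implied % share of the 810-MW plant

# Only the merging parties' terms change, so the increase reduces to
# (s1 + s2)^2 - s1^2 - s2^2 = 2 * s1 * s2.
delta = hhi([ls_power_pre + hunterstown]) - hhi([ls_power_pre, hunterstown])
print(round(delta, 1))  # ≈ 6.2, consistent with the roughly 6 points FERC cited
```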
“With respect to the PJM IMM’s request that the commission impose behavioral mitigation measures, we decline to require the requested mitigation measures or to otherwise address general issues concerning the PJM markets in this proceeding,” FERC said.
Broad arguments about the inadequacies of FERC’s merger review process and PJM’s market power mitigation are the basis for the Monitor’s requested behavioral limits. But “as the commission has previously found, arguments based on general concerns about certain elements of PJM’s market design that are not specific to a proposed transaction under review are beyond the scope of the commission’s review of the proposed transaction,” FERC said.
None of the parties raised issues with the deal’s impact on vertical market power, rates or regulation, and the deal did not raise issues around cross-subsidization, the commission found.
Commissioner Mark Christie concurred with the order, agreeing the deal satisfies FERC’s merger review process while also saying the Monitor highlighted a real issue.
“Taken together, the PJM IMM’s evaluation and conclusion signal that the commission’s policy and regulations implementing [Federal Power Act] Section 203 may miss the forest for the trees and fail to see the larger impacts that transactions may have on the health of RTO markets,” Christie said.
Christie added that he would welcome a review of FERC’s policies that implement Section 203.
ARLINGTON, Va. — Industry leaders, experts, policymakers and regulators gathered near the nation’s capital June 25-27 to discuss how recent FERC orders will affect regional transmission planning, cost allocation, permitting, advanced transmission technologies and other factors that could improve the ability to quickly add capacity to the grid.
Over the last two years, FERC has issued several orders designed to revise transmission planning, cost allocation and permitting processes, including:
Order 2023, which revises the commission’s pro forma generator interconnection queue rules to speed up the backlogged process. It has already led to a flurry of interconnection and queue reforms by grid operators and utility balancing areas. (See FERC Updates Interconnection Queue Process with Order 2023.)
Order 1977, which implements FERC’s new congressionally mandated authority to site transmission lines in a National Interest Electricity Transmission Corridor, despite state regulators’ rejections.
While the actions have begun to bear results, the extent of the reforms and their pace vary significantly in the nation’s organized markets.
Joseph Rand, an energy policy researcher at Lawrence Berkeley National Laboratory, kicked off the summit by sharing the lab’s annual analysis of interconnection data from all seven RTOs and ISOs and 44 non-RTO utilities, representing more than 95% of the U.S.’ currently installed capacity.
The report found potential new capacity in interconnection queues is growing dramatically, with nearly 2.6 TW of total generation and storage seeking connections to the grid last year. More than 95% of that capacity is for zero-carbon resources. Solar and battery storage are the fastest-growing resources, accounting for over 80% of new capacity entering queues last year.
“I’ve been working in the energy and electric industry for maybe about 15 years now, and I don’t often get to use the unit of terawatts. One terawatt — my mind was blown,” Rand said. “Then we got to 2 TW last year and now we’re at 2.6 TW, and my mind continues to be blown by this number.
“To put that in context a little bit,” he added, “that’s actually more than two times the installed capacity of our entire electric generation fleet in the United States.”
Rand urged attendees not to place too much importance on recent industry chatter regarding concerns with load growth, resource adequacy, data center energy needs and the impulse to quickly build gas generation. He said there’s enough capacity in grid operators’ queues to meet rapidly increasing load.
“I think that kind of indicates that, again, it’s not necessarily a need to kind of shove through a lot of generation,” he said. “There’s, in fact, a need to kind of unlock this bottleneck that we’re seeing in the interconnection process to meet that resource adequacy. But, like I just alluded to with the word ‘bottleneck,’ we have some problems in this process, right?”
The chief problem, Rand said, is the “very low” completion rates for projects. He said researchers have found that only about 20% of projects that enter queues reach commercial operation and over 72% have withdrawn their applications.
Not that low completion rates are “entirely a bad thing,” Rand said.
“Low completion rates could be a sign of a very active and competitive interconnection process and a competitive market,” he said. “But on the other hand, when you see very low completion rates, those of you from ISOs and RTOs could probably attest to this, it’s really a drain on transmission provider resources to have to study all of these requests. It might be an indicator of so-called exploratory or speculative requests being in this process.”
The other problem? Timelines, especially in FERC-jurisdictional regions, and the rising cost of interconnections.
“This stuff matters and it’s important and it’s why interconnection is sort of top of mind and getting headlines in The New York Times these days,” Rand said.
ERCOT Offers GI Lessons
If there’s a model to ease the bottlenecks in GI queues, it could be ERCOT’s “connect and manage” approach to transmission interconnection. The Texas grid operator focuses its studies on the local upgrades needed for a project to connect to the grid. It manages grid congestion caused by a new generator through market redispatch and curtailment.
ERCOT has added more generation to its system than any other grid operator and transmission provider during the last few years. It connected 14.2 GW of capacity during 2021 and 2022. PJM, with demand twice as large as ERCOT’s, added 5.6 GW during that same period. ERCOT says its lack of FERC jurisdiction allows it to energize transmission lines in three to six years, compared to seven-and-a-half to 13 years elsewhere.
During a panel discussion on the “connect and manage” approach, Mario Hayden, Enel North America’s transmission director and a former ERCOT staffer, said concepts familiar in other regions get thrown out the window at the Texas grid operator.
“There’s no such thing as a queue priority, which is sort of a fundamental concept in other parts of the country. There’s no sense of other readiness milestones, the concept of site control being a big barrier of entry,” he said. “You do not have a sense of withdrawal penalties, a sense of harming your competitors next to you, direct-cost allocation as a part of the negotiating process, but no idea of transmission upgrades that may cause high costs and make people want to withdraw.
“If you’re a lucky generator, the process is fairly simple,” Hayden said.
Except allocated transmission costs come later. Zero Emission Grid founder Mike Tabrizi said that without a cluster-study process, developers are connecting to the grid at their own risk.
“You’re not going to be responsible for transmission operating costs upfront but once the project becomes operational, then you’re fully exposed to what’s going to happen to the transmission from the congestion … especially if you’re a network resource,” he said.
“[ERCOT lacks] any sort of proactive transmission planning which, as we’ve seen, there can be [generation transmission constraints] that pop up and that will lead to cascading outages if they’re not managed properly,” Pine Gate Renewables’ Regan Fink said, referring to ERCOT’s South Texas constraints.
Tyler Norris, formerly with Cypress Creek Renewables and now working on a doctorate at Duke University, calls ERCOT’s use of curtailment for renewable energy “flexible interconnection,” although he would prefer the curtailment be “occasional.”
“Once you’ve decided to use the interconnection process to identify and allocate funds for network upgrades, that introduces a lot of complications and sort of the fundamental linkage that we’ve made as we have linked interconnection service to capacity eligibility,” he said. “That’s sort of what I think is driving a lot of the issues that we’re seeing in our interconnection queues.”
FERC staff have scheduled a workshop Sept. 10-11 on GI innovations and efficiencies, Norris noted. He expects flexible interconnection options to be part of the discussion.
“There’s ever more pressure to get more generation on the system,” Norris said. “The FERC staff generally really get that there are a lot of colliding trends. … FERC will be interested in exploring reform options for energy-only interconnection service and provisional service to streamline them and make them more aligned with generator willingness to be curtailed, to get online more quickly.”
More Transmission Coming
Panelists discussing the effects of Order 2023’s compliance plans said many regional markets are already a step ahead of FERC. That, along with competition among the states, will continue to lead to strong projects, they said.
Matt Pawlowski, vice president at NextEra Energy Transmission and a “transmission guy” who wants to “build more transmission,” pointed to reforms SPP has made in its GI process to eliminate a backlog of project requests that dated back to 2017. The RTO hopes to clear all requests submitted through 2022 by the end of this year. (See “Staff Reveals Error in GI Queue Studies; Clearing Backlog Still on Course,” SPP Markets and Operations Policy Committee: April 16-17, 2024.)
“[SPP’s reforms] increased site control requirements, increased study deposits and more at-risk study deposits. We supported that because we felt like that would create stronger projects in the interconnection queue,” Pawlowski said. “It certainly sharpened the pencils on our side for ensuring projects that get into the queue are some of our top projects that we’re looking at, both from a demand standpoint but also from site control and other aspects. Having a little bit more stringent requirements … just forced developers to just think harder about their projects.”
Bill Bojorquez, a former ERCOT executive and now CEO of technology firm Splight, said competition between states for transmission solutions has increased, with siting decisions “taken in by more and more regulatory and government forms.”
“If they don’t do something, we’ll have transmission in other states, wherever is more proactive in justifying that addition,” he said.
“I can’t agree with you more,” Pawlowski said. “I think the states that are being proactive right now in thinking about economic development first and then the steps that are needed in order to fulfill those jobs and economic development plans are going to win. [States] like Oklahoma and California, with some of their plans, are recognizing these. These [massive data centers and cloud infrastructure] want to be close to eyeballs and there are certain areas that they want to pick. If you’ve got them in that area and you don’t have the ability to serve them, they will go somewhere else, because they don’t have the power needs that serve them. It’s as simple as that.”
Allocating Costs the Issue
Finding herself sitting next to FERC senior energy industry analyst David Tobenkin, North Dakota Commissioner Sheri Haugen-Hoffart, an apparent opponent of Order 1920, was quick to respond after his explanation of the order and the states’ role in its cost allocation.
“For full disclosure, when David sat down, he said, ‘Be nice.’ I said, ‘I promise. I will,’” she said as Tobenkin allowed himself a smile. “But I kind of have to give you a look. 1920?”
“We are an export state. We have different challenges compared to other states, but cost allocation — and I wish I wrote down every time I heard cost allocation is challenging — is very challenging and complex,” Haugen-Hoffart added. “We have viewed it in North Dakota as cost-causation principles must be a year two, transmission and interconnection investment caused by companies that state a desire to new generation, regardless of type, should be paid by those parties, the cost causers. To allocate costs of such investment to all customers, in particular RTOs, we see as unjust and unreasonable.”
Yes, allocating costs for transmission is tricky, said WIRES’ executive director, Larry Gasteiger.
“The two biggest obstacles we keep seeing with getting transmission built, and this is really a gross oversimplification, are nobody wants to see it and nobody wants to pay for it,” he said. “Believe me, I get it. We’re talking about a lot of investment. We’ve got a lot of policies going on concerning the issues of paying for this infrastructure, and frankly, it’s going to cost a lot of money.
“I think it’s going to require a conversation and being more honest with ratepayers about what we may be seeing down the road,” Gasteiger added. “That doesn’t mean we don’t look at ways to try to minimize the costs associated with building this infrastructure, absolutely. We have to do that. But we’re not going to cheap our way out of this.”
Speaking on a panel focused on grid-enhancing technologies (GETs), EDF Renewables’ Temujin Roach shared similar opinions.
“People are going to start talking about transmission. It’s become a bigger and bigger issue because it’s going to become a bigger part of their bill. Before it was just pennies. Nobody cared about whatever transmission is,” Roach said. “Now, it’s that much more, so we’ve got to find a way to cut some of the costs while we’re increasing the costs, because we are going to increase the cost. It is going to cost more. We are going to have to charge people to build this transmission, period. So now it’s about how can we manage the cost?”
Clements’ Contributions Recognized
Conference organizers thanked outgoing FERC Commissioner Allison Clements, making her annual appearance at the Infocast summit on her penultimate day on the job, for being “a champion of the industry.”
The WATT Coalition went one better, presenting its 2024 Grid Innovation Champion Award to Clements for being “a true leader in embracing innovation” and advancing transmission technology policy at FERC.
“[Clements] has looked for opportunities to use common-sense solutions like [GETs] to support the FERC’s mission to ensure just and reasonable rates and reliable power,” WATT Chair Hilary Pearson said. “The WATT Coalition thanks Commissioner Clements for taking the time to understand the value of [GETs] on the transmission grid and for consistently advocating for policy to address the structural barriers to grid modernization … we hope her colleagues on the commission will carry forward after her term ends.”
Clements has been an outspoken supporter of GETs. She has praised the technologies in FERC Orders 881 and 2023, in letters to legislators, and in her remarks at NARUC’s Federal-State Joint Task Force on Electric Transmission and other forums. (See FERC’s Clements Gets GETs’ Benefits to Grid.)
“I didn’t think I’d become a champion for grid-enhancing technologies. I didn’t know what one was, and I feel like this real Pollyanna running around cheering for this hardware and software,” Clements said. “But it kind of came on to me because you get one group come in and they say, ‘These are the actual results in savings. These are the actual congestion-cost savings that we got in one year. And this is how much it costs to put it in place.’
“And then you get the advanced conductor guys coming in and saying, ‘This is actually the difference you can make to what’s happening on the grid, whether it be related to sag and wildfire safety, whether it be to making more training or sending more electrons through what-have-you’ … the numbers are so staggering. I think there are pretty credible studies related to the opportunity that hardware and software have to create space and even especially to increase reliability,” she added.
A day before leaving FERC, Clements took solace in her award.
“This just means the world to me. I really appreciate it,” she said with a final commission meeting still on her schedule. “I’m going to take a long vacation and then I’ll be back cheering for grid-enhancing technologies in one capacity or another.”
The vacation began early. Clements stayed around after the ceremony, greeting well-wishers while clutching her award in one arm and a beer in her other hand.
CONROE, Texas — Hurricane Beryl ripped through the Houston area after making landfall on the Texas Gulf Coast on July 8 as a Category 1 storm, leaving a trail of death and destruction in its wake.
Downgraded to a tropical storm by late morning, Beryl’s high winds caused significant tree damage in the heavily wooded region of Texas. Cleanup and restoration are expected to take days during muggy conditions with near-term projected temperatures in the low to mid 90s.
Falling trees killed at least two people and took down numerous power lines. As of 4 p.m. CT, more than 2.78 million Texas customers were without power, according to PowerOutage.us.
CenterPoint Energy, the primary utility in Houston, accounted for more than 2.18 million customers without power; it has about 2.6 million overall. It said it had a restoration workforce of approximately 4,500 ready to assist in the efforts.
Entergy Texas, with more than 247,000 customer outages, said its crews would begin a damage assessment throughout its 27-county service area once the storm passed. It has 500 additional restoration workers on standby to assist its crews.
Beryl came ashore around 4 a.m. near Matagorda Bay south of Houston, packing 80 mph winds. It was downgraded to a strong tropical storm about six hours later, with maximum sustained winds of 70 mph. Winds were down to 45 mph as the storm continued north at 4 p.m. CT.
The National Weather Service placed the Houston and Galveston metro areas under a flood watch through the morning hours of July 9. Rain and windy conditions were expected intermittently through early evening July 8.
Both of Houston’s airports had ceased operations by noon July 8 but resumed flights later in the day.
ERCOT said on social media it was monitoring the storm and its aftermath. It said any outages are “local in nature and not an ERCOT grid reliability issue.”
Beryl brought with it painful reminders for some residents of a derecho that hit the Houston area in May with wind gusts of more than 100 mph. The storm killed eight people, brought down trees, blew out windows in downtown skyscrapers and left some people without power for more than two weeks.
Ironically, a Texas House State Affairs Committee hearing scheduled for July 8 to conduct oversight of utility resilience plans was canceled because of the storm.
Public utility regulation falls within the realm of economic regulation, with its main objective to protect consumers from the monopoly power of a utility. The presumption is that public utilities provide essential services that require strong service obligations and price controls. It also presumes that a single private firm is preferable to allowing the entry of a number of potentially competing firms.
In recent years, state utility regulators have exhibited much more political posturing that deviates from their original mission, often mandated or coerced by the legislature and governor. For example, we have seen regulators approving higher utility rates to advance the agendas of politically influential interest groups like social justice activists. Many regulators have become advocates of the environmental, social and governance (ESG) movement that has spread widely across the corporate and political worlds.
The upswing in special-interest demands afflicting most states comes from clean air advocates, vendors and others who are not utility customers. Their proliferating presence in the regulatory arena has squeezed out public interest goals. Some interest groups regard anything less than a maximum effort to tackle climate change and a net-zero carbon future as a social injustice. But an obsession with these objectives has threatened long-held policy objectives, like reasonable and stable utility rates, economic growth and reliable utility service. California and several other states have gone down this primrose path.
Politicization of utility regulation — that is, using regulators to gain favors — mostly means more special-interest influence, with the potential to jeopardize the public interest by:
further emphasizing myopic effects;
making more difficult execution of the “balancing act” long held by regulators, with the addition of new interests and social objectives;
parting from the charge of regulation to serve the long-term interest of utility customers;
escalating rent-seeking costs and increasing the likelihood of subsidies and mandates; and
spreading the cost and risk for uneconomical, politically driven investments onto utility customers.
Although politicization does not inevitably mean a negative outcome for society, it typically ends up with one interest group unduly affecting governmental actions that harm the public good.
The culprits are politicians and bureaucrats who envision utilities as “social agencies,” extending their domain beyond that of a for-profit commercial enterprise. Utilities have had to offer special rates and other concessions to low-income households; accommodate, facilitate and even subsidize their competitors (e.g., net metering) and renewable energy; invest in uneconomic new technologies where cost is subordinate to other factors (e.g., the effect on carbon emissions); subsidize energy efficiency (EE); and achieve clean-air targets beyond federal and local mandates. These costly demands have complicated utilities’ ability to operate in their proper role as profitable entities providing basic services reliably and economically.
Public utility regulators, like other government entities, are susceptible to rent-seeking efforts by advocates with different agendas to achieve self-serving outcomes paid for by utility customers. The electricity industry in particular has several features that make it highly visible and prone to politics and interest-group lobbying: a substantial environmental footprint, heavy use of energy inputs (e.g., fossil fuels), provision of an essential service and the high social cost of service interruptions.
As pressures have intensified for new social investments, driven largely by politics and other outside forces, utility regulators have had to wrestle more with the economic inefficiencies of cost socialization and subsidies. Subsidies, typically the product of increased politicization, are especially damaging to society; they are:
unfair to funding parties (namely, utility customers);
economically inefficient by conveying false price signals; and
unfair to competing energy sources like natural gas.
One common, bizarre practice is for electric utilities to subsidize their customers to use less of their service via EE initiatives, and to subsidize their competitors, like rooftop solar. Overall, such subsidies almost always fail a cost-benefit test when evaluated from a societal perspective.
Because of these developments, regulatory failures and capture have magnified. Historically, capture referred to undue influence by utilities at the expense of their customers and the public interest. More recently, capture has encompassed new stakeholders with the same effect of harming utility customers and the public interest.
This modern-day capture has sprung from the ascendance of certain interests, with utilities protected against financial harm. We are seeing utility customers “taxed” with surcharges and “innovative rate mechanisms” guaranteeing that utilities recover investments whose benefits flow to the general public, rather than just to utility customers. Think of subsidies to clean technologies that reduce carbon emissions, which benefit the whole world. One must ask: Why should utility customers alone pay for those investments?
Much of what we see today that passes for the public good really is rent seeking, which benefits a distinct minority at the expense of the majority. Overall, regulators need to think hard about distinguishing truth from virtue. We know from experience that government often rationalizes its actions as morally unobjectionable when in fact it bequeaths handouts to a narrow group at the expense of the public good.
Skepticism is called for when government officials declare, for example, that we would all be better off if we consume less electricity and other fossil fuels and produce more renewable energy. Such claims may be out of sync with what is best for society.
Regulators should ask themselves whether utilities’ primary customers are on the short end of the stick. Are customers funding the advancement of social objectives through inflated electricity rates and even lower service reliability without compensatory benefits? These actions are likely to have a regressive effect by disproportionately burdening below-average-income households. For example, the beneficiaries might mostly be high-income households while the payers are households of lower incomes. Think of subsidies funded by utility customers for advancing rooftop solar, electric heat pumps and electric vehicles.
What we see is politics and interest groups driving change toward a clean, lower energy-consumption future, whereas utilities are not necessarily opposed, but demand changes in rate-making and other regulatory practices to protect their financial interests. Regulators, pressured by utilities and advocates of clean energy, have acquiesced and even exhibit zeal about this development. They commonly pass through cost increases and revenue losses to utility customers. Regulators should ask: What are the benefits to the majority of utility customers from “footing the bill” for subsidizing clean energy technologies and EE?
We can make one glaring observation: Special-interest groups are the true catalysts of change, with government cultivating their agendas. Either for ideological or monetary reasons, these groups want to shape the future, and the sooner the better. Their interest encompasses only themselves — not the broader public interest. Their vision of the future entails filling up their pockets or satisfying their favorite doctrine.
Yet the job of utility regulators is to balance the interests of different groups to best serve the public good. That places extreme urgency on state utility regulators to enforce the “balancing act” that trades off different legitimate interests for the common good — a difficult task, yes, but one that society expects regulators to perform.
That naturally leads to the question of whether society requires too much from electric utilities. We expect utilities to maintain financial viability, provide reliable and resilient service, make electricity affordable to all customers, adopt and accommodate new technologies that compete with their core business, decarbonize their generation portfolio, and promote less usage of electricity by their customers. No other private business comes to mind in which society expects firms to tackle such a wide range of social issues.
As an illustration, take the case of utility subsidies for EE. While government officials and utilities won’t admit it, the best evidence shows that their ratepayer-subsidized EE programs are likely to fail a cost-benefit test.
The plain question policymakers should ask is whether the market for EE technologies is free of major “market failures.” If so, one can conclude the marketplace is providing energy consumers with the right incentives to “purchase” EE when it is in their self-interest, and energy consumers are rational and unobstructed in making decisions by market barriers. After all, if society feels consumers are rational in making decisions on what other goods and services to buy, what would cause the same consumers to be irrational when they decide on EE investments for their homes and businesses?
Regretfully, the best evidence has had little effect on utility EE programs because the public is unaware of the transfers; EE is widely popular; and politicians, bureaucrats and utilities can enjoy their support. Utilities gain, for example, goodwill with their regulators without suffering any financial consequences or even profiting as a consequence because of new rate mechanisms like revenue decoupling and a premium rate of return for complying with EE mandates.
We should be mindful of the words of Milton Friedman: “One of the great mistakes is to judge policies and programs by their intentions rather than their results.” For many observers, utility (and government-subsidized) EE programs transmit good feelings (i.e., virtue signaling) about using less energy. Instead of expanding the subsidies for EE — which many today advocate — we should give serious consideration to phasing them out or, preferably, eliminating them altogether.
Another example of a misdirected policy is the recent effort to artificially induce energy consumers to switch from fossil fuels for home use and transportation to electricity, which observers call electrification. Proponents of electrification — notably politicians, bureaucrats, electric utility companies and environmentalists — prefer it to happen sooner rather than later and to be accelerated by subsidies and other governmental enticements. Some even advocate mandated electrification or natural gas bans to prevent hyperbolic climate catastrophes. Many view electrification as essential to combat climate change, or even as a free lunch.
As with most things, there are two sides, and electrification is no exception. Most champions of electrification fail to consider, or intentionally ignore, its downsides. A major one is the high cost to households and businesses of converting from natural gas and other fossil fuels to electricity — a cost that can amount to thousands of dollars for an individual home. Another downside stems from the efficiency losses in energy markets from prematurely advancing electrification with subsidies (sometimes funded by utility customers with the approval of state utility regulators) and governmental mandates.
Instead of artificially bolstering electrification with subsidies and mandates, policymakers should allow electric technology to evolve on its own without government support. Technology will determine the ultimate success of electrification — not subsidies and other governmental actions that largely are politically driven to serve special interests.
To conclude, one philosophical inquiry is whether bad policies descend from society’s ignorance of their effects, or whether strong-armed and self-interested politics always prevails, regardless of the public interest.
An optimist would say truth will prevail. One perception of truth relates it to policymakers’ decisions that rely on impartial information to balance the interests of different stakeholders for the public good. Yet such optimism appears far from compelling when one observes the myriad policies adopted by society and their consequences for the public good.
Kenneth W. Costello is a regulatory economist and independent consultant.
FERC on July 5 granted a complaint from Dominion Energy to allow planned capacity resources to shift their participation from the Fixed Resource Requirement (FRR) alternative to the Reliability Pricing Model (RPM) capacity market without being subject to a newly instituted notification requirement (ER24-2197).
Dominion argued there’s a disparity between the Dec. 12, 2023, deadline for planned resources — those still in development, but expected to begin operation prior to the start of the delivery year — to notify PJM of their intent to offer into the 2025/26 Base Residual Auction (BRA) and the May 17, 2024, deadline for entities to terminate their participation in the FRR alternative. The utility included 80 MW of planned resources in its 2025/26 delivery year FRR plan prior to notifying PJM it planned to terminate its FRR election April 30, after which the RTO told Dominion those planned resources had missed the BRA participation notification deadline and could not submit offers. (See PJM MIC Briefs: Nov. 1, 2023.)
The company asked FERC to either grant it a waiver from the notification requirement or rule that PJM’s Reliability Assurance Agreement (RAA), when read together with tariff Attachment DD, impinges on FRR entities’ ability to enter the RPM.
The commission determined the notification deadline does not apply to planned resources being constructed by an FRR entity when the deadline passes and that Dominion’s planned resources can be entered into the 2025/26 auction, which is scheduled to be conducted July 17. (See FERC Approves PJM Capacity Auction Delay.)
“As an initial matter, we find that the plain language of Section 5.5 of Attachment DD does not expressly address whether FRR entities at the time of the notice-of-intent deadline are subject to the requirements, including the notice-of-intent deadline, provided for therein,” the commission said. “However, we find that under a sensible reading of the tariff and as a practical matter, the provision did not apply to Dominion’s planned generating capacity resources, as Dominion was an FRR entity, not a capacity market seller, as of the relevant deadline.”
Subjecting FRR resources to a deadline in December would not comport with the transitional process the commission approved in PJM’s Critical Issue Fast Path (CIFP) proposal to overhaul its approaches to risk modeling, accreditation and Capacity Performance penalties, FERC said. Because insufficiency and capacity deficiency penalties were increased in those changes, FERC also greenlit a process for FRR entities to shift to the RPM with at least two months’ notice ahead of the 2025/26 auction. (See FERC Approves 1st PJM Proposal out of CIFP.)
In that order, “the commission explained that PJM proposed to allow FRR entities and their resources to transition from FRR to the auction on only two months’ notice,” FERC said. “PJM’s interpretation of Section 5.5 would prevent former FRR entities from transitioning resources planned in accordance with their FRR obligations to the BRA.”
PJM agreed with Dominion’s argument that the deadlines were misaligned and resulted in unintended consequences for FRR entities seeking to enter the RPM. But it said it could not resolve the issue unilaterally without a commission order.
“More particularly, the mismatch of the deadlines prevent FRR entities, such as Dominion, from effectively participating in RPM auctions by excluding their planned generation capacity resources from participation in the RPM auctions when terminating the election of the FRR alternative, which could result in adverse consequences to Dominion and its ratepayers,” PJM wrote in a June 17 filing. “Additionally, this could also produce inaccurate market signals by not properly reflecting actual demand and supply.”
Dominion stated that its decision to return to the capacity market was in part driven by the increased FRR penalties, as well as the short timeline to adjust to the new requirements following the commission’s approval of the CIFP changes.
“Taking into account the difficulty in satisfying this requirement due to the delivery year being roughly one year away, as well as the significant capacity accreditation reforms and increased penalties for FRR entities approved by the commission and detailed above, Dominion notified PJM on April 30, 2024, that it was terminating its FRR alternative election,” Dominion said in its complaint, filed June 4.
Commissioner David Rosner, who joined FERC last month, participated in the order. The newest commissioner, Lindsay See, did not.
NYISO stakeholders are divided over consultants’ proposal to use a two-hour battery as the peaking plant in the ISO’s capacity market demand curve, as part of its quadrennial demand curve reset for 2025-29.
Comments on the draft report, produced by Analysis Group and 1898 & Co., were due last week. Generators generally were opposed to the proposed proxy unit, while state agencies were in support.
To set the curve, NYISO estimates the gross cost of new entry — the cost of building a hypothetical new peaking plant — and the revenues the plant would likely earn from the energy and ancillary services markets. The difference, the net cost of new entry, is what the hypothetical peaking plant would need to earn from the capacity market to support its entry.
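The netting behind the demand curve is simple arithmetic; a minimal sketch, using hypothetical dollar figures rather than values from the draft report, might look like:

```python
# Hypothetical net cost of new entry (net CONE) calculation.
# Figures are illustrative only, not from the demand curve reset report.
def net_cone(gross_cone: float, eas_revenue: float) -> float:
    """What the proxy peaking plant must recover from the capacity
    market, in $/kW-year: gross cost of new entry minus expected
    energy and ancillary services (EAS) revenues, floored at zero."""
    return max(gross_cone - eas_revenue, 0.0)

# Example: a proxy unit with a $200/kW-yr gross CONE expected to earn
# $80/kW-yr from the energy and ancillary services markets.
print(net_cone(200.0, 80.0))  # 120.0
```

A lower-cost proxy unit (such as the proposed two-hour BESS) shrinks this difference, which is why commenters warned the selection could pull down the whole demand curve.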
The consultants found that a two-hour battery energy storage system (BESS) “represents the highest variable cost, lowest fixed-price peaking plant that is economically viable,” the report said. “To be economically viable and practically constructible, a BESS would use lithium-ion technology and a modular purpose-built enclosure form-factor.”
The cost of the two-hour BESS assumes a 15-year amortization period and additional costs for capacity augmentation over the life of the battery system to “ensure consistent performance.”
The Independent Power Producers of New York wrote that they strongly oppose the selection of a two-hour BESS as the proxy unit in all locations of the New York Control Area. IPPNY wrote that a two-hour BESS has inherently limited operating capability, which means it “cannot meet transmission security-based requirements.”
“The Hochul administration says we need 10-hour batteries and up. The NYISO System & Resource Outlook … says we need long-duration storage of four hours and up,” said Richard Bratton, director of market policy and regulatory affairs for IPPNY. “I think that we agree that on a foundational basis that a two-hour battery can’t meet reliability needs for the system, and yet we’re seeing a push for it just because it is the cheapest option.”
Luminary Energy, an energy market consulting firm, recommended the Analysis Group consider “no less than a four-hour BESS or the simple cycle gas turbine” as the proxy unit.
“A two-hour BESS would not be able to mitigate the reliability risks and needs outlined by the NYISO’s Comprehensive Reliability Planning process,” Luminary wrote. “A two-hour BESS would not provide sufficient energy optionality for grid operators to manage volatile and uncertain real-time conditions and presents a high risk to grid operators of depleting the energy from the asset before the most critical systems present themselves.”
The New York Battery and Energy Storage Technology Consortium commented that choosing the two-hour unit would contribute to volatility in the capacity market. The selection potentially would result “in an abrupt drop in capacity prices as the demand curve is determined on a new, lower-cost proxy unit.”
Others questioned the Analysis Group’s appraisal of site leasing costs in New York City. Jones Lang LaSalle Americas, a commercial real estate services and investment firm, commented that the methodology used in the draft demand curve “underestimates the expected site leasing costs.” It recommended using a required rate of return of 7.2% to 7.45%, which would cover the higher site leasing costs of the industrial sites new generation requires.
Support, with Some Caveats
The New York Department of Public Service supported the selection of the two-hour BESS as the peaking unit, saying the draft demand curve supported the policy goals of the Climate Leadership and Community Protection Act requirements and the state’s goal of 6 GW of energy storage statewide by 2030.
The DPS did ask that the consultants include revenues and incentives from outside the wholesale market when calculating the net cost of entry for a new peaking plant. The department cited numerous state programs that would compensate clean energy resources for their clean attributes in meeting the state’s CLCPA goals.
Comments submitted on behalf of New York City were more ardently supportive of the draft demand curve and agreed with the selection of the two-hour BESS as the peaking unit of choice.
“Frankly, based on ‘the numbers,’ selection of a two-hour BESS as the proxy peaking unit technology is the clear-cut choice with no close or even obvious alternative,” the city commented. “Moreover, it is beyond rational dispute that a two-hour BESS based on lithium-ion technology is in fact a viable technology.”
The New York Transmission Owners wrote that they agreed with the consultants that the two-hour BESS required less capacity revenue than the other technologies to support its entry to the market and that it had the lowest fixed costs. But they urged the consultants to shift the amortization period to 20 years as opposed to 15, citing the industry’s increased experience with battery storage units.
The introduction to Google’s 2024 Environmental Report begins with a list of the company’s efforts to cut energy consumption and greenhouse gas emissions at its data centers worldwide; for example, Google’s sixth-generation Trillium computer chip is 67% more efficient than its fifth-generation predecessor. The company also has “matched” or offset 100% of its global energy use with renewable energy purchases for seven years in a row and in 2023 signed contracts for an additional 4 GW of renewable power, more than in any previous year.
Such milestones notwithstanding, Google reported a 13% year-over-year increase in greenhouse gas emissions last year, driven primarily by its supply chains and the voracious power demands of the artificial intelligence programs now chewing up electrons at its data centers, the report says.
The company’s 2023 emissions totaled the equivalent of 14.3 million tons of carbon dioxide, up 48% over its 2019 base year, and the report says Google expects further increases “before dropping to our absolute emission reduction target” — net zero by 2030.
The report explains the difference between Google’s assertions of 100% clean energy and its increased emissions in terms of global versus regional accounting: Google tracks its clean energy purchases on a global, annual basis, but the Greenhouse Gas Protocol ― which the company and many other corporations use to track emissions ― monitors on a regional basis.
“In some regions, we purchase more clean energy than our electricity consumption (such as in Europe), while in other regions, we purchase less (such as in the Asia-Pacific region) due to significant regional challenges in sourcing clean energy,” the report says.
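The accounting gap the report describes can be made concrete with toy numbers (the region names mirror the report's examples; the TWh figures below are invented for illustration):

```python
# Illustrative only: how 100% annual, global clean energy matching can
# coexist with regional shortfalls under location-based accounting.
def regional_balance(consumption, clean_purchases):
    """Return (global match fraction, per-region clean surplus in TWh)."""
    match = sum(clean_purchases.values()) / sum(consumption.values())
    surplus = {r: clean_purchases[r] - consumption[r] for r in consumption}
    return match, surplus

consumption = {"Europe": 5.0, "Asia-Pacific": 10.0}  # TWh consumed per region
clean = {"Europe": 9.0, "Asia-Pacific": 6.0}         # TWh clean energy bought

match, surplus = regional_balance(consumption, clean)
print(match)    # 1.0 -> "100% matched" on a global, annual basis
print(surplus)  # Europe runs a surplus; Asia-Pacific a 4 TWh deficit
```

Global matching sums to 100%, yet the region-by-region ledger — the one the Greenhouse Gas Protocol counts — still records emissions where clean supply falls short.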
Such discrepancies reflect the complicated tradeoffs and uncertainties that Google and other tech giants ― including Amazon, Microsoft and Meta ― now face as AI becomes ubiquitous across almost every sector of the economy and every aspect of daily life. Like Google, Microsoft and Meta have committed to cutting their GHG emissions to net zero by 2030, while Amazon Web Services (AWS) has set a 2040 deadline.
These companies often argue for AI’s potential to cut emissions by optimizing the operation of energy systems, from raising efficiency and cutting electric bills in individual homes to streamlining permitting and interconnection processes to improving visibility across the grid itself.
But realizing that potential comes with a cost: A single AI search can use up to 10 times more power than a standard, non-AI search, which could lead to a doubling of power demand from data centers by 2030, according to a recent report from the Electric Power Research Institute (EPRI). (See EPRI: Clean Energy, Efficiency Can Meet AI, Data Center Demand.)
In the past, increases in data center power demand have been mitigated largely by advances in chip, software and data center efficiency, the EPRI report said. But even with new efficiency measures, like Google’s, the industry is struggling to offset the exponential growth in demand from AI.
Google estimates that in 2023, its data centers used 24 TWh of electricity, or about 7% of the power demand of the world’s data centers, which the International Energy Agency has estimated at 240 TWh to 340 TWh. Overall, cloud and AI data centers represent between 0.1 and 0.2% of global electricity use, the Google report says.
The impact of this increased demand in the United States has become a point of intense discussion across the high tech and electric power industries as more and more states compete to draw in “hyperscale” AI data centers. Historically, power demand for individual “enterprise” data centers has varied from 5 MW to 50 MW; hyperscale centers start at around 100 MW and can exceed 700 MW.
A list of new load additions in development in the MISO service territory includes a pipeline of nine data centers ― including two Google facilities in Indiana ― totaling 5.7 GW.
Getting to 24/7 CFE
Google’s ambitious targets for using carbon-free energy (CFE) make its net-zero goals even more daunting. The company has pledged to power all its facilities with 24/7 CFE ― matching supply and demand on an hour-by-hour basis ― again by 2030. It also is committed to buying clean power that comes “bundled” with energy attribute certificates (EACs), similar to renewable energy certificates (RECs), to ensure it is adding new carbon-free projects to the grid.
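Hour-by-hour matching is a stricter metric than annual matching. A minimal sketch of an hourly CFE score, assuming the common definition in which clean supply counts only up to each hour's demand, could look like:

```python
# Sketch of an hourly carbon-free energy (CFE) score. Assumes the common
# definition: clean supply in an hour is credited only up to that hour's demand.
def hourly_cfe(clean_mwh, demand_mwh):
    """Fraction of demand met hour by hour with carbon-free energy."""
    matched = sum(min(c, d) for c, d in zip(clean_mwh, demand_mwh))
    return matched / sum(demand_mwh)

# Four illustrative hours: abundant solar midday, none overnight.
clean = [0, 120, 150, 0]
demand = [100, 100, 100, 100]
print(hourly_cfe(clean, demand))  # 0.5
```

Here annual matching would claim 67.5% (270 MWh clean against 400 MWh of demand), but the hourly score is only 50%, because midday surpluses cannot paper over the empty overnight hours.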
Microsoft and other companies, including utilities, sometimes supplement their purchases of clean energy by buying unbundled EACs, which typically come from existing renewable energy projects and may not add new clean power to the grid.
Google now averages 64% CFE at its data centers worldwide, with varying levels of clean energy going to facilities in different grid regions, the report says. Data centers in 10 grid regions — including MISO — are running on 90% or more CFE, while those in the Middle East, Africa and Asia are well under 20%. Total electricity demand at the company’s data centers increased by 3.5 TWh, or 17%, in 2023, the report says.
Beyond making its data centers more efficient, Google also has developed a “carbon-intelligent computing platform” that allows the company to shift computing tasks to other times or locations with more available CFE.
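Google has not published the platform's internals, but the core idea of carbon-intelligent scheduling can be sketched as a simple greedy heuristic: run flexible batch work in the hours with the highest carbon-free fraction. All names and numbers below are hypothetical.

```python
# Hypothetical sketch of carbon-aware load shifting: place flexible batch
# jobs into the hours with the most carbon-free energy available.
def schedule_flexible_load(cfe_by_hour, hours_needed):
    """Pick the `hours_needed` hours with the highest CFE fraction,
    returned in chronological order."""
    ranked = sorted(range(len(cfe_by_hour)),
                    key=lambda h: cfe_by_hour[h], reverse=True)
    return sorted(ranked[:hours_needed])

cfe = [0.2, 0.3, 0.8, 0.9, 0.7, 0.4]   # invented hourly CFE fractions
print(schedule_flexible_load(cfe, 3))  # [2, 3, 4]
```

The real system also shifts work between data centers, effectively running the same ranking across locations as well as hours.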
Familiar roadblocks to faster procurement and deployment of CFE have proved harder to shift, including interconnection delays, higher interest rates and development costs, and supply chain backlogs, according to the report. But Google also has become an active partner working with developers and utilities to pilot new business models aimed at untangling some of these problems.
The company partnered with LevelTen Energy, an online energy marketplace, to develop a streamlined process for issuing requests for proposals and negotiating power purchase agreements through standard PPA terms included upfront in the RFP. The new approach has cut the time from RFP to signed PPA from 10 to 12 months to about 100 days, allowing Google to finalize contracts for 1.5 GW of power, according to an announcement on the LevelTen website.
Similarly, the company is looking for ways to de-risk and accelerate the commercialization of emerging technologies that can provide the clean, dispatchable power its data centers need. In June, Google and NV Energy unveiled a “clean transition tariff,” now pending approval by the Nevada Public Utilities Commission. Under the proposed tariff, Google would pay a fixed premium for locally generated CFE ― from an enhanced geothermal project developed by Fervo Energy ― to match demand hour for hour at a Nevada data center.
Google has framed both initiatives as replicable models that can be used in other U.S. or global markets.
Looking to the future, an emerging theme in industry discussions is the need for the responsible use of AI, both socially and environmentally.
Defining “responsible use,” however, will be an evolving and intensely debated target. The Google report notes that the speed of technological transformation driving AI means “historical trends likely don’t fully capture AI’s future trajectory.” Further, as AI is integrated across global economies, “the distinction between AI and other workloads will not be meaningful.”