December 24, 2024
NERC Conference Ponders the Human Element to Reliability
NERC offered lessons from firefighters, psychologists and the airline industry at its eighth annual Human Performance Conference, which attracted more than 400 attendees from 180 organizations.

By Rich Heidorn Jr.

ATLANTA — When NERC considers a new reliability standard, it convenes drafting teams heavy on engineering expertise and system operations. But for its eighth annual Human Performance Conference last week, NERC brought in firefighters, psychologists and speakers from the airline industry to provide lessons.

“We find that all industries have people, people who actually operate the same way, whether they’re moving electrons or moving aircraft,” explained NERC Vice President and Director of Reliability Risk Management James Merlo, who served as master of ceremonies for the three-day event, co-hosted by the Department of Energy and the North American Transmission Forum (NATF).

The conference attracted more than 400 attendees from 180 organizations, including linemen, control room operators — and at least one utility billing analyst, who said he attended because his company uses root-cause analysis on billing errors.

NATF HP Assistance Visits

At the beginning of the conference, NERC CEO Jim Robb signed a new memorandum of understanding with NATF, whose 7,400 subject matter experts conduct peer reviews to promote “excellence and continual improvement.”

The nearly 90 companies in NATF operate 80% of the transmission system at 200 kV and above. “It’s an impactful set of members,” NATF CEO Tom Galloway said. “If we get the forum oriented on a topic, we can typically move the ball forward pretty well.”

Representatives of MISO, Arizona Public Service and Tri-State Generation and Transmission Association shared their experiences with inviting NATF to visit.

MISO’s John Rymer, who formerly worked in transmission substation operations for Duke Energy, said he asked NATF to help transfer human performance (HP) tools from the field to the RTO’s control room. “These assistance visits really help you pinpoint where you need to concentrate your efforts,” he said.

Rymer said he is looking for “low-hanging fruit” in spreading the HP message in real-time and control-room operations first. “You can move that upstream or downstream in our organization, because engineering, IT, all these groups, can utilize the same tools,” he said.

Sage Williams, manager of Tri-State’s Eastern maintenance region, agreed. “HP can affect not just field guys. It’s your entire organization: engineering, system operations,” he said.

Sharing Mistakes, Using Technology

Digger derricks — utility trucks with augers for digging holes for poles and boom-mounted hydraulic lifts for working on wires — played supporting roles in stories by several speakers who shared mistakes they made as linemen.

In a talk titled “How Strong Character Improves Safety and Reliability,” former lineman Jeff White, now an HP consultant with Applied Learning Science, recounted how his truck flipped onto its side when soft ground gave way beneath its outriggers while he was installing a pole on a new golf course.

After the crew used a second truck to get the first one back on its tires and wiped off the mud, the lead lineman told White and the other crew members not to report the incident. “‘This was a nonevent. What you just witnessed, erase from your memory.’”

White initially agreed, but after a sleepless night, he told his foreman of the incident the next day, fearful that the truck might have sustained unseen damage that could result in an injury to another worker.

“What if the bolts on that turntable — half of them are broken and we can’t see them? What if there’s fractures in that steel that we can’t see? I need to speak up,” White said.

In another presentation, MidAmerican Energy displayed 3D recreations of field accidents, crafted by 3DInternet.

Mike Buntz, a MidAmerican line crew foreman, narrated an animation of a near tragedy that occurred while his crew was replacing a rotten utility pole. The truck became fully energized when its boom contacted an overhead wire, setting the grass around it on fire. Luckily, no one was injured.

“It wasn’t one of my prouder moments,” Buntz said. “But I agreed [to participate in the animation] hoping that I could help somebody down the road.”

“One of the things that we learned about using these 3D animations is it helps us to have a better appreciation of what actions made sense at the time … what happened and why,” said Sam Reno, MidAmerican’s performance improvement program manager. MidAmerican also is using GoPro cameras mounted on hardhats to produce training videos.

Peter Jackson, an HP coordinator for Georgia Power, said utilities often have “brittle systems” that assume 100% error-free performance.

He demonstrated a pilot program using visualization technology that turns an iPad into a situational awareness tool that shows real-time data on substation equipment’s health and other metrics.

Jackson said the tools can help utilities deal with the loss of experience as more of their aging workforce retires.

“We think that this really helps our guys build a deeper knowledge of the tasks and how to do it right,” he said. “We also think there’s an application potential for everything from troubleshooting to [simulations of] high-risk tasks.”

Answers from the Field

Michelle Miller and former colleague Monika Bay recounted their efforts to improve worker safety at Baltimore Gas and Electric after the electrocution of a worker at a substation.

“You can have the best intervention design in the world, but if you cannot convey it in a way that connects with the head and the heart of these front-line employees you will not be successful,” said Bay, who left BGE a year ago to start her own company, Safety & Operational Risk Solutions.

Bay said sustaining improvement “is by far the hardest part. This is really about line leaders keeping the language alive — keeping these concepts alive in their own workgroups.”

Following her and Miller’s work, field workers began bringing risks to them, Bay said.

One such issue: the potentially fatal consequences of confusing a black, yellow-striped electric cable with a nearly identical black, yellow-striped, three-quarter-inch high-pressure plastic gas pipe.

At BGE, gas lines are supposed to be buried 2 feet below ground, with electrical lines a foot below them. In practice, however, the lines can get transposed, meaning a worker expecting to cut a gas line could end up getting electrocuted by cutting the electric cable. The only apparent difference between the two: The gas pipe has four stripes; the electric cable only three.

“It’s 2 o’clock in the morning, I’m a gas mechanic, I’m 2 feet down in the hole, it’s raining and muddy, and I’m going to tell the difference?” Bay said.

“It wasn’t just this: We had 16 pairs of assets where the gas and electric looked very similar.”

Bay said BGE’s Executive Safety Council, which included its chief operating officer, was “horrified” by the disclosures.

“System design often puts risk into the hands of the employees,” she said. “Sometimes we don’t know some of the risk that front-line guys are dealing with because they’re just dealing with it.”

Lessons from the Airlines

Christian Vehrs of Delta Air Lines described how an aircraft maintenance employee confused a fuel heater valve with a nearly identical engine anti-icing valve because of a faulty manual and his own confirmation bias.

“These manuals are 20 years old, and we’re still discovering mistakes,” he said.

David Marx of Outcome Engenuity used dice to illustrate resilient systems. The more dice you roll, he said, the more redundancy — like setting multiple alarm clocks to prevent oversleeping.

The redundancy provided by multiple “dice” is essential in aeronautics, he said, because while Federal Aviation Administration rules require a 1 in 1 billion chance of failure, “nobody can design a part that will never fail.”

Marx cited the experience of pilot Chesley “Sully” Sullenberger, who famously landed his Airbus A320 on the Hudson River when his engines failed after striking a flock of Canada geese shortly after takeoff from LaGuardia Airport in 2009.

Airline engines are expected to fail only once in every 50,000 hours. Because planes must have two engines, the chances of both failing simultaneously should be 1 in 50,000², or 1 in 2.5 billion, Marx noted.
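That figure follows from treating the two engines as independent “dice.” A quick back-of-the-envelope sketch of the arithmetic — the 1-in-50,000-hours rate is the one Marx cited, while the independence assumption is exactly what the bird strike violated:

```python
# Back-of-the-envelope check of the "two dice" arithmetic Marx described.
# Assumption: each engine fails independently, about once per 50,000 flight hours.
single_engine_failure = 1 / 50_000

# With true independence, both engines failing in the same hour multiplies the odds:
dual_failure = single_engine_failure ** 2

print(f"one engine:   1 in {1 / single_engine_failure:,.0f}")  # 1 in 50,000
print(f"both engines: 1 in {1 / dual_failure:,.0f}")           # 1 in 2,500,000,000

# A common cause such as a flock of birds "sticks the dice together,"
# so the multiplication no longer applies and the real risk is far higher.
```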

In Sullenberger’s case, however, LaGuardia was near a landfill that attracted birds, meaning the dice were “stuck together,” Marx said. (Airport officials increased their bird-killing programs after the incident.)

In another example, Marx cited a woman who died in 2017 after mistakenly being given a paralytic, vecuronium, instead of the mild sedative midazolam — marketed under the brand name Versed — that had been prescribed for her during a PET scan.

Hospital procedures provided four “dice,” starting with an automated dispensary stocked with the drugs. A nurse mistakenly chose the wrong drug when the autocomplete function offered options after she typed the letters “V-E.”

The nurse then failed to check the drug at the dispensary or later when she delivered it to the patient. Finally, the fourth die was the nurse’s failure to remain with the patient to monitor her reaction to the drug.

“What should have been four dice became one,” he said — the active failure of choosing the wrong drug compounded by the nurse’s failure to perform the other three safeguards. The nurse is now facing reckless homicide charges.
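The arithmetic behind “four dice became one” is the same independence math: layered safeguards only multiply protection when each layer is actually exercised. A minimal sketch, using a purely illustrative 1-in-100 failure rate per safeguard (an assumption for the example, not a figure Marx gave):

```python
# Illustrative defense-in-depth arithmetic: four independent safeguards,
# each assumed to be missed about 1 time in 100 (a made-up figure for illustration).
p_miss = 1 / 100

independent_layers = p_miss ** 4   # all four safeguards would have to fail
collapsed_to_one = p_miss          # three safeguards skipped; only one remains

print(f"four working dice: 1 in {1 / independent_layers:,.0f}")  # 1 in 100,000,000
print(f"one die:           1 in {1 / collapsed_to_one:,.0f}")    # 1 in 100
```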

Marx said the incident illustrated why many of us ignore speed-limit signs but slow down when we see a police car.

“We are not inherently rule followers; we are hazard and threat avoiders,” he said. “The police officer represents consequence. The sign just represents the rule.”

Lessons from the Football Field

Dave Sowers of Knowledge Vine used a video of the play known as the “Prayer at Jordan-Hare” — Auburn University quarterback Nick Marshall’s unlikely 2013 game-winning touchdown pass over the University of Georgia — in a discussion on the role of luck.

It was fourth-and-18 with 36 seconds left in the fourth quarter. Auburn head coach Gus Malzahn called for a pass to get the first down, which would have put the team in field goal range. The intended receiver was wide open as a second receiver went deep, drawing triple coverage. Marshall unwisely threw to the deeper receiver, but two Georgia defenders collided, one tipping the ball into that receiver’s hands, and he ran into the end zone untouched.

Marshall was lucky in that instance, but his gunslinger judgment ultimately proved his undoing, as too many of his passes ended up intercepted. “That’s why he’s not playing on Sundays” in the NFL, Sowers said. Instead, he plays for the Canadian Football League’s Saskatchewan Roughriders.

HP in the Control Room

Mohammed Alfayyoumi recounted the changes he has made since becoming director of Dominion Energy’s transmission system operations center.

He spread out the workload by scheduling switching orders throughout the week rather than having them all on Mondays. He doubled the operations staff to four per shift after benchmarking Dominion’s staffing against similar-sized utilities.

He also increased simulator training, began near-miss reporting and training in root-cause analysis, and eliminated work that didn’t add value by automating 6,000 phone calls per week.

He improved operator hiring by adding testing and screening, including more complex behavioral interviews.

“We invest a lot in steel and copper but not a lot in humans,” Alfayyoumi said. “Operator selection is vital to human performance, because you cannot fix poor selection. If you hire the wrong operator, there’s nothing you can do to make him better,” he said.

Adaptive vs. Procedural Systems

Consultant Jake Mazulewicz made the case for “adaptive” over “procedural” systems, recounting a conversation with an employee of an unnamed company who complained it had become excessively dependent on procedures.

“He said when an incident happens, even something small — someone cuts themselves with a knife, no stitches, very small stuff — everybody hears about it. And within two or three weeks you can bet your next paycheck that [the] safety and training [department] is going to say … ‘Here’s a new and updated procedure to make sure that never happens again.’

“Nobody even bothers reading the new procedures … because it doesn’t matter, because it’s going to change,” Mazulewicz continued.

Under system-based thinking, he said, “when you see an error … you don’t think who’s wrong, you think that’s a signal that my … system could use some improvement. … Almost every incident we’re talking about is triggered by human error, but it’s caused by a whole lot of other things: latent organizational weaknesses, previous decisions, things like that.

“How do you minimize errors in a system-based thinking? You improve your system. … You make it hard for people to do the wrong thing, and you make it easy for people to do the right thing.”

