Monday, October 6, 2014

Crew Resource Management

In aviation, a concept was developed in the 1970s known as Crew Resource Management (CRM).

The idea behind CRM is that best practices require a team approach: while tasks may be delegated and shared, and one person does have command of the ship, other crew members should act as backup, and clear communication should occur.

Prior to the NASA-funded research that led to CRM, the captain was indisputably in command of the aircraft, and the culture held that the captain was not to be questioned.

What NASA's research uncovered was that this culture led other crew members to stay silent when something was missed or wrong in the captain's actions or decisions. Flying an airplane is a complex endeavor in which details are critical - miss a speed, an altitude or a bank indication and bad things happen.

In the United States and most European countries, CRM was easier to implement because those cultures allowed the practice to evolve. It took time, no doubt, because the old guard that flew under the captain-in-command convention had to retire. Eventually attrition pretty much removed the captain-in-command mindset from flight crews, and CRM became standard practice.

Some countries, however, have a culture where the captain-in-command mentality is so deeply embedded that generational change is not enough to shift it.

Vanity Fair in this month's issue ran a lengthy article on Air France 447, which crashed into the Atlantic in 2009. One of the author's conclusions is that CRM was thwarted by French aviation culture, as evidenced by the cockpit voice recordings.

One of the failures aboard Air France 447 was that the crew working beneath the captain (there were two subordinate pilots) did not speak up, or did not do so with sufficient authority or alarm, to make the captain question his decisions.

Likewise, the cockpit recordings reflect that the captain failed to communicate in clear, concise language, which increased the confusion in the cockpit as the plane plummeted from the sky in an aerodynamic stall.

Contributing to this cultural breakdown was the aircraft's high degree of automation. My former next-door neighbor retired as a captain for United Airlines after flying B-1 bombers for the Air Force. He would comment on occasion that the big commercial jets were more like flying computers.

Ninety-eight percent of the time, flying by computer is much safer than flying manually - the dynamics of flight are so complex and change so frequently that a computer can do the job better than a human can.

Most of the time.

Until something occurs that is outside the machine's pre-programmed ability to deal with the situation. Then a human who is skilled at what is called "stick and rudder" flying (how it was done in the old days before autopilots) does a better job, because the human mind can deal with anomalies.

Unless that human's skills have degraded from over-reliance on automation.

Flying is a depreciating skill - if one does not do it constantly, the ability to actually "fly" the airplane degrades.

And this is one of the things that contributed to the Air France 447 disaster - the crew used all the wrong inputs while manually flying the airplane, because they had become over-reliant on the machine taking care of itself.

The crew was inadequately prepared to actually hand-fly the airplane in a CRM environment. By the time the captain had resumed authority, panic in the cockpit interfered with communication, decision making and the execution of commands.

I tell people who will listen that learning to fly should be mandatory education for professionals and executives, because three basic skills in flying are directly applicable to business: planning, decision making and communication. A failure in any one of those skills means death or disaster.

Air France 447 demonstrates what happens when failures in all three occur simultaneously.

“We now have this systemic problem with complexity, and it does not involve just one manufacturer,” Nadine Sarter, an industrial engineer at the University of Michigan, said in the article. “Complexity means you have a large number of subcomponents and they interact in sometimes unexpected ways. Pilots don’t know, because they haven’t experienced the fringe conditions that are built into the system. I was once in a room with five engineers who had been involved in building a particular airplane, and I started asking, ‘Well, how does this or that work?’ And they could not agree on the answers. So I was thinking, If these five engineers cannot agree, the poor pilot, if he ever encounters that particular situation . . . well, good luck.”

Workers' compensation suffers from complexity, which begets poor communication, impairs decision making and undermines planning.

When laws are created to "reform" the system, subcomponents are introduced that interact in sometimes unexpected ways, and the system's "pilots" don't know how to deal with these changes.

This results in poor communication, as different forms are introduced to handle various subsystem requirements that may conflict with the overall direction of the case; poor decision making, because variables aren't recognized as potential outcomes; and poor planning, because stakeholders (workers and their employers) can't see the destination through the system's degradation.

At the same time, we seem to have become over-reliant on automation in workers' compensation - not in the machine-production sort of way, but in a "follow the rules" sort of way. Decision-making authority is more removed from the front lines than ever before, in favor of pre-programmed, automated responses.

Consequently, some cases end up catastrophic when the outcome otherwise should have been routine: something happens along the claim history where the automated systems fail, the "team" has not routinely trained for those failures, autocratic controls are too deeply embedded in culture or practice, and the result is disaster.

This industry does a lot of training on new issues, on new laws, on new techniques.

But we don't do a whole lot of recurrent training on the basics, and there are barriers in the way of challenging the autocratic rule that interferes with quality decision making.

The good news is that most of the time cases, just like airline flights, terminate successfully and without damage to person or machine.

The bad news is that when something does go wrong we aren't well equipped to deal with the situation before the case degenerates into catastrophe.

When an emergency occurs in aviation we are taught to aviate, navigate and communicate - in that specific order. Fly the airplane first. That keeps you from getting killed before it's time. Figure out where you're going next so you can land that plane. Then tell someone on the ground (usually air traffic control) what is happening so they can muster as many resources as possible to help out.

The crew of Air France 447 clearly violated all of those basic tenets.

I submit that most "disaster" claims could likewise be avoided by following those basic tenets, if our culture bends to allow that to happen and we train outside of automated systems on a regular basis.
