The time has come to drive a stake through the heart of an oft-repeated assertion.
How often have you heard something like the following when those of us in healthcare who want to stimulate quality and safety improvements draw analogies to the airline industry?
"Well, in an airplane, the pilot has an extra incentive to be safe, because he will go down with the ship. In contrast, when a doctor hurts a patient, he gets to go home safe and sound."
At a recent talk to medical residents, medical students, and nurses in training, Terry Fairbanks (Director of the National Center for Human Factors Engineering in Healthcare) put the opposing case forward. He noted, "No pilot follows safety rules and procedures because he thinks he is otherwise going to crash."
Likewise, I would note, no doctor fails to follow safety rules and procedures because s/he does not care about the well-being of a patient.
What is the difference, then? Terry summarizes, "There is a pervasive safety culture and set of rules that guides airplane pilots, based on a human factors approach."
He added, "The relative degree of accountability (compared to other industries) is not the underlying cause of medical errors."
Being in the human factors business, Terry is a whiz at identifying the physical conditions and cognitive errors that bring about harm, and the interventions that can help reduce them. He notes that most errors are skill-based errors, that is, errors that occur when you are in automatic mode, doing tasks that you have done over and over, indeed tasks at which you are expert.
He explains, "When you are in skills-based mode, you don't think about the task you are about to do. Signs don't work! Education and labeling don't work when you are in skills-based mode. Most medical errors are in the things we do every day."
Accordingly, vigilance and training are not the answer to skill-based errors. Neither is punishment:
"While discipline and punishment has a role when there is reckless behavior, applying discipline to skill-based errors will drive reporting underground and will kill improvement."
Many hospitals approach safety improvement in the wrong way because "most safety-based programs are based on work as imagined, not work as actually done. We need to design our improvement based on real work, not on the way managers believe how work is done."
Interestingly, Terry asserts,"If we just focus on adverse events, we will not make significant progress on creating a safer environment."
Also, he warns: "Don't base safety solutions on information systems. Humans add resilience: Computers do not adapt."
The airlines have noticed this and have adopted solutions that are attuned to cognitive errors. Recall this summary from Patrick Smith:
We’ve engineered away what used to be the most common causes of catastrophic crashes. First, there’s better crew training. You no longer have that strict hierarchical culture in the cockpit, where the captain was king and everyone blindly followed his orders. It’s team oriented nowadays. We draw resources in from the cabin crew, people on the ground, our dispatchers, our meteorologists, so everyone’s working together to ensure safety.
The modernization of the cockpit in terms of materials and technology has eliminated some of the causes for accidents we saw in the ’70s into the ’80s. And the collaborative efforts between airlines, pilot groups and regulators like the Federal Aviation Administration and the International Civil Aviation Organization, a global oversight entity, have gone a long way to improving safety on a global level.
Here's more about the Commercial Aviation Safety Team, through which virtually anyone who sets foot in an airplane, touches it, or monitors its travel is expected and empowered to submit a report about potential safety hazards.
In summary, it is not the personal risks faced by doctors compared to pilots that kill and harm patients. It is the fact that the kinds of solutions needed in health care are just at the gestational stage. Facile comments that doctors don't care as much as pilots are just plain wrong and divert attention from the steps that can and should be taken to learn from the airline industry.
4 comments:
Thanks a lot, Paul, for your insightful article about the health system.
It brings to mind a quote (repeated in my own words) from Jay W. Forrester about the most important person in an airplane: "Who is the most important person in an airplane? Of course the pilot plays a vital role; however, there is a more important one: the plane designer!"
We can't leave out of the discussion about the quality of the health system the fact that it was designed in large part not by practicing doctors but by other stakeholders with their own intentions. Changing the design now is like switching out the landing gear, electronics, or wings during full-speed air travel, while paying customers are on board.
Doesn't make much sense, does it?
So what is the conclusion?
Take the system down, prototype in a way that achieves the intended outcome, spread that learning into new practices, and get stakeholders, mainly the patients (who are the paying customers), on board.
Not an easy task, but as you showed by leading BIDMC out of near-bankruptcy, it is possible step by step (with challenges along the way).
Cheers, Ralf
PS: May I post this article on my group http://xing.com/net/lean ?
Of course! Thanks.
This is a great article and very timely, as I continue to see many hospitals blame individuals for errors without realizing the harm they are doing to patients and their organizations. I'd add a couple of points that address the increased biologic complexity of healthcare compared to aviation. First, the "flight" in healthcare is not a procedure in the operating room; it is the entire cycle of care for a patient's problem (which can be months or years). For safe care coordination, the team must be there throughout, not just in the OR, or even for just the hospital stay, but for the patient's entire cycle of care (a new system design for patient care, as Ralf described above). Second, there is the reality of biological complexity in healthcare: the same treatment that helps one person might harm another, so standardized (linear and not adaptable) care processes won't work in healthcare; they must be multi-optional and dynamic. By applying complex systems data analytics over time, we will be able to better predict the best treatment options for sub-populations of people, but that will take time. For now, we should let patients decide what they think is the best care process for them. This is just now being recognized with cancer screening.
I've always thought something was wrong with this facile assertion. You've inspired me to tackle the next facile assertion: that 75% of cost is caused by the 50% of people with chronic disease, so that if we can make people healthier we can save a lot of money.