Wednesday, June 30, 2010

An astronaut speaks

Kathryn Schulz, author of Being Wrong: Adventures in the Margin of Error, sent along this link to a recent Slate interview with James Bagian, director of the VA's National Center for Patient Safety. Bagian, a former astronaut, had some great observations about improving quality and safety in the health care environment. Some excerpts that I like:

You can't change the culture by saying, "Let's change the culture." It's not like we're telling people, "Oh, think in a systems way." That doesn't mean anything to them. You change the culture by giving people new tools that actually work. The old culture has tools, too, but they're foolish: "Be more careful," "Be more diligent," "Do a double-check," "Read all the medical literature." Those kinds of tools don't really work.
One thing we do that's unusual is we look at close calls. In the beginning, nobody did that in healthcare. Even today probably less than 10 percent of hospital facilities require that close calls be reported, and an even smaller percentage do root cause analyses on them. At the VA, 50 percent of all the root cause analyses we do are on close calls. We think that's hugely important. So does aviation. So does engineering. So does nuclear power. But you talk to most people in healthcare, they'll say, "Why bother? Nothing really happened. What's the big deal?"
In theory, punishment sounds like a good idea, but in practice, it's a terrible one. All it does is create a system where it's not in people's interest to report a problem.


Anonymous said...

A fascinating interview. I always wondered why the Joint Commission doesn't require root cause analysis of near misses - after all, their sentinel event definition includes death or serious physical injury, "or the risk thereof."

But if one uses non-investigation of near misses as a surrogate indicator for a non-safety culture, that means 90% of hospitals have yet to develop a culture of safety - a disturbing figure, indeed.

The one issue on which I disagree with him is caring for practitioners after a medical "error" - his quote:

"In my opinion, it's a nice thing to do, but it's not the major issue. Quite honestly, I think: "Get over it and grow up." I come from aviation, and we don't have pilot support groups."

may originate from the fact that pilots are usually included among the victims of their own errors, and therefore are not around to be supported.


Anonymous said...

That's an excellent introduction to the changes that are needed.

It also helps to include an explanation that the vengeance-based quality control presently in use is not inherently stupid. People are not fools for using vengeance as a response to serious errors; they are using a tool that is effective against a different and older problem. Vengeance is an effective response to sociopathic behavior. The current quality and malpractice system is a natural evolution from the controlled vengeance of the criminal legal system, which itself evolved to replace the socially limiting mechanisms of vendetta and warfare found in the dark ages and earlier cultures.

Our problems in healthcare are not generally the result of criminal or sociopathic behavior, so these vengeance-based systems should be put into a secondary role reserved for the infrequent criminal and sociopathic acts.

Anonymous said...

Root cause analysis is only half the battle. The organization then needs to implement solutions based on the root causes found.

My opinion is that many organizations do not identify root causes on near misses because they do not really know how to identify a root cause, or because they have been ineffective in developing or implementing solutions. Perhaps a root cause of not looking for root causes is outdated leadership practices.

Kudos to the VA.

Anonymous said...

Anon 12:01:

The Joint Commission has the following expectations of an institution following a sentinel event:

"When a sentinel event occurs, the accredited organization is expected to conduct a timely, thorough and credible root cause analysis; develop an action plan designed to implement improvements to reduce risk; implement the improvements; and monitor the effectiveness of those improvements."

Therefore, unless a hospital has been lucky enough to never have had a sentinel event (unlikely except perhaps in small institutions), the policy and procedure for investigating and implementing solutions should already be in place, and could easily be applied to near misses.

My own suspicion is that defining a "near miss" is problematic to some extent and allows concern over $$ allocated to such activities to prevail, particularly with uneducated leadership.

However, as to your question of effectiveness, therein lies another rub. But that's no excuse for not even trying, is it?

nonlocal MD