Sunday, March 23, 2014

It could have happened to anybody

My rule of thumb is that when an error is made and you can say, "It could have happened to anybody," there is a systems problem behind the error.  Here's a story that demonstrates this so clearly, courtesy of our friends at MedStar Health.

As you watch the video, imagine the more common scenario in hospitals, where the clinician is blamed and where the underlying problem goes unsolved.

Thanks to Annie for sharing her story!

6 comments:

Stephen Borstelmann MD said...

Paul,

That’s a great share. And you are correct to point out that it's far more common to have the clinician blamed and the underlying problem unsolved. Just because humans are fallible does not mean technology is infallible. And we are probably early in our understanding of the nuances of inter-related complex processes like we have in healthcare. As we go boldly into the era of big data and real-time predictive analytics, there will be no substitute for experience and common sense in avoiding further system-type errors.

P.S. Is this a nursing recruitment video for MedStar? If I were a nurse, after watching that, I would probably want to work there.

Mary K Parker said...

Fantastic video. We were just discussing PSRs in our staff meeting today and how there doesn't seem to be a good feedback process in place for the multitude of PSRs that are submitted. I know that in many instances, I never hear the outcomes of the PSRs, what was implemented, what was changed, or even if anyone read my PSR submission.


I started my own "PSR Tracker" that documented the date of submission as well as the date of the event with the tracker number and added a general comment of the topic so I could trend events on my own unit, but that would only capture those PSRs I submitted or those that were referred to me. I have no idea what happened to the rest of them.

Chris Meeker said...

I am disappointed that the device display was not addressed in the video at all. The device is part of the system, and it played a part in this case. I am left with the impression that the "solution" was simply another workaround for front-line staff to implement: "Remember not to trust error messages on this glucometer." Real system solutions would create system ownership and resolution of problems, not offload them onto front-line staff.

Anonymous said...

I second, third, and fourth Chris' comment, as a former hospital lab medical director in the era when point-of-care devices were just coming into vogue. Unfortunately, point-of-care devices are not nearly as perfect as people like to think, but that point is getting lost outside the laboratory.

nonlocal MD

Anonymous said...

I don't understand how the nurse "misread" the glucometer. She read what was displayed. Instead of a systems failure, why not simply call it an equipment failure?

just-a-patient

Terry Fairbanks, MedStar Health said...

Clarification from MedStar Health: We appreciate the interest in Annie’s story and want to address the excellent comments. The short video sharing Annie’s story is just one piece of the story, intended to convey the critical point about the impact on providers and safety culture when the traditional ‘blame game’ approach is used. At MedStar Health we are undergoing a transformation in safety to an all-encompassing systems approach. Our senior leaders are all on board, from the C-suite to attorneys to risk managers to hospital leaders. But we have 30,000 associates, and as we roll this program out, we need a way to convince everyone of the impact this has on our healthcare workers on the front lines.

Although healthcare as a whole likes to think we are following the systems approach, in most cases we really are not. We often fail to find the true contributing factors in adverse events and in hazards, but even when we do, we frequently employ solutions that, viewed through the lens of safety science, are either ineffective or unsustainable. Very often, events that are facilitated by numerous system hazards are classified as “nursing error” or “human error,” and closed with “counseling” or a staff in-service. By missing the opportunity to focus on the design of both system and device factors, we may harm individuals personally and professionally, damage our safety cultures, and fail to find solutions that will prevent future harm. It is the damage to the healthcare provider and culture that this video was intended to highlight.

The great majority of caregivers involved in a harm event are devastated about what has happened, yet for far too long many have had to navigate this storm alone. It is up to us to demand that a systems approach be a given in our healthcare workplace, along with the just culture that cultivates the sharing of knowledge and helps prevent patient harm from occurring altogether.

To tell Annie's story, we chose to focus on the main theme: the human cost to our healthcare workforce when we fail to cultivate a just culture and a systems approach to any unfortunate harm event. There is much more to this story, as readers point out. You will be happy to know that the patient fully recovered, that Annie is an amazing nurse and cultural leader, that the hospital leaders apologized, and that all glucometers within our system were changed to display clear messaging of blood glucose results. We believe we have eliminated the hazard that would have continued to exist if we had only focused on educating, counseling, and discipline. We also communicated the issue directly to the manufacturer, and presented the full case in several venues, in an effort to ensure that this same event does not occur somewhere else.

This event, which occurred over three years ago, gave us the opportunity to improve care across MedStar Health’s ten hospitals because of the willingness of our healthcare providers to ask for help in figuring out a threat to our system, and also because leadership followed their instincts: that good healthcare providers should not be punished for system failures. Thank you all for raising these important issues, and for making us realize we needed to tell the rest of the story. We have updated the YouTube description as well.

And, thank you Paul Levy for opening up this discussion.

RJ (Terry) Fairbanks, MD MS, Director, National Center for Human Factors in Healthcare, MedStar Health
Tracy Granzyk, Director, Patient Safety & Quality Innovation, MedStar Health
Seth Krevat, MD, Assistant Vice President for Safety, MedStar Health
David Mayer, MD, Vice President for Quality and Safety, MedStar Health