Wednesday, July 13, 2011

Attack the problem, not the people

Back in May, I wrote about an error regarding a kidney transplant at UPMC, where a surgeon was demoted and a nurse was suspended. I wrote, "Guessing from a press report, I am betting that this story is not totally a person problem," suggesting instead that there were systemic problems at UPMC that underlay the error.

Now, the Pittsburgh Post-Gazette offers support for this view. Sean Hamill reports:

The positive hepatitis C test that was missed at UPMC, leading to the shutdown of its living donor kidney transplant program, was not noticed by its entire transplant team despite a highlighted alert in the hospital's electronic records system.

"Everyone just missed it," a source with knowledge of the case said.

The alert was missed by as many as a half dozen people on the transplant team who typically would have reviewed such a test result, according to interviews with several current or former UPMC employees. A number of those interviewed said the problem lies more with the larger system of ensuring that medical errors are caught than with the individuals involved in the incident.

Here's my unsolicited advice to people at UPMC: Attack the problem, not the people.

8 comments:

  1. It's absolutely incredible they would fire people based on this reason for the error, which somehow just came to light (I wonder how?). As a pathologist I can say categorically that missing a positive result on an important lab test happens every day in every hospital in the country. If there were ever an error that cries out for a root cause analysis and system improvement, this is it. (I would mention the word 'checklist', but that would be short-circuiting the root cause analysis process.)

    What I would like to know, instead, is who made the bone-headed decision to take this adverse personnel action. THAT's the one who needs retraining.....

    nonlocal MD

    ReplyDelete
  2. Ok. I'll be the contrarian. Sure, the ideal system design would preclude a whole medical team from neglecting a critical signal. The ideal system would make my car stop at the yellow light rather than those twitching muscles in my foot gunning it through the red.

    So many of the complex signals in medicine assume too much predictability in human behavior (possibly a function of those who design the fixes). And there is very little study of variation - and vagary - in provider behavior, and yet it is ubiquitous. Where is the RCA that says "provider never talked to staff" or "provider eyes on chart less than 4 seconds" or "signoff without discussion"?

    Firing is extreme, and a waste of intensive human resources, unless a pattern of repeat negligence is found. There should be, however, a salient and serious response for failures to protect a patient, including requiring significant participation in system redesign (consider it re-education, if you must).

    Accountability is not a dichotomous choice. Shaming doesn't elicit the kind of response required to redesign medicine. But there is need for a big shift in provider behaviors such that leadership positions are afforded to those who 1) actively reduce dominance behaviors (which studies show suppress safety behaviors), 2) consistently look for and report novel and repeated errors and near misses (they happen every day, do not make it into the voluntary reporting system, and are not amenable to chart review), and 3) encourage others around them to think about their work differently than they learned it in school. When these go on performance reviews in a more than perfunctory way, and new status hierarchies based on safety and patient-centeredness behaviors emerge in the system, we'll know that we are getting somewhere.

    ReplyDelete
  3. Anon 7:53, I agree entirely. I don't see your comment as contrarian, just addressing a wider issue - that of creating a safety culture and reinforcing practice of that culture while disincentivizing failure to practice it (accountability).

    This is, of course, where you want to go, but who has to lead it? The CEO - whose minions in this case probably made the personnel decision.

    My comment addressed a narrower issue - a process for analyzing error and creating a system designed to minimize it. This will be necessary even in an ideal safety culture.

    The problem is, neither the narrow nor wide exists at this institution, apparently.

    nonlocal

    ReplyDelete
  4. I agree, you're 100% right.

    Thanks for the post.

    ReplyDelete
  5. So, let's say we have public reporting.

    Then, let's say that we're all pretty much behaving under the same assumptions and rules as we did before public reporting.

    Why would I report?

    ReplyDelete
  6. Perhaps things wouldn't stay the same . . .

    ReplyDelete
  7. Having multiple people overlook a key piece of information resulting in "medical error" is clearly not a new problem. Shaming and scapegoating do not seem to work well. I think Anon 7:53's comparison to the yellow light - which prompts an extra burst of effort to safely avoid calamity - is all too common post system redesign. The system is excellent (EMR flags, yellow light means prepare to stop) but irrelevant if not followed. Would it be possible to introduce carrots to resolve the problem? What if staff were rewarded for catching oversights that would have resulted in a problem? Off the top of my head, the MRP and other "first-eyes" people would be exempt, but the reward would increase the closer the error got to the patient - e.g., each time the pharmacist or nurse corrects a dose by an order of magnitude.

    ReplyDelete