Monday, July 14, 2008

Lessons from "never" events: Mental model shifts

In a post below, I mention some lessons that Tom Botts of Royal Dutch Shell and his senior team learned after a serious accident on one of their drilling rigs, lessons that made them rethink their approach to safety. Since then, I've had a chance to exchange a few emails with Tom, and he was kind enough to send me an annotated version of the major insights resulting from his experience. I post it below. I think there are lessons here for all of us at BIDMC and in other hospitals as well. They are especially pertinent as we follow up on our "never event," the wrong-side surgery, and as other hospitals watch and learn from our experience.

As we work through how to improve, Tom's insights offer guidance and warnings, and they are quite potent. I am particularly attuned to Number 5, because "bolting on" a new rule or procedure is the usual response when something goes wrong, and it then creates a new layer of error-producing problems of its own.

Tom’s mental model shifts as a result of the Brent Bravo fatalities

1. Good results may not reflect underlying performance
Just before the incident, the Brent Bravo platform’s safety and operating performance was very good, as measured by our normal key performance indicators. The operating performance dashboard, which listed ‘traffic lights’ for all of our key performance indicators, was mostly green, giving the appearance of an operation in control and performing well. Key question: Do the metrics I am looking at really indicate the underlying performance? Or are they giving me false comfort?

2. Challenge the green, and support the red
Related to 1. above. If, on measurement dashboards, we focus on challenging red lights and praising green lights, pretty soon all the traffic lights will be green. Key question: Are the green lights really green? Am I in too much of a hurry to ‘fix’ the red lights, instead of really trying to understand what they are telling me?

3. At the level of rhetoric, there are no dilemmas
It’s easy for senior leaders to stand up and claim, “Safety is our most important priority—I do not want you to compromise safety,” and then leave the people on the shop floor to deal with all the dilemmas of cost, schedule, production, etc. Key question: How do I acknowledge and help people work through the dilemmas they face every day? Or do I leave it up to them to grapple with the tough choices?

4. Lurking in the wall of noise are critical messages
During times of change, there will inevitably be a lot of feedback expressing concern over the change. A key for senior leaders is to resist the urge to dismiss the “noise” and chalk it up to “they just don’t want to change”. Key question: How do I really try to understand the concerns that people have and use that to deliver an even better product? Do I effectively play back the concern so people feel they have been heard?

5. Bolting on best practices may make the system worse
We love to identify and apply best practices. We tell our people to stop reinventing the wheel and to find someone who has already solved the problem. Nothing wrong with that, but adding things onto the system without fully understanding their impact on the whole system can result in worse performance, not better. Key question: Have I fully considered the unintended impact of applying this “best practice” change to my system?

6. The operations professionals may not see it either
As a senior leader, I assumed the professionals on the shop floor would have the knowledge and the empowerment to stop operations if they felt conditions were unsafe. But we have developed incredibly complex systems, and it’s hard for even experienced professionals to know whether they are operating “outside the envelope.” Key question: Do the people on the shop floor making the day-to-day decisions have the competence and the deep understanding of the system they are operating to know when to say ‘STOP’?

7. I’m enrolling somebody in something every minute
We know, as senior leaders, that all eyes are on us and that it’s especially important for our messages to be consistent and well thought through. But we aren’t enrolling people only when we are giving speeches or making presentations; we enrol people with every word and action. Key question: What do I do at the coffee pot, in idle chatter with staff, or when I think I’m having a private conversation, that may enrol people in something different from my “public” messages?

8. A system full of well-intentioned, competent people working world-class systems and trying their best to meet expectations can still produce fatalities
Probably the most profound learning for me. In the Brent Bravo story there were no obvious ‘villains’, but rather a number of causal patterns that came together to produce a tragedy. The whole point of Deep Learning for my senior leaders and me was to see ourselves in the system and to recognize which causal patterns we could have broken, had we had a better appreciation for the unintended consequences of our many well-intentioned decisions. Key questions: Am I asking the right questions? Am I curious enough?

9 comments:

Anonymous said...

#3 and #8 struck a chord with me. I have seen #3 occur so many times. The front line people are not involved in the decision making. Then, if well-intentioned executives DO involve them, they lack the training in quality/safety thinking to understand what the executives are after, thus reinforcing the notion that they shouldn't be involved. It's a vicious circle, broken only by constant immersion of all employees, of all types (from bottom to top, volunteers and part-timers too) and all shifts, in a true "culture" of patient safety.

#8 is well known in medicine and has been particularly studied by malpractice insurance companies. In my hospital years ago, a fatal case of cerebral malaria ("blackwater fever") in the ED was diagnosable but was missed through a train of cognitive and performance errors: the patient (a frequent traveler to malaria-endemic areas) not telling the physicians he had stopped his antimalarial drugs, our best microbiology lab tech missing the few malarial forms on the blood smear, complacency on the part of the ED physicians in the face of the "negative" smear, etc. When we analyzed it, there were a ton of places where one change would have saved the patient. Truly sobering.
All Tom's points are well taken but illustrate just how treacherous the field of safety is.

nonlocal

Anonymous said...

i have been thinking about your posts last week and today.

as i started to read the globe article i had that sinking feeling: how awful that a serious but atypical event would occur just as you have been making so much progress. but you have handled it with all the thoughtfulness, sensitivity and transparency i would have expected.

but you are correct. the understanding of organizational processes needs to go much deeper. real culture is the sum of joint experience and microincentives. too often, in both the nuclear and medical worlds, "safety culture" is merely proclaimed. we know that what counts is what is learned and passed on.

the best work i know in this area was done in the 90's, about the front line crews that handle the landings of jets on aircraft carriers. for many technical reasons there is still little margin for error in these landings. that means it never gets routine: every landing is edgy, and if everybody is not alert, very bad outcomes are possible.

the adaptive response is a team with a flat hierarchy and nonstop communication. everybody on the team, regardless of rank, is backstopping everybody else all the time (well, 99% of the time; nobody is completely perfect). which brings us back to your ortho surgeon who ignored the markings on the patient's body and skipped the time-out. the key question is why nobody else in the room called him on it; it is the team dynamic that is flawed.

this is the opposite of the intervention you told us about last year, where nurses and residents on the wards have been empowered to make the night call to the attending when they sense trouble. if the press account is accurate, i wonder how much the old surgical authority still lives in ORs...

Anonymous said...

Will the response be as daring and comprehensive as the revelation? Given that many (most?) in the community do not know the rate of 'never events,' they may assume this became public only because it happened. It is then rational for them to ask for something just as big in return.

Anonymous said...

I don't view the revelation as "daring", but to answer your question, a daring response is not likely. The response has to be designed to work in the thousands of cases that occur every year, not to meet a particular definition of daring. It might actually be composed of some things that are pretty mundane, but effective. Comprehensive, though? For sure.

In any event, we will share it internally and externally once we figure it out.

Unknown said...

I've begun looking into the literature on treating healthcare as a complex adaptive system (CAS). (A CAS's internal relationships are deeply intertwined and dynamic, and they incorporate feedback.)

Never events, and how to address them, seem to lend themselves to this type of analysis. (As do Tom's mental model shifts.)

Is CAS analysis of healthcare still a valid concept? Paul, is this a familiar concept to folks in your position?

A short intro on CAS and healthcare: "Health Care as a Complex Adaptive System: Implications for Design and Management," by William B. Rouse.

Anonymous said...

In the blood bank we follow strict protocols to ensure safe blood transfusions for our patients. There are times when not allowing a variance for a "special circumstance" has us patiently listening under pressure to an understandably frustrated anesthesiologist who wants to start surgery, or to a compassionate nurse dreading having to redraw a sick patient, all because we must insist on enforcing these protocols. It's a matter of life and death for a patient receiving a blood transfusion. It takes just one deviation to make a disaster.
I don't know what the dynamic is in surgery, although I know it is a team concept. If so, in theory, team members support each other as they each carry out their individual roles. That makes it everyone's responsibility as a team member to ensure all protocols are met. I understand that the surgeon is in charge, but everyone must insist that protocols are adhered to, even if it means feeling uncomfortable. I know how that feels, but the consequences would feel worse.

Anonymous said...

As to #5, here's an interesting article on "best practices".

The key learning? "The 'Best Practice' that is identified is the methodology that led to the solution and not the solution itself. The focus is how did you develop the solution and not what did you develop as the solution."

Anonymous said...

Reviewing the comments by john norris and pam, I am struck by the relevance of both: one to the very big picture (john's) and one to the immediate issue (pam's). Pam is quite right; the blood bank may be the most safety-conscious department in any hospital. It rarely, if ever, allows deviation from protocol despite the impassioned protests of clinicians. (We all have stories of clinicians swearing they "remember" whose blood they drew, only to have the resulting blood type not match the patient's.) The blood bank techs are inured to the resulting abuse and, in extreme cases, can refer the call to the M.D. blood bank director for backup. One wonders if the non-M.D. personnel in an OR need this same sort of training/attitude. Although unorthodox, getting the blood bank professionals together with the non-M.D. OR personnel might transmit some of the necessary rigidity of attitude and intestinal fortitude.
As for john's link, I only understood it superficially, but it now makes sense to me that engineers may actually have much to contribute to attempts to organize the health care "system" of the future that we all would like to see. I recommend reading the link.

nonlocal M.D. (and former blood bank medical director)

Unknown said...

Thanks, nonlocal M.D. It seems that the CAS literature I have looked at comes more from academia and sociology than from engineering. (But it is still very new to me.) As you point out, Pam's note is a good example.

I think Pam sees herself as a gatekeeper at the blood bank. The gatekeeper has a commonly understood role, bordering on the mythological. (I'm thinking of the lowly sentry stopping the general's convoy.) The gatekeeper's authority is understood to cross hierarchy.

More than a simple role requiring training and procedure, being a Gatekeeper is part of the meaning of Pam's time. As humans, we inhabit meaning, and Pam's informs her actions both great and small.

Those in the OR will need a similar concept of meaning. Procedures can be overlooked and technology worked around, but the meaning of one's work is part of who one is, and that is harder to ignore.

What sort of meaning is appropriate for the agents in the OR? Perhaps it is the duty of a Gatekeeper, the adherence of a Technician, or the responsibility of a Patient Surrogate. Those in the OR will need a clear vision of who they are and of the meaning of their work.

How can one instill or change that sort of meaning in others? I am not sure. I think it would require transformational leadership and the creation of a compelling vision.

I'm continuing my studies into CAS and hope to learn practical uses. If anyone has any suggestions, feel free to let me know (here or elsewhere).

Thanks to Paul and the rest-

John