Here is a statement* that Oliver Wendell Holmes, Sr., as dean of Harvard Medical School, gave in an introductory lecture to the medical class on November 6, 1861:
Science is the topography of ignorance. From a few elevated points, we triangulate vast spaces enclosing unknown details. We cast the lead and draw up a little sand from the abyss we may never reach with our dredges.
And from Jules Verne, Journey to the Center of the Earth:
Science, my boy, is composed of errors, but errors that it is right to make, for they lead step by step to the truth.
I think you would be hard-pressed to find recent graduates from medical schools who would not understand these quotes and find them inspirational. After all, medical students are steeped in the scientific method. Those who go on to academic hospitals apply that method in their scientific research.
Then they enter the clinical setting, and many put aside that method. They rely on judgment, memory, expertise, instinct, creativity, and anecdote in treating their patients.
Brent James has put it this way:
We continue to rely on the "craft of medicine," in which each physician practices as an independent expert -- in the face of huge clinical uncertainty (lack of clinical knowledge; rapidly increasing amount of medical knowledge; continued reliance on subjective judgment; and limitations of the expert mind when making complex decisions).
The scientific method relies on establishing a base case against which hypotheses are tested. That base case often does not exist in the clinical setting because there is a large degree of variation in clinical practice. How can a hospital or group of doctors test new approaches to care delivery for efficacy relative to a base case when the base case does not exist?
"These things happen" is often the result. A certain number of cases of harm to patients are viewed as an irreducible statistical percentage. There is no scientific validation that that number is, in fact, an irreducible number. By anecdote, it becomes the standard of care.
In this way, our finest doctors betray their own training as scientists. Perhaps it is not their fault, in that the medical schools do not explain that the same method that is used in basic science research can be applied to clinical process improvement. As the Lucian Leape Institute notes: "[M]edical schools and teaching hospitals have not trained physicians to follow safe practices, analyze bad outcomes, and work collaboratively in teams to redesign care processes to make them safer."
The "bad outcomes" are the errors that Jules Verne urged us to make and learn from. For two years, the IHI Open School has been taking comments about the wrong-side surgery that took place here at BIDMC and about our decision to broadly publicize that surgery to our entire staff. My colleague Deepa posed the following question: "What do you think of the way the hospital responded to the error?"
I have been watching the replies over the last two years, and I have been pleased by the near unanimity and enthusiasm for the transparency with which we dealt with this issue.
Here are three recent comments:
Disciplining may work in some cultures, but we as a society learn from our mistakes. The culture that is developing may seem new to some people who have been in the profession for many years and have become accustomed to doing things a certain way. We need to change that view and have them realize that making a mistake and holding their hands up to it is not a punishable act. It is something that we can learn from, and by taking the steps that have been laid out for them with new patient safety protocols, they can reduce mistakes.
My work culture is changing, but I can remember times when we were afraid to make mistakes because we didn't want to be "the example." Now we are looking into errors from a systems approach and are creating a more transparent culture. We are trying to create an information board . . . which we hope will show staff and guests that safety and quality are our priority and that mistakes do happen. From all the changes we have made so far, I do see a difference in morale.
When an error is shared, everyone benefits by knowing what not to do. The patient and the patient's family also feel as if the institution was not trying to provide a cover-up for the error. This approach allows everyone to learn valuable lessons, while admitting to the patient that there was an error and trying to make the wrong right.
Dr. Ernest Codman propounded this approach in the early 1900s. An article in 2008 noted:
...A century later, the medical profession is still struggling with the same issues as though they were new. Dr. Codman was right then, and he is right now. Fundamental to the quality movement and American medicine in the 21st century are the same peer review, standardization, systems engineering, and outcome measurement issues. Publishing results for public scrutiny remains a controversial topic. We should embrace transparency as a component of our tipping point strategy to ignite the change we all need to transform our organizations and our profession.
The path is clear: Reduce variation, admit errors, test out new approaches to clinical processes, measure and publish the results. Repeat until done.
P.S. You are never done.
*With thanks to HMS Dean Jeffrey Flier, who reminded me of these quotes in a recent testimonial to one of his predecessors, Dan Tosteson.