Wednesday, August 25, 2010

Science is the topography of ignorance

Here is a statement* that Oliver Wendell Holmes, Sr., as dean of Harvard Medical School, gave in an introductory lecture to the medical class on November 6, 1861:

Science is the topography of ignorance. From a few elevated points, we triangulate vast spaces enclosing unknown details. We cast the lead and draw up a little sand from the abyss we may never reach with our dredges.

And from Jules Verne, Journey to the Center of the Earth:

Science, my boy, is composed of errors, but errors that it is right to make, for they lead step by step to the truth.

I think you would be hard-pressed to find recent graduates from medical schools who would not understand these quotes and find them inspirational. After all, medical students are steeped in the scientific method. Those who go on to academic hospitals apply that method in their scientific research.

Then they enter the clinical setting, and many put aside that method. They rely on judgment, memory, expertise, instinct, creativity, and anecdote in treating their patients.

Brent James has put it this way:

We continue to rely on the "craft of medicine," in which each physician practices as an independent expert -- in the face of huge clinical uncertainty (lack of clinical knowledge; rapidly increasing amount of medical knowledge; continued reliance on subjective judgment; and limitations of the expert mind when making complex decisions).

The scientific method relies on establishing a base case against which hypotheses are tested. That base case often does not exist in the clinical setting because there is a large degree of variation in clinical practice. How can a hospital or group of doctors test new approaches to care delivery for efficacy relative to a base case when no base case exists?

"These things happen" is often the result. A certain number of cases of harm to patients are viewed as an irreducible statistical percentage. There is no scientific validation that that number is, in fact, an irreducible number. By anecdote, it becomes the standard of care.

In this way, our finest doctors betray their own training as scientists. Perhaps it is not their fault, in that the medical schools do not explain that the same method used in basic science research can be applied to clinical process improvement. As the Lucian Leape Institute notes: "[M]edical schools and teaching hospitals have not trained physicians to follow safe practices, analyze bad outcomes, and work collaboratively in teams to redesign care processes to make them safer."

The "bad outcomes" are the errors that Jules Verne urged us to make and learn from. For two years, the IHI Open School has been taking comments about the wrong-side surgery that took place here at BIDMC and about our decision to broadly publicize that surgery to our entire staff. My colleague Deepa posed the following question: "What do you think of the way the hospital responded to the error?"

I have been watching the replies over the last two years, and I have been pleased by the near unanimity and enthusiasm for the transparency with which we dealt with this issue.

Here are three recent comments:

Disciplining may work in some cultures, but we as a society learn from our mistakes. The culture that is developing may seem new to some people who have been in the profession for many years and have become accustomed to doing things a certain way. The challenge is to change that view and have them realize that making a mistake and holding their hands up to it is not a punishable act. It is something we can learn from, and by taking the steps that have been laid out for them with new patient safety protocols, they can reduce mistakes.


My work culture is changing, but I can remember times when we were afraid to make mistakes because we didn't want to be "the example." Now we are looking into errors from a systems approach and are creating a more transparent culture. We are trying to create an information board . . . which we hope will show staff and guests that safety and quality are our priority and that mistakes do happen. From all the changes we have made so far, I do see a difference in morale.

When an error is shared, everyone benefits by knowing what not to do. The patient and the patient's family also feel that the institution was not trying to cover up the error. This approach allows all to learn valuable lessons, while admitting to the patient that there was an error and trying to make the wrong right.

Dr. Ernest Codman propounded this approach in the early 1900s. An article in 2008 noted:

...A century later, the medical profession is still struggling with the same issues as though they were new. Dr. Codman was right then, and he is right now. Fundamental to the quality movement and American medicine in the 21st century are the same peer review, standardization, systems engineering, and outcome measurement issues. Publishing results for public scrutiny remains a controversial topic. We should embrace transparency as a component of our tipping point strategy to ignite the change we all need to transform our organizations and our profession.

The path is clear: Reduce variation, admit errors, test out new approaches to clinical processes, measure and publish the results. Repeat until done.

P.S. You are never done.

---
*With thanks to HMS Dean Jeffrey Flier, who reminded me of these quotes in a recent testimonial to one of his predecessors, Dan Tosteson.

6 comments:

Unknown said...

It takes a lot of courage to do what your hospital did.
Irfan

Joe from Madison Carpet Cleaning said...

As a struggling med student, I do find inspiration in those words, especially Jules Verne's. The truth factor seems so elusive in the current setting.

Anonymous said...

Paul, thank you for pointing out this critical distinction between the tools of science, and those of clinical medicine. Academic clinicians, especially, often ruffle at the suggestion that they are not - de facto - scientists. While they may be researchers in specific areas, care delivery is another enterprise altogether. I would underscore that the scientific approach demands not just individual exposure of error, but an institutional building of capacity to identify and respond to error. Clinicians need a system that directs and demands a cooperative aggregation of testable information, rather than an artisanal approach. Sherlock Holmes is a hero, but real improvement demands a Brent James and widespread participation in an epidemiology of care delivery.

Alex said...

Thank you for your insight!

Joe Hess said...

We encourage transparency; we create data dens, data repositories, data warehouses, where anyone can review our practice. There is so much data to review and look at: how do you decide when enough data is enough, and, more importantly, how do you translate that data into information that the staff can truly understand? Could it be that if the staff understood the information and its importance, they would truly feel they could participate more fully -- provide information, data, and insights -- and not be made the example?

Anonymous said...

As a regular reader of this blog, I recall taking the IHI Open School courses and at first being startled, but then totally unsurprised, to see BIDMC's wrong site surgery dissected in detail as a case study.

Granted, Paul quotes from other thought leaders, many of them physicians, in this post; but I wonder if anyone else in our profession shares my embarrassment - yes, embarrassment - that someone with zero background in medicine or health care can hold up such a mirror to us and shatter our complacency. To me, it not only calls into question our practices as individual physicians, but also challenges physician CEOs of other hospitals as to why they have not had these insights or implemented their logical consequences. Particularly in the medical mecca of Boston.

nonlocal MD