Sunday, February 05, 2012

Flying blind in hospitals

A recent article in Bloomberg Businessweek reminded me about the Commercial Aviation Safety Team (CAST), which grew out of a commission established by President Clinton and was charged with developing a process for reducing the rate of airline accidents by 80% over ten years.  The result:

By 2008 CAST was able to report that by implementing the most promising safety enhancements, the fatality rate of commercial air travel in the United States was reduced by 83 percent. 

What's the current goal?

Reduce the U.S. commercial aviation fatality risk by at least 50 percent from 2010 to 2025 and continue to work with our international partners to reduce fatality risk in world-wide commercial aviation.

The methodology is pretty straightforward:  Analyze data; propose process or equipment improvements; put in place action plans and measure their success.  But don't only analyze data about actual accidents.  Near misses and observations are brought into play:

CAST identifies precursors and contributing factors to ensure that resources address the most prevalent categories of accidents.

But to analyze data, you have to set up a system that allows it to be collected in a nonpunitive fashion.  CAST uses the Aviation Safety Information Analysis and Sharing (ASIAS) system, which connects 46 safety databases across the industry.

There are currently 40 member airlines participating in ASIAS. The program has evolved to the point that ASIAS now has access to Flight Operations Quality Assurance (FOQA) programs from 21 operators and Aviation Safety Action Partnership (ASAP) data from flightcrews, maintenance and other employees from 37 operators. ASIAS has begun accessing reports in the Air Traffic Safety Action Program (ATSAP), which provides air traffic controllers with a way to report potential safety hazards. Other Air Traffic Organization (ATO) employees will be added to the program in the future.

In other words, virtually anyone who sets foot in an airplane, touches it, or monitors its travel is expected and empowered to submit a report about potential safety hazards.

And here is a powerful statement:  "Safety improvements are made not only through FAA regulations, but also through CAST."

Compared to this approach, American hospitals are flying blind.  Let's start with what's missing:

The adverse event reporting system in many hospitals is inadequate, both in design and use.  It is inadequate in design because it is not convenient and easy to use.  It is inadequate in use because a blame mentality is often associated with errors, pushing them underground.

It is also inadequate because it does not capture near misses, which outnumber adverse events by two or three orders of magnitude and are often the best indication of systemic problems in hospitals.

Within and across hospitals, standardization of clinical processes is often missing.  Think back to the concept of shared baselines described by Brent James.  Remember that concept?

1 -- Select a high priority clinical process;
2 -- Create evidence-based best practice guidelines;
3 -- Build the guidelines into the flow of clinical work;
4 -- Use the guidelines as a shared baseline, with doctors free to vary them based on individual patient needs;
5 -- Meanwhile, learn from and (over time) eliminate variation arising from the professionals, while retaining variation arising from patients.

Consider your own hospital and think about whether and how this approach is used, even in its main areas of specialization.  If you are honest, you will admit that it is sorely lacking.  If standardization is lacking in one hospital, it certainly cannot be present across a number of them.  Without standardization, process improvement is problematic.  After all, how do you test a hypothesis about an improvement to your clinical process if you do not have a baseline against which to compare the idea you are testing?
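
To make that point concrete, here is a minimal sketch of the before-and-after comparison a shared baseline makes possible.  The counts, time periods, and variable names below are hypothetical, and the sketch assumes Python with scipy available; it illustrates the statistical reasoning, not any particular hospital's reporting system.

    # Hypothetical: did a change to a standardized clinical process reduce an infection rate?
    from scipy.stats import chi2_contingency

    # [events, non-events] during the baseline period (shared baseline in place)
    baseline = [18, 2000 - 18]
    # [events, non-events] after the proposed process change
    after = [9, 2000 - 9]

    chi2, p_value, dof, expected = chi2_contingency([baseline, after])
    print(f"p-value = {p_value:.3f}")
    # A small p-value would suggest the change, rather than chance, explains the drop.
    # Without the baseline row, there is nothing to compare "after" against --
    # which is exactly the problem with unstandardized clinical processes.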

The regulatory regime for hospital quality and safety is schizophrenic.  The Joint Commission does its best to create accreditation standards that are thoughtful and up to date; but the regulatory agency with ultimate clout, CMS, relies on Conditions of Participation that are out of date, picayune, and often counterproductive.  Beyond its formal regulations, CMS will sometimes issue "subregulatory guidances" based on internal bureaucratic theories detached from clinical needs and process improvement principles.  A hospital can receive a visit from well-qualified JC surveyors and pass its accreditation with a spectacular score, only to find itself a few weeks later visited by dozens of bureaucratic CMS surveyors who deem it out of compliance with the CoP regulations or guidances.  Whipsawed by this poor regulatory framework, doctors and administrators focus time, money, and personnel on the "must do" rather than on the "should do."

Professional training is inadequate.  Doctors are trained to do some things very well, but medical schools still fail to teach the application of the scientific method to clinical process improvement.  This field is viewed as academically uninteresting by the powers that be.  As noted by the Lucian Leape Institute:

“Medical schools are not doing an adequate job of facilitating student understanding of basic knowledge and the development of skills required for the provision of safe patient care.”

“The medical education system is producing square pegs for the delivery system’s round holes,” said Dennis S. O’Leary, MD, President Emeritus of The Joint Commission, a member of the Institute, and leader of the initiative. “Educational strategies need to be redesigned to emphasize development of the skills, attitudes, and behaviors that are foundational to the provision of safe care.”  

President Clinton appointed his commission after TWA Flight 800 exploded over the Atlantic Ocean in July 1996, killing 230 people.  More people than that are killed every day in American hospitals.  I am not talking about people who die from their diseases.  I am talking about people who are killed by preventable harm.

Back to airplane pilots, recall what Captain Sullenberger says: "I wish we were less patient. We are choosing every day we go to work how many lives should be lost in this country."

He remarked on the scattered application of systemic approaches to safety in the health care industry: "We have islands of excellence in a sea of systemic failures. We need to teach all practitioners the science of safety."

3 comments:

  1. Amen and Amen. You summed it up beautifully and hit the nail on the head. Having spent many years as a Total Quality Management trainer and consultant in industry, I can attest to everything you said. I just want to emphasize the importance of the "culture" shift that is required. Until the environment for reporting errors and the problems that can lead to errors is fixed, it will be hard to make the rest happen. Jessica Scott, MD, JD from the IACT Program in NC is trying to help make that culture shift with her program. Check out the organization at http://iactprogram.com. Would like to know what you think.

  2. There is a critical under-utilized, under-analyzed, and under-valued system of call-outs in hospital care: patient reporting. Also known as 'patient complaints.' The name itself speaks volumes (what attention would a 'doctor complaint' system get?), and it is relegated to customer service departments, which in most industries sit at the low end of the status ladder.

    Given the resource constraints of many organizations, wholesale replacement of adverse-event reporting systems is unlikely. And so we get yet another excuse for inaction, poor data quality, and that 1% improvement rate (what is the margin of error?). Why not aggressively pursue every line of data that already exists? Developing a body of evidence of errors, near misses, and preventable harm not yet pursued might make just the case for finally investing in improved, comprehensive reporting systems.

  3. I am reading a book about an aircraft carrier in WWII, as told by crew members. One chapter that struck me had to do with a particular torpedo launched from a bomber, which had problems tracking through the water in a straight line due to a design flaw. The amazing thing was that after every bombing run, the bomber crew was required to report back to the crew member in charge of torpedoes how each of the launched torpedoes had tracked after release. Did it go in a circle? (Not uncommon, unfortunately.) Did it run straight for a while and then veer off track? Statistics were kept and tweaks were made on site by the involved staff.

    Just think if a report were required every time a piece of equipment was used or a procedure was done in a hospital, rather than relying on someone having the thought, time, and interest to (maybe) report what THEY considered an error. The implications are obvious. We need to entirely reframe how we think about these things.

    nonlocal
