Saturday, March 19, 2011

Probably right, or wrong

In the post below, I ask you to make a diagnosis of a medical condition. Most people get it wrong, probably because the actual diagnosis is far removed from the setting presented. People apply their inductive faculties to a new problem, drawing probabilistic inferences from other situations with which they are more familiar.

I attended a seminar on Friday at which MIT's Joshua Tenenbaum presented a theoretical basis for this learning process. If you subscribe to Science Magazine, you can read his recent article on the topic: "How to Grow a Mind: Statistics, Structure, and Abstraction."

It turns out that people are reasonably good at inference, from a very young age, as Joshua notes:

Generalization from sparse data is central in learning many aspects of language, such as syntactic constructions or morphological rules. It presents most starkly in causal learning: every statistics class teaches that correlation does not imply causation, yet children routinely infer causal links from just a handful of events, far too small a sample to compute even a reliable correlation!

In a more theoretical section, the author describes a probabilistic, or Bayesian, model to explain this learning process:

How does abstract knowledge guide inference from incomplete data? Abstract knowledge is encoded in a probabilistic generative model, a kind of mental model that describes the causal processes in the world giving rise to the learner's observations as well as unobserved or latent variables that support effective prediction and action if the learner can infer their hidden state. . . . A generative model . . . describes not only the specific situation at hand, but also a broader class of situations over which learning should generalize, and it captures in parsimonious form the essential world structure that causes learners' observations and makes generalizations possible.

Except when it doesn't work! As several of you demonstrated below, that same probabilistic model can lead to cognitive errors.
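To make the abstraction concrete, here is a minimal sketch of my own -- toy diseases, symptoms, and numbers I made up for illustration; this is not code from the article or the seminar. It shows both faces of the same mechanism: Bayes' rule over two candidate diagnoses generalizes confidently from just two observations, while the identical update rule, handed an overwhelming prior, stays anchored on the common diagnosis despite contrary evidence.

```python
# A toy generative model: two hidden "diseases" produce observable
# symptoms with different probabilities. Bayes' rule inverts the model
# to infer the hidden cause from a handful of observations.
# (Hypothetical diseases, symptoms, and probabilities, for illustration only.)

# P(symptom present | disease)
LIKELIHOODS = {
    "flu":     {"fever": 0.9, "rash": 0.1, "cough": 0.8},
    "measles": {"fever": 0.8, "rash": 0.9, "cough": 0.2},
}

def posterior(prior, observed):
    """Posterior over diseases, given {symptom: present?} observations."""
    scores = {}
    for disease, p in prior.items():
        for symptom, present in observed.items():
            p_sym = LIKELIHOODS[disease][symptom]
            p *= p_sym if present else (1.0 - p_sym)
        scores[disease] = p
    total = sum(scores.values())
    return {d: s / total for d, s in scores.items()}

# Generalization from sparse data: two observations overturn a 9:1 prior.
sensible_prior = {"flu": 0.9, "measles": 0.1}
print(posterior(sensible_prior, {"rash": True, "cough": False}))
# -> flu ~0.2, measles ~0.8

# Anchoring: with an overwhelming prior, the same evidence barely moves
# the answer. The update rule is fine; the prior is miscalibrated.
anchored_prior = {"flu": 0.999, "measles": 0.001}
print(posterior(anchored_prior, {"rash": True, "cough": False}))
# -> flu ~0.97, measles ~0.03
```

The point of the sketch is only that sound probabilistic inference and diagnostic anchoring are the same computation run with different priors, which is one way to read Croskerry's warning below.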

I summarized Pat Croskerry's explanation below:

Croskerry's exposition compares intuitive versus rational (or analytic) decision-making. Intuitive decision-making is used more often. It is fast, compelling, and addictive; it requires minimal cognitive effort and mainly serves us well. It can also be catastrophic, in that it leads to diagnostic anchoring that is not based on the true underlying factors.

Why the dichotomy? How can a learning process that works so well in some cases lead us astray in others? I asked Joshua, and he suggested that it might have to do with the complexity of the issue. For those functions that were important in an evolutionary sense -- e.g., recognizing existential threats, sensing the difference between poisonous and healthy plants -- a quick probabilistic inference was all that mattered.

Now, though, in a complex society, perhaps we get trapped by our inferences. The sense of tribalism that led us to flee from -- or fight -- people who looked different and who might have been seeking to steal our territory or food becomes evident now as unsupported and destructive racial or ethnic prejudice.

Likewise, the diagnostic approach to illness or injury that might have sufficed for simple health threats 10,000 years ago no longer produces the right result in a more complex clinical setting. Think about it: if you were a shaman or healer in a tribe, most conditions or illnesses healed themselves. You recognized the common ailments, you knew you didn't need to do much, and whatever herbs or amulets or incense you used did no harm. If you couldn't cure the disease, you blamed the evil spirits.

In contrast, as a doctor today, you are expected to apply an encyclopedic knowledge to a variety of complex medical conditions -- cancer, cardiovascular disease, liver and kidney failure -- that were relatively unknown back then. (You were more likely to die from something simpler, at a much younger age!) Many cases you see today present a variety of symptoms, multiple causes, and several possible diagnoses. It is no surprise that your mind tries to apply -- in parsimonious form -- a solution. Unless you take care, the likelihood of diagnostic anchoring is actually quite high. As I note below:

Croskerry thinks we need to spend more time teaching clinicians to treat decision-making as a discipline. He feels we should train people to recognize the various forms of cognitive bias, as well as affective bias. Given the extent to which intuitive decision-making will continue to be used, let's recognize that and improve our ability to carry out that approach -- by improving feedback, imposing circuit breakers, acknowledging the role of emotions, and the like.

6 comments:

Anonymous said...

I haven't read much about this, so the answer may be readily evident, but I wonder why computer-assisted diagnostic decision support hasn't caught on better in medicine. It would seem a computer could provide a complete compendium of diagnostic possibilities and prevent at least some of these cognitive errors.

I have heard that, in practice, the computer produces pages of irrelevant diagnoses that quickly exhaust the clinician's patience. Does anyone have any experience with this?

nonlocal

Sharon said...

"When you hear the beating of hooves, don't assume it is horses. Look for the zebras". A lesson learned early in my medical career.

Margo Corbett said...

This is the first time I've heard that diagnostic errors are actually being discussed. My husband and I have each survived a life-threatening diagnostic error. Some states are requiring reporting of certain types of hospital errors. Is anyone counting diagnostic errors?

I spend a great deal of time educating patients and caregivers on the importance of their doctor appointments. Most people take appointments too lightly and don't prepare well for them. They don't consciously realize that life-and-death decisions are being made FOR them.

When I ask the audience how many people tell the doctor what is wrong, hear his answer, and then leave with an order for a test or a prescription, most of the people raise their hands. When I ask how many ask questions, seek alternatives or options, and make a joint decision about what to do going forward, very few if any raise their hands. I find this very unsettling.

I viewed Croskerry's session on intuitive vs. analytic decision-making. I now know why my favorite doctors are the ones who think out loud as they process my situation. They are analyzing instead of going with their gut. I love it because it helps me know what other information to share that might be helpful to him, and it spurs more questions. I leave those appointments feeling that I have a good understanding of my situation and have been part of the decision-making, which leads to more trust and confidence in the doctor and my care.

Many barriers to communication could be broken down if this one technique of thinking out loud were used by more doctors.

Elaine Schattner, MD said...

All the more reason that doctors need time to read and think.

Unknown said...

"Medicine used to be simple, ineffective and relatively safe. It is now complex, effective and potentially dangerous."

Sir Cyril Chantler

Art Papier MD said...

Would you prefer that your pilot search Google for maps or directions on approach, rather than use the in-cockpit systems specifically designed to guide flight? The challenge for those of us working in the field of diagnostic error and point-of-care information is to convince physicians and healthcare CEOs that investment in the right information at the time of decision making is critical. We are currently focused on documentation rather than on tools to aid perception and cognition. The patient safety and quality movement is focused on medication and surgical mistakes, while diagnostic errors outpace those mistakes by a factor of three.

There is a rich history of clinical diagnostic decision support in medicine with a few very capable systems developed over the past several decades. The problem has been that physicians have not used them.

While VisualDx does not cover every symptom or sign in medicine, the system addresses a longstanding need of primary care and ED physicians: pattern recognition. Primary care and ED physicians readily admit to being under-trained in visual recognition. VisualDx helps generalists recognize patterns they have never seen and recognize variant presentations of common diseases.

A pattern-recognition approach to differential diagnosis gives the decision maker objective clues, which are often more meaningful than a list of diagnostic probabilities. VisualDx is used in more than 1,300 hospitals and clinics, and recently the entire VA medical system licensed VisualDx.

I mentioned in prior comments a recent study on diagnostic error in soft tissue infection, in which cellulitis is over-diagnosed as a result of premature closure: http://anagen.ucdavis.edu/1703/1_originals/1_10-00308/article.html

Cellulitis is an extremely common problem, and patients are put at risk of being unnecessarily hospitalized for it; yet the problem has existed for decades. I suggested in my previous comment that doctors are enamored with rare and esoteric stories and fail to focus on fixing the old and repeated errors.

Just as we need patients to check on how well their physicians are listening, patients should also expect their physicians to use professional information systems.

Full disclosure: I am the CMIO of Logical Images, the company that developed VisualDx for professionals (www.visualdx.com).

We also have a free site to empower patients with a rash or skin problem: see http://www.skinsight.com/skinConditionFinder.htm

Art Papier MD