As I am reading the book The Lost Art of Finding Our Way by John Edward Huth, I am struck by a correspondence between the kind of confirmation bias experienced by a physician who has engaged in diagnostic anchoring and a phenomenon called "bending the map" that is experienced when people get lost in the wilderness. Here's the quote from the book:
The correspondence between a mental or physical map and our perceptions helps us stay oriented, but one of the first stages of getting lost involves a process called "bending the map." The phrase comes from the sport of orienteering, in which competitors find their way around a series of waypoints that are revealed to them on a map at the start of the race. Competitors can become lost and believe they are in one place indicated on the map and mentally try to force features they see to line up with ones indicated on the map even when the correspondence is poor.
Denial is an effective psychological defense mechanism, and map bending is one form that lost persons often engage in. A lost person might first believe he is located at a certain point on the map, but things around him do not seem quite right. He pays attention to details that confirm what he already believes to be true, ignoring all evidence to the contrary. A lost person may be looking for a creek that flows south on the map. In his mind he's sure that he has arrived at the creek. It flows east, yet he conveniently ignores this fact and follows it anyway. It can take some time, but there comes a moment when an eerie realization hits him that something is wrong and he doesn't know why.
The parallels to doctors who have settled prematurely on a patient's diagnosis are compelling. Evidence that supports the conclusion is accepted. Contrary evidence is ignored.
Recall the story Jerome Groopman tells in his book How Doctors Think:
One of my patients was a middle-aged woman with seemingly endless complaints whose voice sounded to me like a nail scratching a blackboard. One day she had a new complaint, discomfort in her upper chest. I tried to pin down what caused the discomfort--eating, exercise, coughing--to no avail. Then I ordered routine tests, including a chest x-ray and a cardiogram. Both were normal. In desperation, I prescribed antacids. But her complaint persisted, and I became deaf to it. In essence, I couldn't think in a different way. Several weeks later, I was stat paged to the emergency room. My patient had a dissecting aortic aneurysm, a life-threatening tear of the large artery that carries blood from the heart to the rest of the body. She died. Although an aortic dissection is often fatal even when discovered, I have never forgiven myself for failing to diagnose it. There was a chance she could have been saved.
I wish I had been taught, and had gained the self-awareness, to realize how emotion can blur a doctor's ability to listen and think. Physicians who dislike their patients regularly cut them off during the recitation of symptoms and fix on a convenient diagnosis and treatment. The doctor becomes increasingly convinced of the truth of his misjudgment, developing a psychological commitment to it. He becomes wedded to his distorted conclusion.
While Jerry focuses here on the situation where dislike of a patient leads to diagnostic anchoring, we now understand that it can apply in many situations, irrespective of the doctor's personal feelings about the patient.
Let's go further, though, and see if the following emotional reactions also apply to physicians. Huth says:
Suddenly, the lost hiker recognizes that his map and his perceptions don't line up. Panic sets in. The emotional centers of the brain send out warning signals, and perceptions get distorted with a fight-or-flight reaction. Massive amounts of adrenaline flood the mind and body. Breathing and heart rate increase. The person refuses to believe that he's lost and runs frantically in a direction that he's sure will lead back to the trail, only to get deeper into trouble. First one possibility, then another races through his overtaxed mind, and yet he cannot gain any certainty.
"Woods shock" is the term for this kind of anxiety attack brought on by the realization that the subject is lost.
I have seen woods shock occur in physicians. I have seen it in clinics and on the floors and in ICUs. I have seen it during case reviews, when doctors are describing adverse events and trying to figure out what went wrong. When denial sets in, I have seen the figurative equivalent of "running frantically in a direction that he's sure will lead back to the trail." The flailing and emotional distress that follow are painful to watch and, I'm sure, to experience for these people, who have been trained not to be wrong. Rationalization comes into play. Blame of other parties--the nurses, the labs, the residents--is a common response.
In a post almost three years ago, I summarized a talk by Pat Croskerry from Dalhousie University in Halifax, Nova Scotia. Pat says we need to spend more time teaching clinicians about the importance of decision-making as a discipline. He feels we should train people in the various forms of cognitive bias, and in affective bias as well. Given the extent to which intuitive decision-making will continue to be used, let's recognize that and improve our ability to carry out that approach by improving feedback, imposing circuit breakers, acknowledging the role of emotions, and the like. In summary, let's see if we can protect our clinicians from bending the map and experiencing woods shock.
From Facebook:
I have experienced woods shock. A superhuman effort is required to overcome the denial.
From Facebook:
Incisive piece, Paul. I have experienced all of the above! That's why I am beginning to think that in order to make the diagnostic process more reliable we somehow need to remove some of the human element in the process. For example, let's take something simple like varicose veins. Until the 1990s most vascular surgeons assessed them 'clinically' and operated on the basis of their findings. With the advent of ultrasound duplex scanning, we found that the accuracy of clinical examination was only 50%. We might as well have been flipping a coin. Practice changed, and now everyone has a duplex scan.
Patients present with symptom sets, not diagnoses. We need to develop pathways to follow for particular symptom sets that lead to high sensitivity and specificity in diagnosis. The challenge then becomes cost, as a doctor's opinion is cheaper than a duplex scan or a CT, as your case study in the article would suggest. The days of the romanticism attributed to great 'clinical acumen' are passé, I would say. I argue, however, that prompt, accurate diagnosis is in the long term more cost-effective for the system and should justify investment in more near-patient, high-accuracy diagnostic tools driven by clear pathways. In the UK we have a long way to go to achieve this.
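For readers less familiar with the terms the commenter uses, here is a minimal sketch, with purely hypothetical counts, of how sensitivity, specificity, and positive predictive value fall out of a simple 2x2 table of test results. The "clinical exam" numbers are set up to roughly mimic the coin-flip performance described above; the "duplex scan" numbers are an assumed contrast, not data from any study.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, and positive predictive value."""
    sensitivity = tp / (tp + fn)   # how often the test catches real disease
    specificity = tn / (tn + fp)   # how often it correctly clears the healthy
    ppv = tp / (tp + fp)           # chance a positive result is truly positive
    return sensitivity, specificity, ppv

# Hypothetical counts for 200 assessments -- illustration only, not study data.
tests = {
    "clinical exam": dict(tp=50, fp=50, fn=50, tn=50),   # roughly coin-flip
    "duplex scan":   dict(tp=95, fp=5,  fn=5,  tn=95),   # assumed high accuracy
}
for name, counts in tests.items():
    sens, spec, ppv = diagnostic_metrics(**counts)
    print(f"{name}: sensitivity {sens:.0%}, specificity {spec:.0%}, PPV {ppv:.0%}")
```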
The human cognitive condition needs to become common knowledge among all of us who care for people within our healthcare system. But, I don't think that will be nearly enough. With increasing complexity, it is more and more critical to function within diverse teams (including the patient and family) who can help each other avoid bending the map. This was the human factors evolution the aviation industry went through with cockpit (or crew) resource management. Healthcare is much more complex than aviation--there is no reason we should be asking individual human beings (the physician) to make decisions without the input of a diverse team.
I agree that more training in decision-making and biases is a good thing (who would not?), but I don't think it will do much. I recall Daniel Kahneman saying that even though he understands all of the biases and other problems, he STILL makes the same mistakes as everyone else. And the training itself, at least for some, will lead to overconfidence.
The case you describe seems to point yet again to the need for a software-driven diagnostic system. Machines don't dislike people.
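To make that suggestion concrete, here is a toy sketch--not any real product or algorithm, and every number is invented--of how a software aid might keep updating the probability of competing diagnoses as findings arrive, so that evidence against the favored diagnosis is never silently dropped. The diagnoses and findings loosely follow Groopman's case above.

```python
# Toy illustration of an explicit diagnostic update (all likelihoods invented).
FINDING_LIKELIHOODS = {
    # P(finding | diagnosis) -- hypothetical numbers for illustration only
    "chest discomfort":        {"reflux": 0.6, "aortic dissection": 0.9},
    "normal chest x-ray":      {"reflux": 0.9, "aortic dissection": 0.4},
    "no relief from antacids": {"reflux": 0.2, "aortic dissection": 0.9},
}

def update(priors, findings):
    """Naive Bayes style update: multiply priors by each finding's likelihood, then normalize."""
    posteriors = dict(priors)
    for finding in findings:
        for dx in posteriors:
            posteriors[dx] *= FINDING_LIKELIHOODS[finding][dx]
    total = sum(posteriors.values())
    return {dx: p / total for dx, p in posteriors.items()}

priors = {"reflux": 0.95, "aortic dissection": 0.05}   # the anchored impression
findings = ["chest discomfort", "normal chest x-ray", "no relief from antacids"]
print(update(priors, findings))
```

Even with these made-up numbers, the findings that do not fit the favored diagnosis push the probability of dissection upward instead of being ignored, which is the whole point of forcing the update to be explicit.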