One day, a student of his was practicing a piece by Brahms when [Boris] Goldovsky heard something wrong. He stopped her and told her to fix her mistake. The student looked confused; she said she had played the notes as they were written. Goldovsky looked at the music and, to his surprise, the girl had indeed played the printed notes correctly — but there was an apparent misprint in the music.

At first, the student and the teacher thought this misprint was confined to their edition of the sheet music alone. But further checking revealed that all other editions contained the same incorrect note. Why, wondered Goldovsky, had no one — the composer, the publisher, the proofreader, scores of accomplished pianists — noticed the error? How could so many experts have missed something that was so obvious to a novice?

Goldovsky’s experiment yielded a key insight into human error: not only had the experts misread the music — they had misread it in the same way. In a subsequent study, Goldovsky’s nephew, Thomas Wolf, discovered that good sight readers report that they do not read music note by note; instead, they rely on their recognition of familiar patterns and on their ability to organize the music into those patterns and dependable cues.

. . . In short, they don’t read; they infer. Moreover, this trait is not unique to musicians: pattern recognition is a hallmark of expertise in any number of fields; it is what allows experts to do quickly what amateurs do slowly.
Goldovsky’s insight offers a useful metaphor for understanding the crisis on Wall Street: Not only did hedge-fund managers, bankers and others misread the danger involved in many of their investments, but they misread them in the same way.
The author's conclusion: "These types of errors are most likely to be discovered by those who, like Goldovsky’s young student, look at the world with new, unblinking eyes."

In an earlier post, I discussed forms of cognitive bias that have been documented by psychologists and neurologists. You may also recall this presentation by Pat Croskerry about intuitive decision-making and the "cognitive miser function," a tendency to settle into the form of decision-making you find most used and useful.
The application of these ideas to process improvement in the health care setting is obvious. It is very difficult to overcome cognitive biases in the delivery of clinical care, especially when the field remains such a cottage industry, one in which each person is expected to be an artist, relying on his or her creativity, intuition, and experience when taking care of a patient. The resulting lack of standardization -- the high degree of practice variation -- creates an environment that is inimical to process improvement based on scientific methods.
But I wonder if some of the same types of biases are spilling over into the business aspects of hospital finance, too. In Hallinan's words, "It may be too much to suggest that we let adolescents run Wall Street.... But it wouldn’t hurt to let them check the math." Maybe it is time for some "new, unblinking eyes" among business reporters.
5 comments:
I am reminded of my surprise and occasional outrage when I first started reading your blog and you, with your non-health care background, used your unblinking eye on us -- particularly with the transparency initiative, utter anathema to us 'insiders.'
So I am listening now, and couldn't agree more. Now that business people run health care, we have added, to the innate secrecy of medicine, the business bandwagon attitude of chasing the next "greatest thing since sliced bread" investment - even if the bread is stale with bologna between the slices.
nonlocal MD
What a great analogy. I am new to reading your posts and so far, so good.
God bless the new patients.
Ha ha, "the bread is stale and full of bologna" - excellent!
Seriously, though, I think you're onto something with the cognitive thing. It's another view of "you don't know what you don't know."
The reason I like the cognitive twist is that there are apparently layers of this - it's not a matter of being a sloppy thinker. I can't find it now but in some book I saw a great blind-spot test. Those tests usually have you focus on one dot, until another disappears. This was the opposite: it was a grid of lines, with a hole (not a dot) in the middle. When you focused on the other target, *the hole disappeared* - the brain filled in the lines that weren't there!
No amount of personality or overconfidence can explain that - it was happening at some post-cognition neurological processing level.
All the more reason for us to adopt a collaborate-not-blame culture, to help each other do better.
Dave, I didn't make that up about the bread and bologna; it was a quote from a private equity official as noted in Paul's referenced post:
" At a recent conference, one private equity official derisively talked about the inadequacies of local lay leaders eating their "stale bologna sandwiches" at Board of Trustees meetings, to draw a contrast with the unsentimental businesslike behavior of a board chosen by his firm."
The irony struck me of a reference to stale bread and bologna in a conversation about an equity bubble.
nonlocal
Cognitive bias incubates in an electronic healthcare environment...just because you CAN be more efficient...