One day, a student of his was practicing a piece by Brahms when [Boris] Goldovsky heard something wrong. He stopped her and told her to fix her mistake. The student looked confused; she said she had played the notes as they were written. Goldovsky looked at the music and, to his surprise, the girl had indeed played the printed notes correctly — but there was an apparent misprint in the music.

At first, the student and the teacher thought this misprint was confined to their edition of the sheet music alone. But further checking revealed that all other editions contained the same incorrect note. Why, wondered Goldovsky, had no one — the composer, the publisher, the proofreader, scores of accomplished pianists — noticed the error? How could so many experts have missed something that was so obvious to a novice?

Goldovsky’s experiment yielded a key insight into human error: not only had the experts misread the music — they had misread it in the same way. In a subsequent study, Goldovsky’s nephew, Thomas Wolf, discovered that good sight readers report that they do not read music note by note; instead, they rely on their recognition of familiar patterns and on their ability to organize the music into those patterns and dependable cues.

. . . In short, they don’t read; they infer. Moreover, this trait is not unique to musicians: pattern recognition is a hallmark of expertise in any number of fields; it is what allows experts to do quickly what amateurs do slowly.

Goldovsky’s insight offers a useful metaphor for understanding the crisis on Wall Street: Not only did hedge-fund managers, bankers and others misread the danger involved in many of their investments, but they misread them in the same way.

The author's conclusion: "These types of errors are most likely to be discovered by those who, like Goldovsky’s young student, look at the world with new, unblinking eyes."
In an earlier post, I discussed forms of cognitive bias that have been documented by psychologists and neurologists. Also, you may recall this presentation by Pat Croskerry about intuitive decision-making, and the "cognitive miser function," a tendency to get comfortable with the form of decision-making that you find most used and useful.
The application of these thoughts to process improvement in the health care setting is obvious. It is very difficult to overcome cognitive biases in the delivery of clinical care, especially when the field remains such a cottage industry, in which each person is expected to be an artist, relying on his or her creativity, intuition, and experience when taking care of a patient. The resulting lack of standardization -- the high degree of practice variation -- creates an environment that is inimical to process improvement based on scientific methods.
But I wonder if some of the same types of biases are spilling over into the business aspects of hospital finance, too. In Hallinan's words, "It may be too much to suggest that we let adolescents run Wall Street.... But it wouldn’t hurt to let them check the math." Maybe it is time for some "new, unblinking eyes" among business reporters.