This piece by Lisa Zyga summarizes "Too good to be true: when overwhelming evidence fails to convince," a forthcoming article by Lachlan J. Gunn et al. in Proceedings of the Royal Society A. It offers some interesting thoughts and is a useful discussion, especially for those in leadership positions; see #5 below in particular.
Excerpts from the article:
Under ancient Jewish law, if a suspect on trial was unanimously found guilty by all judges, then the suspect was acquitted. This reasoning sounds counterintuitive, but the legislators of the time had noticed that unanimous agreement often indicates the presence of systemic error in the judicial process, even if the exact nature of the error is yet to be discovered. They intuitively reasoned that when something seems too good to be true, most likely a mistake was made.
The researchers demonstrated the paradox with a modern-day police line-up, in which witnesses try to identify the suspect from among several people. They showed that as the number of unanimously agreeing witnesses increases, the chance that they are correct decreases, until it is no better than a random guess.
In police line-ups, the systemic error may be any kind of bias, such as how the line-up is presented to the witnesses or a personal bias held by the witnesses themselves. Importantly, the researchers showed that even a tiny bit of bias can have a very large impact on the results overall.
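The effect is easy to reproduce with a small Bayesian toy model. The sketch below is only an illustration under assumed numbers (a ten-person line-up, witnesses who are individually 70% reliable, and a 1% chance that the procedure is systemically biased), not the paper's exact formulation:

```python
# Toy Bayesian model of the "paradox of unanimity" (illustrative only;
# the parameter values and setup are assumptions, not the paper's model).
#
# With probability 1 - eps the line-up procedure is unbiased: each of the
# n witnesses independently picks the true perpetrator with probability p,
# otherwise one of the other k - 1 members uniformly at random.
# With probability eps the procedure is systemically biased: every witness
# is steered to the same member, chosen uniformly, so unanimity is
# guaranteed but the choice is correct only 1/k of the time.

def p_correct_given_unanimous(n, p=0.7, eps=0.01, k=10):
    """P(the unanimously identified person is guilty | n witnesses agree)."""
    unbiased_correct = (1 - eps) * p ** n
    unbiased_wrong = (1 - eps) * (k - 1) * ((1 - p) / (k - 1)) ** n
    biased_correct = eps / k
    biased_wrong = eps * (k - 1) / k
    p_unanimous = unbiased_correct + unbiased_wrong + biased_correct + biased_wrong
    return (unbiased_correct + biased_correct) / p_unanimous

for n in (1, 3, 5, 10, 20, 50):
    print(f"{n:3d} unanimous witnesses -> P(correct) = {p_correct_given_unanimous(n):.3f}")
```

With these numbers, confidence rises over the first few witnesses and then falls back toward 1/k = 0.1, a random pick from the line-up, which is the qualitative behaviour described above: past a certain point, unanimity is better explained by bias than by honest agreement.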
The paradox of unanimity may be counterintuitive, but the researchers explain that it makes sense once we have complete information at our disposal.
"As with most 'paradoxes,' it is not that our intuition is necessarily bad, but that our intuition has been badly informed," Abbott said.
Other areas where the paradox of unanimity emerges are numerous and diverse.
1) The recent Volkswagen scandal is a good example. The company fraudulently programmed a computer chip to run the engine in a mode that minimized diesel fuel emissions during emission tests. But in reality, the emissions did not meet standards when the cars were running on the road. The low emissions were too consistent and 'too good to be true.' The emissions team that outed Volkswagen initially got suspicious when they found that emissions were almost at the same level whether a car was new or five years old! The consistency betrayed the systemic bias introduced by the nefarious computer chip.
2) A famous case where overwhelming evidence was 'too good to be true' occurred between 1993 and 2008. Police in Europe found the same female DNA at about 15 crime scenes across France, Germany, and Austria. The mysterious killer was dubbed the Phantom of Heilbronn, and the police never found her. The DNA evidence was consistent and overwhelming, yet it was wrong: it turned out to be a systemic error. The cotton swabs used to collect the DNA samples had been accidentally contaminated by the same woman working in the factory that made them.
3) When a government wins an election, one might lament that the party of one's choice won by only a relatively small margin, and wish that one's favored party could win unanimously. However, should that ever happen, we would be led to suspect a systemic bias caused by vote rigging.
4) In science, theory and experiment go hand in hand and must support each other. In every experiment there is always 'noise,' and we must therefore expect some error. If results are too clean and do not contain the expected noise and outliers, then we can be led to suspect a form of confirmation bias introduced by an experimenter who cherry-picks the data. (A small statistical sketch of this 'too clean' check follows this list.)
5) In many committee meetings in today's big organizations, there is a trend toward the idea that decisions must be unanimous. For example, a committee that ranks job applicants or evaluates key performance indicators (KPIs) will often argue until everyone in the room is in agreement. If one or two members disagree, there is a tendency for the rest of the committee to try to win them over before moving on. A take-home message of our analysis is that the dissenting voice should be welcomed. A wise committee should accept the difference of opinion and simply record that there was a disagreement. Recording the disagreement is not a negative but a positive: it demonstrates that a systemic bias is less likely.
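On point 4, the 'too clean' intuition can be made quantitative with an ordinary goodness-of-fit test read from the unusual side: instead of asking whether data deviate too much from theory (the right tail of the chi-squared distribution), ask whether they deviate too little (the left tail). The sketch below uses made-up die-roll counts purely for illustration; it shows the general statistical idea, not an analysis from the Gunn et al. paper.

```python
# Flagging count data that fit a theory "too well" (illustrative sketch;
# the counts below are invented for the example).
# Honest sampling scatters observations around the expected counts, so a
# chi-squared statistic near zero across many bins is itself improbable
# and hints at cherry-picking or fabrication.
from scipy.stats import chi2

def too_good_to_be_true(observed, expected, alpha=0.01):
    """Return (statistic, left-tail p-value, suspicious?) for count data."""
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    df = len(observed) - 1
    left_tail = chi2.cdf(stat, df)  # P(a statistic this small or smaller)
    return stat, left_tail, left_tail < alpha

# 600 reported rolls of a fair die landing almost exactly on 100 per
# face -- far less scatter than chance normally produces.
observed = [99, 101, 100, 100, 99, 101]
expected = [100] * 6
stat, p, suspicious = too_good_to_be_true(observed, expected)
print(f"chi2 = {stat:.3f}, left-tail p = {p:.1e}, suspicious = {suspicious}")
```

A tiny left-tail p-value says the reported data show less random scatter than honest sampling would produce, which is exactly the 'missing noise' red flag described above.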
3 comments:
Try telling that to homeowner associations. :)
Another prominent example is Harry Markopolos and his early identification of the Bernie Madoff Ponzi scheme. Here is a quote from Wikipedia's entry on Markopolos:
When Markopolos obtained a copy of Madoff's revenue stream, he spotted problems right away. To his mind, Madoff's strategy was so poorly designed that there was no way it could make money. The biggest red flag, however, was that the return stream rose steadily with only a few downticks, represented graphically by a nearly perfect 45-degree angle. According to Markopolos, anyone who understood the underlying math of the markets would have known they were too volatile even in the best conditions for this to be possible. As he later put it, a return stream like the one Madoff claimed to generate "simply doesn't exist in finance." He eventually concluded that there was no legal way for Madoff to deliver his purported returns using the strategies he claimed to use. As he saw it, there were only two ways to explain the figures: either Madoff was running a Ponzi scheme (by paying established clients with newer clients' money) or front running (buying stock for his own account, based on knowledge about his clients' orders).[14] Markopolos later said that he knew within five minutes that Madoff's numbers didn't add up. It took him another four hours to mathematically prove that they could have only been obtained by fraud.[15][16]
Despite this, Markopolos' bosses at Rampart asked Markopolos to deconstruct Madoff's strategy to see if he could replicate it. Again and again, he could not simulate Madoff's returns, using information he had gathered about Madoff's trades in stocks and options. For instance, he discovered that for Madoff's strategy to work, he would have had to buy more options on the Chicago Board Options Exchange than actually existed.[15] His calculations of Madoff's trades revealed that there was almost no correlation between Madoff's stocks and the S&P 100, as Madoff claimed. Markopolos also couldn't find any evidence the market was responding to any Madoff trades, even though by his estimate Madoff was managing as much as $6 billion, three times more than any known hedge fund even then. In Markopolos' mind, these factors suggested that Madoff wasn't even trading.[14]
The same holds for the stock market. The most famous Barron's headline came just before the 1974 stock market decline, in its January survey of market gurus: "Not A Bear Among Them!" It's called contrary opinion.