In December of 2005, there was a breach of the dam holding back the 50-acre Taum Sauk reservoir in Missouri. The breach created what was called a mini-tsunami, with 17-foot waves.
The proximate cause of the problem was that a simple water-level gauge failed, and so the power company overfilled the reservoir. There were redundant systems in place to measure the water level, but they had been bypassed, leaving a single point of failure.
The state public service commission investigated the accident and found that the utility's "decision to continue operating Taum Sauk after the discovery of the failure of the gauge piping anchoring system and the consequent unreliability of the piezometers upon which [its] control system was based is frankly beyond imprudent – it is reckless."
The design of safety systems for infrastructure is a science, as it is in hospitals. One thing is common to both: if you rely on a single point of control to avoid disaster, you are likely to fail. Sometimes catastrophically.
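To make the single-point-of-control idea concrete, here is a minimal sketch. It is not drawn from the incident report; the shutoff elevation, tolerance, and readings are made-up values. It contrasts a lone gauge with three redundant gauges whose readings are cross-checked before pumping is allowed to continue.

```python
# Hypothetical values for illustration only.
STOP_PUMPING_LEVEL_FT = 1596.0   # assumed shutoff elevation
MAX_DISAGREEMENT_FT = 0.5        # assumed allowable spread between sensors

def single_gauge_ok_to_pump(level_ft):
    # One stuck gauge can silently allow overfilling -- there is no cross-check.
    return level_ft < STOP_PUMPING_LEVEL_FT

def redundant_gauges_ok_to_pump(levels_ft):
    # Three independent readings: decide on the median, and alarm if the
    # sensors disagree, so one failed gauge cannot mask an overfill.
    readings = sorted(levels_ft)
    median = readings[len(readings) // 2]
    if max(readings) - min(readings) > MAX_DISAGREEMENT_FT:
        raise RuntimeError("Sensor disagreement -- stop pumping and inspect")
    return median < STOP_PUMPING_LEVEL_FT

# A stuck gauge reads low while the true level is already above the limit:
print(single_gauge_ok_to_pump(1590.0))  # True: pumping wrongly continues

try:
    redundant_gauges_ok_to_pump([1590.0, 1597.1, 1597.3])
except RuntimeError as e:
    print(e)  # the cross-check that was bypassed is exactly what catches this
```

The point of the sketch is not the particular numbers but the architecture: when the redundant readings and the disagreement alarm are bypassed, the system quietly degrades back to the single-gauge version.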
If I am reading that right, it wasn't so much a design failure as human "error" in bypassing the redundancies? This is the more difficult dilemma. Perhaps future designs should somehow incorporate forcing mechanisms which prevent dumb humans from going around them. I don't really know the answer, but it happens repeatedly.
nonlocal
Paul, I agree that this is an issue and something that needs to be addressed to ensure its outcome is not recreated elsewhere. However, I have to ask: how is this any different from what we see in modern health care? Whether it be due to experience, (new) research, emotion, etc., we override existing protocols and exceed control limits on a daily basis. Though not always the case, these “special cause” situations often result in negative outcomes (much like what happened at Taum Sauk).
Exactly! I'm so pleased you noted the parallel.