The human lapses that occurred after the computerized ordering system and pill-dispensing robots did their jobs perfectly well are a textbook case of English psychologist James Reason’s “Swiss cheese model” of error. Reason’s model holds that all complex organizations harbor many “latent errors,” unsafe conditions that are, in essence, mistakes waiting to happen. They’re like a forest carpeted with dry underbrush, just waiting for a match or a lightning strike.

Still, legions of errors occur every day in complex organizations without leading to major accidents. Why? Reason found that these organizations have built-in protections that block glitches from causing nuclear meltdowns, or plane crashes, or train derailments. Unfortunately, all these protective layers have holes, which he likened to the holes in slices of Swiss cheese.

On most days, errors are caught in time, much as you remember to grab your house keys right before you lock yourself out. Those errors that evade the first layer of protection are caught by the second. Or the third. When a terrible “organizational accident” occurs — say, a space shuttle crash or a September 11–like intelligence breakdown — post hoc analysis virtually always reveals that the root cause was the failure of multiple layers, a grim yet perfect alignment of the holes in the metaphorical slices of Swiss cheese. Reason’s model reminds us that most errors are caused by good, competent people who are trying to do the right thing, and that bolstering the system — shrinking the holes in the Swiss cheese or adding overlapping layers — is generally far more productive than trying to purge the system of human error, an impossibility.

That’s Dr. Bob Wachter, writing in Backchannel about the errors that led a young patient to receive a massive overdose of antibiotics at one of the nation’s best hospitals. The excerpt above is from the third installment of “The Overdose,” a multi-part series examining the nature of error in tech-driven medicine.

Read the story