James Reason, Who Used Swiss Cheese to Explain Human Error, Dies at 86


The story of how James Reason became an authority on the psychology of human error begins with a teapot.

It was the early 1970s. He was a professor at the University of Leicester, in England, studying motion sickness, a process that involved spinning his subjects round and round, and occasionally revealing what they had eaten for breakfast.

One afternoon, as he was boiling water in his kitchen to make tea, his cat, a brown Burmese named Rusky, sauntered in meowing for food. “I opened a tin of cat food,” he later recalled, “dug in a spoon and dolloped a large spoonful of cat food into the teapot.”

After swearing at Rusky, Professor Reason berated himself: How could he have done something so stupid?

The question seemed more intellectually engaging than making people dizzy, so he ditched motion sickness to study why humans make mistakes, particularly in high-risk settings.

By analyzing hundreds of accidents in aviation, railway travel, medicine and nuclear power, Professor Reason concluded that human errors were usually the byproduct of circumstances — in his case, the cat food was stored near the tea leaves, and the cat had walked in just as he was boiling water — rather than being caused by careless or malicious behavior.

That was how he arrived at his Swiss cheese model of failure, a metaphor for analyzing and preventing accidents that envisions situations in which multiple vulnerabilities in safety measures — the holes in the cheese — align to create a recipe for tragedy.

“Some scholars play a critical role in founding a whole field of study: Sigmund Freud, in psychology. Noam Chomsky, in linguistics. Albert Einstein, in modern physics,” Robert L. Sumwalt, the former chairman of the National Transportation Safety Board, wrote in a 2018 blog post. “In the field of safety, Dr. James Reason has played such a role.”

Professor Reason died on Feb. 5 in Slough, a town about 20 miles west of London. He was 86.

His death, in a hospital, was caused by pneumonia, his family said.

A gifted storyteller, Professor Reason found vivid and witty ways to explain complicated ideas. At conferences, on TV news programs and in consultation with government safety officials around the world, he would sometimes deploy slices of cheese as props.

In one instructional video, he sat at his dining room table, which was set for a romantic dinner, with a bottle of wine, two glasses and a cutting board layered with cheese.

“In an ideal world, each defense would look like this,” he said, holding up a slice of cheese without holes. “It would be solid and intact.”

Then he reached for another slice, one with quarter-size cutouts. “But in reality, each defense is like this,” he said. “It has holes in it.”

The metaphor was easy to understand.

“All defenses have holes in them,” Professor Reason continued. “Every now and again, the holes line up so that there can be some trajectory of accident opportunity.”

To explain how the holes develop, he put them in two categories: active failures, or mistakes typically made by people who, for example, grab the cat food instead of the tea leaves; and latent conditions, or mistakes made in construction, written instructions or system design, like storing two scoopable substances near each other in a cabinet.

“Nearly all organizational accidents involve a complex interaction between these two sets of factors,” he wrote in his autobiography, “A Life in Error: From Little Slips to Big Disasters” (2013).

In the Chernobyl nuclear accident, he identified latent conditions that had been in existence for years: a poorly designed reactor; organizational mismanagement; and inadequate training procedures and supervision for frontline operators, who triggered the catastrophic explosion by making the error of turning off several safety systems at once.

“Rather than being the main instigators of an accident, operators tend to be the inheritors of system defects,” he wrote in “Human Error” (1990). “Their part is that of adding the final garnish to a lethal brew whose ingredients have already been long in the cooking.”

Professor Reason’s model has been widely used in health care.

“When I was in medical school, an error meant you screwed up, and you should just try harder to not screw up,” Robert Wachter, the chairman of the department of medicine at the University of California, San Francisco, said in an interview. “And if it was really bad, you would probably get sued.”

In 1998, a doctor he had recently hired for a fellowship said he wanted to specialize in patient-safety strategy, to which Dr. Wachter replied, “What’s that?” There were no formal systems or methods in his hospital (or most others) to analyze and prevent errors, but there was plenty of blame to go around, most of it aimed at doctors and nurses.

This particular doctor had trained at Harvard Medical School, where Professor Reason’s ideas were being incorporated into patient-safety programs. Dr. Wachter, who began reading Professor Reason’s journal articles and books, said the Swiss cheese model was “an epiphany,” almost “like putting on a new pair of glasses.”

Someone given the wrong dose of medicine, he realized, could have been the victim of poor syringe design rather than a careless nurse. Another patient could have died of cardiac arrest because a defibrillator that was usually stored in the hallway had been taken to a different floor to replace one that had malfunctioned — and there was no system to alert anyone that it had been moved.

“When an error happens, our instinct can’t be to look at this at the final stage,” Dr. Wachter said, “but to look at the entirety of the system.”

When you do, he added, you realize that “these layers of protection are pretty porous in ways that you just didn’t understand until we opened our eyes to all of it.”

James Tootle was born on May 1, 1938, in Garston, a village in Hertfordshire, northwest of London. His father, Stanley Tootle, died in 1940, during World War II, when he was struck by shrapnel while playing cards in the bay window of his house. His mother, Hilda (Reason) Tootle, died when he was a teenager.

His grandfather, Thomas Augustus Reason, raised James, who took his surname.

In 1962, he graduated from the University of Manchester with a degree in psychology. He received his doctorate in 1967 from the University of Leicester, where he taught and conducted research before joining the faculty at the University of Manchester in 1977.

He married Rea Jaari, a professor of psychology, in 1964. She survives him, along with their daughters, Paula Reason and Helen Moss, and three grandchildren.

Throughout his career, Professor Reason’s surname was a reliable source of levity.

“The word ‘reason’ is, of course, widely used in the English language, but it does not describe what Jim is rightly famous for, namely ‘error,’” Erik Hollnagel, the founding editor of the International Journal of Cognition, Technology and Work, wrote in the preface to Professor Reason’s autobiography. “Indeed, ‘error’ is almost the opposite of ‘reason.’”

Still, it made sense.

“Jim has certainly brought reason to the study of error,” he wrote.


