
Getting Into Patient Safety: A Personal Story

Jeffrey B. Cooper, PhD | August 1, 2006 

Perspective

My journey into patient safety began in 1972. It was born of serendipity enabled by the good fortune of extraordinary mentors, an environment that supported exploration and allowed for interdisciplinary teamwork, and my own intellectual curiosity. The opportunity to spend my formative years immersed in a real safety culture embedded safety in my soul. You should all be so lucky.

When I landed in the Anesthesia Bioengineering Unit at the Massachusetts General Hospital in 1972, I had no real sense of why I was there nor any vision of what I was going to do. I was lured there by someone who needed a special skill I had but who did nothing to set me on a growth path or secure my future. As that became obvious, I sought to distance myself from him and align myself with more collegial allies.

I was fortunate to have landed in the company of some very smart, generative people who helped rescue me from my situation. One of those was Ron Newbower, PhD, who became my long-time colleague, friend, and mentor, teaching me to think more critically with intense intellectual honesty, to write, and to take risks. Ward Reynolds (Renny) Maier, MD, was my medical guide, sharing his deep insights into how people do their work and why it's so easy for them to screw up when trapped in a system that is conspiring against them. Ed Trautman (now PhD) was an engineering genius who provided innovations and tools that were invaluable to our early work. At the helm of our department was Richard J. Kitz, MD. At first he was merely my chairman, but, over the years, he became a guiding spirit, a model of work ethic, commitment, and can-do sensibility, and in later years, a dear friend and guardian angel.

We stumbled into patient safety, a term that had yet to be coined. Carving a pumpkin at a Halloween party, I struck up a conversation with someone who invited me to speak at a NATO-sponsored conference on Human Factors in Healthcare (in 1974!). My lecture was entitled, "The Anesthesia Machine: An Accident Waiting to Happen." A listener, who worked at the American Institutes for Research, approached me. He suggested that we could use the critical incident technique to study the kinds of errors I had spoken about. (This technique involves collecting and analyzing stories of pivotal events, either positive or negative, in any field of work. It had been, and still is, widely used to learn about the attributes of jobs and, especially, about errors and system failures.[1]) In thinking this through, we realized that we needed to better understand the perspectives of frontline workers to make anesthesia significantly safer. So, we set out to interview anesthesiologists, residents, and nurse anesthetists, seeking to learn from them about their own mistakes or those they had observed. They told us about much more than equipment-related errors. In fact, most of the discussions were about all the other kinds of things that can go wrong. We followed the serendipity, collecting and analyzing the events and seeking to find patterns. A few seminal publications helped expose human error in anesthesia and its underlying causes as problems that needed to be fixed.[2-4]

That research was translated into action through the leadership of Ellison C. Pierce, Jr., known to all as "Jeep." Jeep was Chairman of the Anesthesia Department at what was then the Deaconess Hospital and, in 1983, President of the American Society of Anesthesiologists (ASA). His own life experiences, including the death of a friend's child as a result of an anesthetic error, motivated Jeep to seek solutions to the problem of avoidable anesthetic catastrophes. He and I met through his department's involvement in our research. As the early momentum to improve anesthesia safety grew, Jeep, Dick Kitz, and I organized an international meeting in 1984 to examine why these events happened and how to prevent them. Amid the controversies (e.g., whether electronic monitoring of the ECG or oxygen saturation was really necessary; many in the Australian anesthesia community thought not), Jeep conceived the idea of creating a foundation dedicated to patient safety. He had already created a Committee for Risk Management and Patient Safety in the ASA; as best I can tell, that was the first use of the term "patient safety." When we formed the foundation, we decided to call it exactly what it was to be: the Anesthesia Patient Safety Foundation (APSF).

Despite Jeep's leadership, the publication of data pointing to major problems in anesthesia safety, and a 1982 episode of ABC's 20/20 that dramatized deaths and severe brain damage from anesthesia catastrophes, APSF might not have been viable had it not been for something else: a malpractice crisis that was markedly reducing the incomes of anesthesiologists. Although many wanted to lobby for tort reform to limit the size of malpractice awards, Jeep, during his ASA presidency in 1983, called for action to prevent the events that were leading to the exorbitant awards in the first place. The argument, focusing on decreasing harm rather than merely decreasing payments, won the day. I learned an important lesson: enlightened, risk-taking leadership is the only way to achieve significant culture change in situations in which the survival of the individuals is not directly threatened (as in an ice age, a meteor, or perhaps global warming).

This focus on safety led to significant decreases in risks to patients. Some of the gains have come through changes in culture (for instance, assigning more value to safety concerns and paying greater attention to minor events that are harbingers of bigger problems); others through the thoughtful adoption of new procedures (for instance, checkout of equipment before procedures and protocols for handoffs when one anesthetist relieves another); and still others through the use of safer technologies and tools (such as pulse oximetry, capnometry, and fiberoptic bronchoscopy). The progress, far beyond that seen in other medical specialties (at least at that time), can be attributed to many factors: the availability of data, both qualitative and quantitative, that identified many of the problems; appropriate consideration of the human factors that lead to system failures (although we didn't call them that then); financial incentives to motivate change (largely in the form of high malpractice premiums); the availability of key technologies to enable better safety behaviors (particularly those mentioned above); the recruitment and education of an increasingly better trained workforce; the availability of safer and more controllable drugs; and the presence of leaders who were willing to take risks and who appreciated that safety had to be the core mission of their organizations.

We were fortunate in anesthesia to have all of these forces come together, leading to remarkable gains in safety. Safety in anesthesia is still not absolute, however, and high-risk patients still suffer far too many adverse events. Moreover, the safety gains we have achieved are under constant challenge from the pressures of production and the demands for efficiencies and cost reductions. When it comes to safety, the battle is never over.

Yet, those who have lived through the past few decades of anesthesia's evolution know that deaths and morbid outcomes attributable to clearly preventable anesthesia causes in healthy patients undergoing elective procedures are now much rarer. I see no reason why every medical specialty organization should not achieve the same results, either alone or in partnership with others.

As I think about my own contribution to this work, the desire to improve safety was incubated during my 3 years as an engineering co-op student at E.I. du Pont de Nemours (DuPont). Until recently, DuPont had the best record of worker safety of any company in the United States; only the relentless efforts toward safety at Alcoa have since surpassed it. That record resulted from design and persistent action. DuPont began as a company that made gunpowder, and the owners lived on the grounds of the first factory. An explosion was good for neither business nor their life expectancy.

Working in several different parts of a plant in Philadelphia, I was infected with an obsession about safety. Every Monday morning, we held a brief safety meeting, rotating responsibility for presenting a safety topic. We had a longer meeting every month. Those meetings, and the overall milieu of constant safety vigilance, are the reasons I can't walk by a file cabinet that someone has left open without feeling compelled to close it, or walk past a cord snaking across the floor without wishing I had gaffer tape to cover it. Perhaps most influential was the imposing billboard we all could not help but read on our way through the entrance gates. Its message was simple: it chronicled the number of days since anyone at our plant had lost a day's work to a work-related injury. For as long as I was there, it was a large number, often in the hundreds. None of us wanted to be part of seeing it reset to zero. (Perhaps some people came back to work earlier than they should have to keep the clock from resetting, an interesting example of the oft-seen perverse effects induced by cultural norms.)

To this day, I often ask audiences: why can't we have large signs in every health care facility lobby stating not just how many days it has been since a worker lost a day, but also how many days it has been since a patient was injured or killed by something we should have prevented? Crazy idea? Maybe. But think of the effect it would have on all of us, especially since the initial signs in large hospitals would probably tally hours, not days.

In the 30-plus years since I started down this path, I'm not sure how far we've come. We have many of the same problems we had a generation ago, and we've added some new ones: making changes without appreciating how work actually gets done, a system that grows more complicated as we try to solve more health care problems, politicians seeking quick solutions instead of showing the patience that complex issues demand, the inability to count what really matters, and a legal system that hampers honesty. Most important, human nature hasn't changed: we all make mistakes and have trouble admitting and learning from them, mostly because we're embarrassed, ashamed, and too busy every moment to fix our microsystems, and we're unwilling or unable to change our behaviors. We still blame decent, hardworking people too much, and we still don't involve patients enough in the solutions.

However, I see things changing. The ideas of systems thinking and the importance of human factors are now more generally appreciated. Some institutions and administrators now resist the instinct to blame first and ask questions later. I see leaders across the landscape. There is action. I have hope.

Jeffrey B. Cooper, PhD
Director, Biomedical Engineering, Partners HealthCare System, Inc.
Associate Professor of Anesthesia, Harvard Medical School
Department of Anesthesia and Critical Care, Massachusetts General Hospital
Director, Center for Medical Simulation, Cambridge, MA

References


1. Flanagan JC. The critical incident technique. Psychol Bull. 1954;51:327-358.

2. Cooper JB, Newbower RS, Long CD, McPeek B. Preventable anesthesia mishaps: a study of human factors. Anesthesiology. 1978;49:399-406.

3. Cooper JB, Long CD, Newbower RS, Philip JH. Critical incidents associated with intraoperative exchanges of anesthesia personnel. Anesthesiology. 1982;56:456-461.

4. Cooper JB, Newbower RS, Kitz RJ. An analysis of major errors and equipment failures in anesthesia management: considerations for prevention and detection. Anesthesiology. 1984;60:34-42.
