
The Safety of Medical Devices

Christopher Nemeth, PhD | June 1, 2011 

Perspective

Edward Tenner is right. Technology does have reverberations, including unintended consequences, or "revenge effects."(1) While such drawbacks are inherent in technology, our poor understanding of technology in health care is a much larger problem. We don't know enough about products or systems that people use to know what can happen. Compared with ignorance, the revenge effects that are inherent in technology are minor.

When we talk about the safety of medical devices, "safe" implies an expectation that a device will keep us free from harm. How safe are medical devices? Eleven years after the Institute of Medicine report To Err is Human (2), there is ample evidence to indicate that issues with device safety remain substantial and widespread. Here are just a few examples.

  • Device recalls are the most valid current measure of device failure. I reviewed 1,573 medical device recalls issued during January 2006–May 2008 for the U.S. Food and Drug Administration (USFDA) Center for Devices and Radiological Health (CDRH). Of these, 810 (51.5%) had human factors (3) at issue and failed in one or more aspects of reliability, efficiency, or safety. For example, a software malfunction prevented a defibrillator from delivering shocks when needed. The back of a patient chair on a tomography system bent and broke off, allowing a patient to fall. A ventricular assist device (VAD) permitted implantation of the wrong size nut, causing a poor connection with the inflow cannula that resulted in a patient death (Nemeth C, unpublished data).
  • Even when the threat from a medical device is known, its solution can remain elusive. In April 2006, members of a surgical team placed an anesthetized patient onto a modular table during preparations for a spinal procedure. While the team adjusted the table, it swung loose and the patient fell to the floor, sustaining no injury. The surgical department was aware of the table's flaws, but was willing to trade those safety issues against the features that made the unit desirable enough to keep in service. Warranty and USFDA approval concerns precluded the hospital from modifying the equipment. The hospital developed a report of the event for the manufacturer. It also produced a brief improvement plan that included training surgical care team members and hanging a warning sign from the lever on the side of the table's head end control housing. The sign was not used, and the table, with its inherent safety problems, remained in use.(4)
  • A recent article on the danger presented by interchangeable intravenous and feeding tubes (5) generated a series of simple, presumptive solutions (6) pointing in directions that could almost be predicted: from clinicians (label the tubes; don't replace trained practitioners with others who are less qualified), to a government agency (we're on it), to a trade organization (have the government mandate bar codes).

While notions about solutions are simple, the problem and its context are complex. In all of the above examples, the "device" does not stand alone; it is part of a larger system that influences its use and its effects on patient care and outcomes. The Figure illustrates how an infusion "device," the most widely used information technology (IT) in health care, is actually an interdependent network of relationships. It is a socio-technical system that spans all who develop, supply, and use it, from the level of the care provider or manufacturer organization, to associations and regulators, to government. Safety therefore requires a different approach: at the level of the system, not the device.

Complex systems have properties of operation and failure that require study.(7) Health care is a complex sector, with complex phenomena that make it more difficult to generate reliable evidence.(8) Health care continues to resist study for a number of reasons. Departments are separated, and teams resist scrutiny. Organizations are pressured to maximize revenue by operating at, or near, saturation. Devices increase in complexity, and new versions replace them frequently; even if we did understand today's version of hardware or software, tomorrow's version would soon replace it. The training that care providers receive about devices is protracted; it shows what to do when things go right and omits what to do when they go wrong. Reporting systems abound but lack the data collection standards and robust evaluation processes that might make them useful. Too few clinicians question the effect a new technology may have on the patient. Too few manufacturers invest in understanding human performance and how to accommodate operator needs and abilities. Oversight by government agencies is impeded because they must respect the limits of what they are authorized to do, such as how far they may go in overseeing devices that are already on the market.

The safety of medical devices relies on well-informed attention and action at all of these levels. Being "well-informed" starts with having good data—a genuine understanding of the real world. As of now, we don't know what can go wrong when devices are in use because there are no data on how the elements in each system interact, and how that affects human performance. A 5-year study of infusion devices showed that junior and experienced clinicians alike routinely got "lost in menuspace" when operating these devices.(9) If the design of a single device such as an infusion pump is a challenge, consider interconnected software including computerized physician order entry (CPOE), electronic medical records (EMR), bar code medication administration (BCMA), and electronic medication administration record (EMAR). These systems are far more complex than a lone pump. They also blur boundaries within and across the organization that were previously clear-cut, and they can hide and propagate far more threats than a single infusion device.

These shortcomings can be avoided with good design, based on good research. Other industries such as aviation do this successfully, but it takes a significant investment to get the benefit. Moving forward starts with systems thinking at all levels: care providers, companies, regulators/associations, and government. That requires an understanding of how complex systems work (7), and how to authentically account for human performance.(10) It also requires what Sterman (11) described as understanding how all models are wrong, a humility about the limitations of our knowledge, and a commitment to the rigorous and disciplined use of scientific inquiry skills.

Without data on human cognitive performance in health care and how technology affects it, we have no information on what the real world of health care is like, and any efforts to improve it are just a collective guess.

Christopher Nemeth, PhD, CHFP
Principal Scientist
Group Leader, Cognitive Systems Engineering
Cognitive Solutions Division of Applied Research Associates, Inc.

Acknowledgement: The author is grateful to Robert Wears, MD; Shawna Perry, MD; Richard Cook, MD; and Jay Crowley for their insightful comments during the development of this essay.

References

 

1. Tenner E. Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York: Random House; 1997. ISBN: 9780679747567.

2. Kohn L, Corrigan J, Donaldson M, eds. To Err is Human: Building a Safer Health System. Washington, DC: National Academies Press; 2000. ISBN: 9780309068376.

3. Nemeth C. Human Factors Methods for Design: Making Systems Human-Centered. Boca Raton, FL: CRC Press; 2004. ISBN: 9780415297981. [Available at]

4. Nemeth C, Dierks M, Patterson E, et al. Learning from Investigation. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, San Francisco. October 2006. [Available at]

5. Harris G. U.S. inaction lets look-alike tubes kill patients. New York Times. August 21, 2010:A1.

6. Letters: How to reduce tube mix-ups in hospitals. New York Times. August 25, 2010:A26.

7. Cook RI. How complex systems fail. In Allspaw J, Robbins J, eds. Web Operations: Keeping the Data On Time. Sebastopol, CA: O'Reilly Media; 2010. ISBN: 9781449377441. [Available at]

8. Sterman JD. Learning from evidence in a complex world. Am J Public Health. 2006;96:505-514. [go to PubMed]

9. Nunnally M, Nemeth C, Brunetti V, et al. Lost in menuspace: user interactions with complex medical devices. In Nemeth C, Cook R, Woods D, eds. Special issue on studies in healthcare technical work. IEEE Transactions on Systems, Man and Cybernetics—Part A. 2004;34:736-742. [Available at]

10. Pew R, Mavor A, eds. Human-System Integration in the System Development Process: A New Look. Committee on Human-System Design Support for Changing Technology. Washington, DC: National Academies Press; 2007. ISBN: 9780309107204. [Available at]

11. Sterman JD. All models are wrong: reflections on becoming a systems scientist. Syst Dyn Rev. 2002;18:501-531. [Available at]

12. Nemeth C, Cook R. The infusion device as a source of resilience. In Nemeth C, Hollnagel E, Dekker S, eds. Resilience Engineering Perspectives, Volume 2: Preparation and Restoration. Farnham, UK: Ashgate Publishing; 2009. ISBN: 9780754675204.

 

Figure

Infusion Device as a Socio-Technical System.(12)

 


Copyright © 2008 Cognitive Technologies Laboratory. Reprinted with permission.
