Human Factors Engineering

September 7, 2019

Background

An obstetric nurse connects a bag of pain medication intended for an epidural catheter to the mother's intravenous (IV) line, resulting in a fatal cardiac arrest. Newborns in a neonatal intensive care unit are given full-dose heparin instead of low-dose flushes, leading to three deaths from intracranial bleeding. An elderly man experiences cardiac arrest while hospitalized, but when the code blue team arrives, they are unable to administer a potentially life-saving shock because the defibrillator pads and the defibrillator itself cannot be physically connected.

Busy health care workers rely on equipment to carry out life-saving interventions, with the underlying assumption that technology will improve outcomes. But as these examples illustrate, the interaction between workers, the equipment, and their environment can actually increase the risk of disastrous errors. Each of these safety hazards ultimately was attributed to a relatively simple, yet overlooked problem with equipment design. The bag of epidural anesthetic was similar in size and shape to IV medication bags, and, crucially, the same catheter could access both types of bags. Full-dose and prophylactic-dose heparin vials appear virtually identical, and both concentrations are routinely stocked in automated dispensers at the point of care. Multiple brands of defibrillators exist that differ in physical appearance as well as functionality; a typical hospital may have many different models scattered around the building, sometimes even on the same unit.

Human factors engineering is the discipline that attempts to identify and address these issues. It takes into account human strengths and limitations in the design of interactive systems that involve people, tools and technology, and work environments, with the goal of ensuring safety, effectiveness, and ease of use. A human factors engineer examines a particular activity in terms of its component tasks and then assesses the physical demands, skill demands, mental workload, team dynamics, aspects of the work environment (e.g., adequate lighting, limited noise, or other distractions), and device design required to complete the task optimally. In essence, human factors engineering focuses on how systems work in actual practice, with real—and fallible—human beings at the controls, and attempts to design systems that optimize safety and minimize the risk of error in complex environments.

Human factors engineering has long been used to improve safety in many industries outside of health care—it has been employed to analyze errors in aviation, automobiles, and the Three Mile Island nuclear power plant accident. Its application to health care is relatively recent; pioneering studies of human factors in anesthesia were integral to the redesign of anesthesia equipment, significantly reducing the risk of injury or death in the operating room.

Applications of Human Factors Engineering to Improving Safety

The very nature of human factors engineering precludes "one size fits all" solutions, but several tools and techniques are commonly used as human factors approaches to addressing safety issues.

Usability testing—Human factors engineers test new systems and equipment under real-world conditions as much as possible, in order to identify potential problems and unintended consequences of new technology. One prominent example of the clinical applicability of usability testing involves electronic medical records and computerized provider order entry (CPOE). A recent book discussed a serious medication overdose that occurred in part due to confusing displays in the institution's CPOE system—a vivid example of how failing to use human factors engineering principles in user interface design can potentially harm patients. Simulated clinical scenarios may also be used for usability testing, as in a study demonstrating that commercial CPOE systems generally did not detect potentially unsafe orders.
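The logic of such a study can be sketched in a few lines of code: run simulated orders, including a deliberate overdose, through a stand-in safety check and report which ones are caught. The sketch below is purely illustrative; the function names, dose limits, and scenarios are invented for this example and do not describe any real CPOE product.

from dataclasses import dataclass

@dataclass
class Order:
    drug: str
    dose_mg: float
    patient_weight_kg: float

# Illustrative per-kilogram ceilings; a real system would consult a curated drug database.
MAX_DOSE_MG_PER_KG = {"heparin": 100.0, "morphine": 0.2}

def flags_order(order: Order) -> bool:
    """Return True if the stand-in safety check flags the order as unsafe."""
    limit = MAX_DOSE_MG_PER_KG.get(order.drug)
    if limit is None:
        return False  # unknown drug passes silently: exactly the kind of gap such testing exposes
    return order.dose_mg > limit * order.patient_weight_kg

# Simulated scenarios: one gross overdose, one routine order.
scenarios = [
    Order("heparin", 50000.0, 3.0),  # far beyond any safe dose for a 3 kg neonate
    Order("morphine", 0.5, 70.0),    # within the illustrative limit
]

for order in scenarios:
    verdict = "flagged" if flags_order(order) else "NOT flagged"
    print(f"{order.drug}, {order.dose_mg} mg, {order.patient_weight_kg} kg patient: {verdict}")

Even this toy harness makes a design gap visible: orders for drugs missing from the limits table pass without any warning, which is precisely the kind of silent failure that usability testing under realistic scenarios is meant to surface.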

Usability testing is also essential for identifying workarounds—the consistent bypassing of policies or safety procedures by frontline workers. Workarounds frequently arise because of flawed or poorly designed systems that actually increase the time necessary for workers to complete a task. As a result, frontline personnel work around the system in order to get work done efficiently. In the obstetric example above, the hospital had implemented a bar-code system designed to prevent medication administration errors. However, the system did not reliably scan IV bags. Nurses therefore developed a workaround for urgent situations, whereby they would administer the IV medication without scanning the bar code, and only later manually document its administration. This workaround was deemed to be a substantial contributor to the ultimately fatal error.

Forcing functions—An aspect of a design that prevents an unintended or undesirable action from being performed or allows its performance only if another specific action is performed first. For example, automobiles are now designed so that the driver cannot shift into reverse without first putting his or her foot on the brake pedal. Forcing functions need not involve device design. One of the first forcing functions identified in health care was the removal of concentrated potassium from general hospital wards. This action helps prevent the inadvertent addition of concentrated potassium to intravenous solutions prepared by nurses on the wards, an error that has produced small but consistent numbers of deaths for many years.
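The same idea carries over to software design: make the prerequisite action the only way to obtain the object that the guarded action requires. The sketch below is a hypothetical illustration using invented names, not a description of any real clinical system.

class VerifiedDose:
    """Proof that a dose passed barcode verification; only verify_scan creates one."""
    def __init__(self, drug: str, patient_id: str):
        self.drug = drug
        self.patient_id = patient_id

def verify_scan(scanned_drug: str, ordered_drug: str, patient_id: str) -> VerifiedDose:
    """The prerequisite action: refuses to issue proof unless the scan matches the order."""
    if scanned_drug != ordered_drug:
        raise ValueError(f"Scan mismatch: scanned {scanned_drug!r}, ordered {ordered_drug!r}")
    return VerifiedDose(ordered_drug, patient_id)

def administer(dose: VerifiedDose) -> None:
    """The guarded action: accepts only a VerifiedDose, so the scan cannot be skipped."""
    print(f"Administering {dose.drug} to patient {dose.patient_id}")

# Correct sequence: verification first, then administration.
dose = verify_scan("heparin 10 units/mL", "heparin 10 units/mL", "MRN-0001")
administer(dose)

Passing anything other than a VerifiedDose to administer is rejected by a static type checker and fails at run time, the software analogue of connectors that physically cannot be joined incorrectly.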

Standardization—An axiom of human factors engineering is that equipment and processes should be standardized whenever possible, in order to increase reliability, improve information flow, and minimize cross-training needs. Standardizing equipment across clinical settings (as in the defibrillator example above) is one basic example, but standardized processes are increasingly being implemented as safety measures. The widening use of checklists as a means of ensuring that safety steps are performed in the correct order has its roots in human factors engineering principles.

Resiliency efforts—Because unexpected events are likely to occur, attention must be paid to detecting and mitigating them before they worsen. Rather than focus on error and design efforts to preclude it, resiliency approaches tap into the dynamic aspects of risk management, exploring how organizations anticipate and adapt to changing conditions and recover from system anomalies. Building on insights from high-reliability organizations, complex adaptive systems, and resourceful providers at the point of care, resilience is viewed as a critical system property, reflecting the organization's capacity to bounce back in the face of continuing pressures and challenges when the margins of safety have become thin.

Despite the above examples, it is generally agreed that human factors principles are underutilized in the examination of safety problems and in the design of potential solutions. The ever-lengthening list of unintended consequences of CPOE can, in part, be viewed as a failure to appropriately design such systems with human factors in mind.

This project was funded under contract number 75Q80119C00004 from the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services. The authors are solely responsible for this report’s contents, findings, and conclusions, which do not necessarily represent the views of AHRQ. Readers should not interpret any statement in this report as an official position of AHRQ or of the U.S. Department of Health and Human Services. None of the authors has any affiliation or financial involvement that conflicts with the material presented in this report.