Near Miss
An event or situation that did not produce patient injury, but only because of chance. This good fortune might reflect the robustness of the patient (e.g., a patient with a penicillin allergy receives penicillin but has no reaction) or a fortuitous, timely intervention (e.g., a nurse happens to realize that a physician wrote an order in the wrong chart). This definition is identical to that for close call.

Never Events
See Primer. The list of never events has expanded over time to include adverse events that are unambiguous, serious, and usually preventable. While most are rare, when never events occur, they are devastating to patients and indicate serious underlying organizational safety problems.

Normal Accident Theory
Though less often cited than high reliability theory in the health care literature, normal accident theory has played a prominent role in the study of complex organizations. In contrast to the optimism of high reliability theory, normal accident theory suggests that, at least in some settings, major accidents become inevitable and, thus, in a sense, "normal."

Perrow proposed two factors that create an environment in which a major accident becomes increasingly likely over time: complexity and tight coupling. The degree of complexity envisioned by Perrow occurs when no single operator can immediately foresee the consequences of a given action in the system. Tight coupling occurs when processes are intrinsically time-dependent: once a process has been set in motion, it must be completed within a certain period of time. Importantly, normal accident theory contends that accidents become inevitable in complex, tightly coupled systems regardless of steps taken to increase safety. In fact, these steps sometimes increase the risk for future accidents through unintended collateral effects and general increases in system complexity.

Even if one does not believe the central contention of normal accident theory–that the potential for catastrophe emerges as an intrinsic property of certain complex systems–analyses informed by this theory's perspective have offered some fascinating insights into possible failure modes for high-risk organizations, including hospitals.

Normalization of Deviance
The term normalization of deviance was coined by Diane Vaughan in her book The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, in which she analyzes the interactions between various cultural forces within NASA that contributed to the Challenger disaster. Vaughan used this expression to describe the gradual shift in what is regarded as normal after repeated exposures to "deviant behavior" (behavior straying from correct [or safe] operating procedure). Corners get cut, safety checks are bypassed, and alarms are ignored or turned off, and these behaviors become normal—not just common, but stripped of their significance as warnings of impending danger. In their discussion of a catastrophic error in health care, Chassin and Becher used the phrase "a culture of low expectations." When a system routinely produces errors (paperwork in the wrong chart, major miscommunications between different members of a given health care team, patients in the dark about important aspects of their care), providers in the system become inured to malfunction. In such a system, what should be regarded as a major warning of impending danger is dismissed as normal operating procedure.
