Glossary

S

Safety Culture
High-reliability organizations consistently minimize adverse events despite carrying out intrinsically hazardous work. Such organizations establish a culture of safety by maintaining a commitment to safety at all levels, from frontline providers to managers and executives.

Sensemaking
A term from organizational theory that refers to the processes by which an organization takes in information to make sense of its environment, to generate knowledge, and to make decisions. It is the organizational equivalent of what individuals do when they process information, interpret events in their environments, and make decisions based on these activities. More technically, organizational sensemaking constructs the shared meanings that define the organization's purpose and frame the perception of problems or opportunities that the organization needs to work on.

Sentinel Event
An adverse event in which death or serious harm to a patient has occurred; the term is usually reserved for events that are wholly unexpected and unacceptable, such as an operation on the wrong patient or body part. The word sentinel reflects the egregiousness of the injury (e.g., amputation of the wrong leg) and the likelihood that investigation of such an event will reveal serious problems in current policies or procedures.

Sharp End
The sharp end refers to the personnel or parts of the health care system in direct contact with patients. Personnel operating at the sharp end may literally be holding a scalpel (e.g., an orthopedist who operates on the wrong leg) or figuratively be administering any kind of therapy (e.g., a nurse programming an intravenous pump) or performing any aspect of care. To complete the metaphor, the blunt end refers to the many layers of the health care system that affect the scalpels, pills, and medical devices, or the personnel wielding, administering, and operating them. Thus, an error in programming an intravenous pump would represent a problem at the sharp end, while the institution's decision to use multiple types of infusion pumps (making programming errors more likely) would represent a problem at the blunt end. The terminology of "sharp" and "blunt" ends corresponds roughly to active failures and latent conditions.

Signouts and Signovers
The term "signout" refers to the transmission of information about a patient at a transition in care, such as a change of shift or service. Handoffs and signouts have been linked to adverse clinical events in settings ranging from the emergency department to the intensive care unit.
Situational Awareness
Situational awareness refers to the degree to which one's perception of a situation matches reality. In the context of crisis management, where the phrase is most often used, situational awareness includes awareness of fatigue and stress among team members (including oneself), environmental threats to safety, appropriate immediate goals, and the evolving status of the crisis (or patient). Failure to maintain situational awareness can produce problems that compound the crisis. For instance, during a resuscitation, an individual or an entire team may fixate on a particular task (a difficult central line insertion or a particular medication to administer, for example). Fixation on this problem can erode situational awareness to the point that steps are not taken to address immediately life-threatening problems such as respiratory failure or a pulseless rhythm. In this context, maintaining situational awareness might be seen as equivalent to keeping the big picture in mind. Alternatively, in assigning tasks during a crisis, a leader may ignore signals from a team member, resulting in escalating anxiety for that team member, failure to perform the assigned task, or further patient deterioration.

Six Sigma
Six sigma refers loosely to striving for near perfection in the performance of a process or production of a product. The name derives from the Greek letter sigma, often used to denote the standard deviation of a normal distribution. About 95% of a normally distributed population falls within 2 standard deviations of the mean (or "2 sigma"), leaving roughly 5% of observations as "abnormal" or "unacceptable." Six Sigma targets a defect rate of 3.4 per million opportunities. (Strictly, a 6-sigma tail of the normal distribution would permit only about 2 defects per billion; the familiar 3.4-per-million figure follows Six Sigma convention in allowing for a 1.5-sigma drift in the process mean.)
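
These figures can be reproduced directly from the normal distribution. The short Python sketch below is illustrative only; it uses scipy, and the 1.5-sigma shift in the last step is the conventional Six Sigma allowance for process drift described above.

```python
from scipy.stats import norm

# Share of a normal distribution within 2 standard deviations of the mean:
# about 95.45%, the loose "95%" quoted above.
print(f"within 2 sigma:      {norm.cdf(2) - norm.cdf(-2):.4%}")

# A literal two-sided 6-sigma tail is vanishingly small (~2 per billion).
print(f"outside 6 sigma:     {2 * norm.sf(6):.2e}")

# The familiar Six Sigma figure of 3.4 defects per million corresponds to a
# one-sided 4.5-sigma tail: 6 sigma minus the assumed 1.5-sigma drift.
print(f"defects per million: {norm.sf(6 - 1.5) * 1e6:.1f}")  # ~3.4
```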

When it comes to industrial performance, having 5% of a product fall outside the desired specifications would represent an unacceptably high defect rate. What company could stay in business if 5% of its product did not perform well? For example, would we tolerate a pharmaceutical company that produced pills containing incorrect dosages 5% of the time? Certainly not. But when it comes to clinical performance (the proportion of patients who receive a proven medication, or who develop complications from a procedure), we routinely accept failure or defect rates in the 2% to 5% range, orders of magnitude short of Six Sigma performance.

Not every process in health care requires such near-perfect performance. In fact, one of the lessons of Reason's Swiss cheese model is the extent to which low overall error rates are possible even when individual components have many "holes." However, many high-stakes processes are far less forgiving, since a single "defect" can lead to catastrophe (e.g., wrong-site surgery, accidental administration of concentrated potassium).

Slips (or Lapses)
Errors can be dichotomized as slips or mistakes, based on the cognitive psychology of task-oriented behavior. Slips refer to failures of schematic behaviors, or lapses in concentration (e.g., overlooking a step in a routine task due to a lapse in memory, an experienced surgeon nicking an adjacent organ during an operation due to a momentary lapse in concentration).

Slips occur in the face of competing sensory or emotional distractions, fatigue, and stress. Reducing the risk of slips requires attention to the design of protocols, devices, and work environments: using checklists so key steps will not be omitted, reducing fatigue among personnel (or shifting high-risk work away from personnel who have been working extended hours), removing unnecessary variation in the design of key devices, eliminating distractions (e.g., phones) from areas where work requires intense concentration, and other redesign strategies. Slips can be contrasted with mistakes, which are failures that occur during attentional behavior such as active problem solving.

Standard of Care
What the average, prudent clinician would be expected to do under given circumstances. The standard of care may vary by community (e.g., due to resource constraints). When the term is used in the clinical setting, the standard of care is generally felt not to vary by specialty or level of training. In other words, the standard of care for a condition may well be defined in terms of the standard expected of a specialist, in which case a generalist (or trainee) would be expected to deliver the same care or make a timely referral to the appropriate specialist (or supervisor, in the case of a trainee). Standard of care is also a term of art in malpractice law, and its definition varies from jurisdiction to jurisdiction. When used in this legal sense, the standard of care is often specific to a given specialty and is typically defined as the care expected of a reasonable practitioner with similar training practicing in the same location under the same circumstances.

Structure-Process-Outcome Triad
Most definitions of quality emphasize favorable patient outcomes as the gold standard for assessing quality. In practice, however, one would like to detect quality problems without waiting for poor outcomes to accumulate in numbers sufficient for deviations from expected rates of morbidity and mortality to be detected. Donabedian first proposed that quality could be measured using aspects of care with proven relationships to desirable patient outcomes. For instance, if proven diagnostic and therapeutic strategies are monitored, quality problems can be detected long before demonstrably poor outcomes occur.

Aspects of care with proven connections to patient outcomes fall into two general categories: process and structure. Processes encompass all that is done to patients in terms of diagnosis, treatment, monitoring, and counseling. Cardiovascular care provides classic examples of the use of process measures to assess quality. Given the known benefits of aspirin and beta-blockers for patients with myocardial infarction, the quality of care for patients with myocardial infarction can be measured in terms of the rates at which eligible patients receive these proven therapies. The percentage of eligible women who undergo mammography at appropriate intervals would provide a process-based measure for quality of preventive care for women.
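
To make the numerator and denominator of a process measure concrete, here is a minimal sketch in Python. The MICase record and its field names are hypothetical, invented for this example; the key point is that the denominator is restricted to eligible patients, so the measure reflects care delivered rather than case mix.

```python
from dataclasses import dataclass

@dataclass
class MICase:
    eligible_for_aspirin: bool  # e.g., no contraindication such as allergy
    received_aspirin: bool

def aspirin_process_measure(cases):
    """Share of eligible myocardial infarction patients given aspirin."""
    eligible = [c for c in cases if c.eligible_for_aspirin]
    if not eligible:
        return None  # measure is undefined with no eligible patients
    return sum(c.received_aspirin for c in eligible) / len(eligible)

cases = [
    MICase(True, True),
    MICase(True, False),   # eligible but missed: a quality gap
    MICase(False, False),  # contraindicated: excluded from the denominator
    MICase(True, True),
]
print(aspirin_process_measure(cases))  # 2 of 3 eligible -> 0.666...
```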

Structure refers to the setting in which care occurs and the capacity of that setting to produce quality. Traditional examples of structural measures related to quality include credentials, patient volume, and academic affiliation. More recent structural measures include the adoption of organizational models for inpatient care (e.g., closed intensive care units and dedicated stroke units) and possibly the presence of sophisticated clinical information systems. Cardiovascular care provides another classic example of structural measures of quality. Numerous studies have shown that institutions that perform more cardiac surgeries and invasive cardiology procedures achieve better outcomes than institutions that see fewer patients. Given these data, patient volume represents a structural measure of quality of care for patients undergoing cardiac procedures.

Swiss Cheese Model
Reason developed the "Swiss cheese model" to illustrate how analyses of major accidents and catastrophic systems failures tend to reveal multiple, smaller failures leading up to the actual hazard.

In the model, each slice of cheese represents a safety barrier or precaution relevant to a particular hazard. For example, if the hazard were wrong-site surgery, slices of the cheese might include conventions for identifying sidedness on radiology tests, a protocol for signing the correct site when the surgeon and patient first meet, and a second protocol for reviewing the medical record and checking the previously marked site in the operating room. Many more layers exist. The point is that no single barrier is foolproof; each has "holes," hence the Swiss cheese. For some serious events (e.g., operating on the wrong site or wrong person), the holes align only infrequently, but even rare cases of harm (errors making it "through the cheese") are unacceptable.

While the model may convey the impression that the slices of cheese and the locations of their respective holes are independent, this may not be the case. In an emergency, for instance, all three of the surgical identification safety checks mentioned above may fail or be bypassed: the surgeon may meet the patient for the first time in the operating room; a hurried x-ray technologist might mislabel a film (or simply hang it backwards, with a hurried surgeon failing to notice); and "signing the site" may not take place at all (e.g., if the patient is unconscious) or, if it does, may be rushed and offer no real protection. In the technical parlance of accident analysis, the different barriers may share a common failure mode, in which several protections are lost at once (i.e., several layers of the cheese line up).

In health care, such failure modes, in which slices of the cheese line up more often than one would expect if the locations of their holes were independent (and certainly more often than wings fly off airplanes), occur distressingly often. In fact, many of the systems problems discussed by Reason and others, such as poorly designed work schedules, lack of teamwork, and variation in the design of important equipment between and even within institutions, are sufficiently common that many of the slices of cheese already have their holes aligned. In such cases, one slice of cheese may be all that stands between the patient and significant hazard.
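
The effect of a common failure mode can be made concrete with a small simulation. The sketch below is purely illustrative: the three 5% barrier failure rates and the 1% common-mode probability are invented numbers, chosen only to show how even a small correlated failure can swamp the harm rate predicted when barriers are assumed independent.

```python
import random

def harm_reaches_patient(failure_probs, common_mode_prob=0.0):
    """One hazard: does it slip through every barrier (slice of cheese)?

    Each barrier fails independently with its own probability. With
    probability common_mode_prob, a shared stressor (say, an emergency
    case) defeats every barrier at once: a common failure mode.
    """
    if random.random() < common_mode_prob:
        return True  # the holes in all slices line up simultaneously
    return all(random.random() < p for p in failure_probs)

def estimate_harm_rate(trials=200_000, **kwargs):
    barriers = [0.05, 0.05, 0.05]  # three safety checks, each 5% porous
    hits = sum(harm_reaches_patient(barriers, **kwargs) for _ in range(trials))
    return hits / trials

# Independent holes: roughly 0.05 ** 3 = 1.25 harms per 10,000 hazards.
print(f"independent barriers: {estimate_harm_rate():.5f}")
# A 1% common failure mode raises the harm rate by nearly two orders
# of magnitude, dwarfing the contribution of the independent holes.
print(f"with common mode:     {estimate_harm_rate(common_mode_prob=0.01):.5f}")
```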

Systems Approach
Medicine has traditionally treated quality problems and errors as failings on the part of individual providers, perhaps reflecting inadequate knowledge or skill. The systems approach, by contrast, takes the view that most errors reflect predictable human failings in the context of poorly designed systems (e.g., expected lapses in vigilance in the face of long work hours, or predictable mistakes by relatively inexperienced personnel facing cognitively complex situations). Rather than focusing corrective efforts on reprimanding individuals or pursuing remedial education, the systems approach seeks to identify the situations or factors likely to give rise to human error and to implement systems changes that reduce their occurrence or minimize their impact on patients. This view holds that efforts to catch human errors before they occur, or to block them from causing harm, will ultimately be more fruitful than efforts to somehow create flawless providers.

This systems focus includes paying attention to human factors engineering (or ergonomics), including the design of protocols, schedules, and other factors that are routinely addressed in other high-risk industries but have traditionally been ignored in medicine.
