See Primer. The process when one health care professional updates another on the status of one or more patients for the purpose of taking over their care. Typical examples involve a physician who has been on call overnight telling an incoming physician about patients she has admitted so he can continue with their ongoing management, know what immediate issues to watch out for, and so on. Nurses similarly conduct a handover at the end of their shift, updating their colleagues about the status of the patients under their care and tasks that need to be performed. When the outgoing nurses return for their next duty period, they will in turn receive new updates during the change of shift handover.
Handovers in care have always carried risks: a professional who spent hours assessing and managing a patient, upon completion of her work, provides a brief summary of the salient features of the case to an incoming professional who typically has other unfamiliar patients he must get to know. The summary may leave out key details through oversight, a risk exacerbated by an unstructured process and the pressure to finish work. Even structured, fairly thorough summaries during handovers may fail to capture nuances that could subsequently prove relevant.
In addition to handovers between professionals working in the same clinical unit, shorter lengths of stay in hospitals and other occupancy pressures have increased transitions between settings, with patients more often moving from one ward to another or from one institution to another (e.g., from an acute care hospital to a rehabilitation facility or skilled nursing facility). Due to the increasing recognition of hazards associated with these transitions in care, the term "handovers" is often used to refer to the information transfer that occurs from one clinical setting to another (e.g., from hospital to nursing home), not just from one professional to another.
See Primer. Although long accepted by clinicians as an inevitable hazard of hospitalization, recent efforts demonstrate that relatively simple measures can prevent the majority of health care–associated infections. As a result, hospitals are under intense pressure to reduce the burden of these infections.
Individuals' ability to find, process, and comprehend the basic health information necessary to act on medical instructions and make decisions about their health. Numerous studies have documented the degree to which patients do not understand basic information or instructions related to general aspects of their medical care, their medications, and procedures they will undergo. The limited ability to comprehend medical instructions or information in some cases reflects obvious language barriers (e.g., reviewing medication instructions in English with a patient who speaks very little English), but the scope of the problem reflects broader issues related to levels of education, cross-cultural issues, and overuse of technical terminology by clinicians.
Loosely defined or informal rules, often arrived at through experience or trial and error, used to make assessments and decisions (e.g., gastrointestinal complaints that wake patients up at night are unlikely to be benign in nature). Heuristics provide cognitive shortcuts in the face of complex situations, and thus serve an important purpose. Unfortunately, they can also turn out to be wrong, and frequently used heuristics often form the basis for the many cognitive biases, such as anchoring bias, availability bias, and confirmation bias, that have received attention in the literature on diagnostic errors and medical decision making.
See Primer. High reliability organizations (HROs) are organizations or systems that operate in hazardous conditions but have fewer than their fair share of adverse events. Commonly discussed examples include air traffic control systems, nuclear power plants, and naval aircraft carriers. It is worth noting that, in the patient safety literature, HROs are considered to operate with nearly failure-free performance records, not simply better-than-average ones. This shift in meaning is somewhat understandable given that failure rates in these other industries are so much lower than rates of errors and adverse events in health care, though the comparison glosses over the difference in significance of a "failure" in the nuclear power industry compared with one in health care. The point remains, however, that some organizations achieve consistently safe and effective performance records despite unpredictable operating environments or intrinsically hazardous endeavors. Detailed case studies of specific HROs have identified some common features, which have been offered as models for other organizations to achieve substantial improvements in their safety records. These features include:
Preoccupation with failure—the acknowledgment of the high-risk, error-prone nature of an organization's activities and the determination to achieve consistently safe operations.
Commitment to resilience—the development of capacities to detect unexpected threats and contain them before they cause harm, or bounce back when they do.
Sensitivity to operations—an attentiveness to the issues facing workers at the frontline. This feature comes into play when conducting analyses of specific events (e.g., frontline workers play a crucial role in root cause analyses by bringing up unrecognized latent threats in current operating procedures), but also in connection with organizational decision-making, which is somewhat decentralized. Management units at the frontline are given some autonomy in identifying and responding to threats, rather than adopting a rigid top-down approach.
A culture of safety, in which individuals feel comfortable drawing attention to potential hazards or actual failures without fear of censure from management.
In a very general sense, hindsight bias relates to the common expression "hindsight is 20/20." This expression captures the tendency for people to regard past events as expected or obvious, even when, in real time, the events perplexed those involved. More formally, one might say that after learning the outcome of a series of events—whether the outcome of the World Series or the steps leading to a war—people tend to exaggerate the extent to which they had foreseen the likelihood of its occurrence.
In the context of safety analysis, hindsight bias refers to the tendency to judge the events leading up to an accident as errors because the bad outcome is known. The more severe the outcome, the more likely that decisions leading up to this outcome will be judged as errors. Judging the antecedent decisions as errors implies that the outcome was preventable. In legal circles, one might use the phrase "but for," as in "but for these errors in judgment, this terrible outcome would not have occurred." Such judgments return us to the concept of "hindsight is 20/20." Those reviewing events after the fact see the outcome as more foreseeable and therefore more preventable than they would have appreciated in real time.
The Health Insurance Portability and Accountability Act of 1996 (HIPAA)
established federal regulations intended to increase the privacy and security of
patient information during electronic transmission or communication of
"protected health information" (PHI) among providers or between providers and
payers or other entities.
"Protected health information" (PHI) includes all medical records and other
individually identifiable health information. "Individually identifiable
information" includes data that are explicitly linked to a patient as well as
health information containing data items with a reasonable potential for
allowing individual identification.
HIPAA also requires providers to offer patients certain rights with respect to
their information, including the right to access and copy their records and the
right to request amendments to the information contained in their records.
Administrative protections specified by HIPAA to support the above regulations
and rights include requirements for the designation of a Privacy Officer and
for staff training regarding the protection of patients’ information.