Cases & Commentaries

Feeling No Pain

Commentary By Marilyn Sue Bogner, PhD

The Case

A 33-year-old female underwent hysterectomy for
refractory endometriosis. For post-operative pain, the patient
was placed on a Patient-Controlled Analgesia (PCA) pump containing
morphine sulfate. Three hours after her transfer to a gynecology
floor, the patient began complaining of severe frontal headache,
nausea, and vomiting. Nurses attributed these symptoms to
post-operative pain and the effects of the morphine. The PCA pump
was continued despite her ongoing complaints of headache and
relatively mild incisional discomfort. The patient became
progressively drowsier and her respiratory status declined.
Fortunately, the pulse oximeter alarm sounded when her oxygen
saturation fell, and the PCA pump was discontinued. At the time of
the discontinuation, the patient's O2 saturation level was 76%. At
that point, clinicians realized that she was actually suffering
from an adverse reaction to morphine. Further investigation of the
patient's medical history revealed similar complaints with
prior administration of meperidine hydrochloride (Demerol), but
these had not been noted in her chart, elicited on pre-operative
history, or flagged as an adverse reaction or
“allergy.”

The Commentary

What does this case tell us about creating a safe
environment for the control of post-operative pain? The nurses
initially attributed the patient’s problems to effects of the
morphine and did nothing about them until the pulse oximeter
signaled that the patient was in distress. Similarly, the person
responsible for taking the history might be faulted for not probing
sufficiently to uncover that the patient previously had trouble
with Demerol. Those problems would most likely be treated as errors
of omission by the individual nurses, rather than as an occasion for
a case like this to challenge the underlying system of pain
management.

If this incident were attributed simply to errors
of omission, what would it tell us about medical error? What
information does it provide about why the incident occurred that
could be used to prevent similar incidents from occurring? The
answer to each of those questions is, “Nothing.” What
it does indicate is that the cause of an incident usually is
attributed to the individual associated with it, generally the care
provider.(1) In
this case, the patient’s use of the PCA pump is another
implied cause, which will be discussed later.

Although errors have been described by a variety
of terms,(2,3)
error needs to be understood as an act or behavior. As a behavior,
error is subject to the empirical findings and theory of
psychology. This understanding is critical to the study of error
and patient safety because a basic tenet of psychology is that
behavior reflects the interaction between the person and the
environment.(4)
Thus, identifying the individual care provider as the sole source
of error is both incomplete and misleading. Trying to tackle the
all-encompassing environment to determine factors that contribute
to error, however, can present a daunting, if not impossible,
task.

The systems approach (5-7) dissects the environment into manageable segments
and serves as a tool for identifying factors that contribute to
error. In this approach, factors (evidence-based when possible) are
clustered into categories that meet the criteria for a system [a
complex of interacting factors(8)]
and affect the behavior of the task performer (here the care
provider). The eight categories are: (i) ambient conditions, (ii)
physical environment, (iii) social environment, (iv) organizational
factors, and (v) the overarching system of legal, regulatory,
national-culture, and reimbursement factors (these first five relate
to the environment or context of care), together with the basic
care-providing system comprising (vi) the care provider, (vii)
the means of providing care, and (viii) the patient. The context of
care categories are represented as concentric circles with the
care-providing systems at the center.(9-11)
In an effort to promote understanding among providers for whom
systems-based thinking represents a paradigm shift from their
individual-oriented training and socialization, the systems
approach is likened to an artichoke, thus adding another
error-related food model to the smorgasbord of Swiss cheese
(2) and
onion.(12)
The leaves of the systems-approach artichoke (Figure) are the
context of care components, while the basic care-providing systems
are the artichoke’s heart.(13)
Because the application of this model forces us to consider each
category for error-provoking factors, it is a powerful tool for
incident analysis.
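
To make these categories concrete for incident analysis, the
eight-part model can be rendered as a simple checklist structure.
The sketch below, in Python, is purely illustrative and not part of
the systems-approach literature; the factor entries are hypothetical
readings of this case. Its point, like the artichoke's, is that no
category may be skipped:

    from enum import Enum

    class SystemCategory(Enum):
        """The eight systems-approach categories, (i)-(viii)."""
        AMBIENT_CONDITIONS = 1      # context of care (the leaves)
        PHYSICAL_ENVIRONMENT = 2
        SOCIAL_ENVIRONMENT = 3
        ORGANIZATIONAL_FACTORS = 4
        OVERARCHING_SYSTEM = 5      # legal, regulatory, culture, reimbursement
        CARE_PROVIDER = 6           # basic care-providing systems (the heart)
        MEANS_OF_CARE = 7
        PATIENT = 8

    def cluster_findings(findings):
        """Group contributing factors by category; an empty category
        is flagged so the analyst is forced to consider it."""
        clustered = {cat: [] for cat in SystemCategory}
        for category, factor in findings:
            clustered[category].append(factor)
        for category, factors in clustered.items():
            if not factors:
                factors.append("none identified -- re-examine")
        return clustered

    # Hypothetical factor entries drawn from this case:
    findings = [
        (SystemCategory.MEANS_OF_CARE,
         "attention fixed on the PCA pump rather than the patient"),
        (SystemCategory.CARE_PROVIDER, "possible fatigue from double shifts"),
        (SystemCategory.ORGANIZATIONAL_FACTORS, "demanding work schedule"),
        (SystemCategory.OVERARCHING_SYSTEM,
         "fiscal constraints created by reimbursement policies"),
        (SystemCategory.PATIENT, "prior Demerol reaction not flagged"),
    ]

    for category, factors in cluster_findings(findings).items():
        print(f"{category.name}: {'; '.join(factors)}")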

Consideration of the basic care-providing systems
in this case provides useful insights. The nurse’s focus
appears to have been on the means of providing care, the PCA pump,
and on ensuring that the patient was using this piece of technology
appropriately. Had that focus been on the patient—with the PCA
pump considered merely as a means to an end—she would have
been more likely to heed what the patient was saying and to be
alert to the adverse drug
reaction. The pervasive attitude that technology is the
solution to all problems can compromise patient safety because it
focuses attention and reliance on the means of providing
care, rather than on the patient. That tends to minimize the vital
focus on listening to the patient and following up on her
complaints, and to overemphasize the role of the machines. The
availability of a PCA pump does not mean it should be used in all
pain management situations; a PCA pump may not be appropriate for
every post-operative patient, as this case illustrates. Mindless
use of PCA pumps in every case, driven in part by the
“technological imperative,”(14) is
dangerous, particularly for patients whose cognitive or physical
abilities are impaired by illness, medication, or anxiety. However,
PCA pumps can be highly beneficial, particularly in an alert,
motivated patient.

In examining this case, we also need to consider
why the nurse did not act promptly to address the patient’s
problems. The nurse might report that fatigue was a factor, and
investigation might reveal that the workload was heavy or that the
nurse had been assigned double shifts. Why did that occur? Most analyses of
error determine that the hospital, by assigning the work schedule,
was responsible and stop the analysis at this point (the
organizational level). The systems approach drives us to address
overarching societal level factors. Why did the hospital assign
such a demanding work schedule? The answer identifies the actual
perpetrator of the incident: the fiscal constraints created by
reimbursement policies.

To be effective across conditions, efforts to
reduce the likelihood of an incident such as this one should be
directed to the source of the problem, reimbursement. Although
reimbursement policies and other factors in the overarching system
are the most difficult to change, instances like this should be
documented and data assembled and provided to those who might
facilitate change. In that way, the error-inducing impact of
reimbursement policies can be made known to those affected by
error, such as companies that insure hospitals and care providers,
professional organizations that have means of influencing policy
makers, and the general public who can influence policy makers by
their votes. Because such changes are slow to occur and care
continues in the meantime, the knowledge that an incident’s source
may lie in fiscal constraints, and not exclusively in the care
provider, is an incentive to devise ways to reduce other identified
error-provoking conditions.

Efforts can be undertaken to reduce excessive
reliance on technology (the PCA pump and the pulse oximeter in this
case) and increase the exercise of clinical skills such as
listening to the patient. The importance of responding to what the
patient says could be emphasized by training. Such education might
include role-playing, in which staff members “become” a
patient whose complaints are ignored, and/or a program that
identifies and publicly acknowledges staff members who demonstrate
effective clinical skills by listening to patients and responding
to their concerns.

Thus, the application of the systems approach
enables us to understand that the sources of the error in this case
are multidimensional, involving both a prevailing reverence for
technology over listening to the patient and the impact of
reimbursement policies. Because this analysis was limited to the
material in the case report, it is less comprehensive than it would
be in a real-life situation. Even so, we see that the tool yields
significant insights by identifying error-provoking factors that
would otherwise remain unknown. Addressing such factors can
enhance patient safety. The orientation toward error needs
to shift from reacting to the clinical aspects of a case (those
that are the most immediately compelling but tend to distract from
the real issues) to preventing similar situations in the
future.

The message of the artichoke–systems
approach is that an act by an individual care provider, which is
judged to be an error, reflects factors in the context of care that
contribute to, if not provoke, the act. The context of care is
analogous to the script in a play in which the care provider is an
actor. Actors who fit the role can be changed and yet the outcome
of their performance will be the same, because actors respond to
the script. To change the outcome of their performance, the script
must be changed. Thus, when analyzing an incident, rather than
focusing solely on the individual care provider-actor, the
systems-approach artichoke should be peeled to identify
error-inducing factors in the context of care script. Enhancing
patient safety by effectively reducing the likelihood of error
necessitates that actions be taken; the factors identified as
contributing to error are the focus for such actions.

Marilyn
Sue Bogner, PhD
President and Chief Scientist
Institute for the Study of Human Error, LLC
Bethesda, Maryland

References

1. Bogner MS. Understanding human error. In:
Bogner MS, ed. Misadventures in health care: inside stories.
Mahwah, NJ: Lawrence Erlbaum Associates; 2003.

2. Reason JT. Human error. New York: Cambridge
University Press; 1990.

3. Rasmussen J. Human errors: a taxonomy for
describing human malfunction in industrial installations. Journal
of Occupational Accidents. 1982;4:311-333.

4. Lewin K. Principles of topological psychology.
New York: McGraw-Hill; 1936/1966.

5. Bogner MS. Human error in medicine: a frontier
for change. In: Bogner MS, ed. Human error in medicine. Mahwah,
NJ: Lawrence Erlbaum Associates; 1994:373-383.

6. Bogner MS. Error: it’s what, not who.
Trauma Care. 1998;8:82-84.

7. Bogner MS. A systems approach to medical
error. In: Vincent C, De Mol B, eds. Safety in medicine.
Amsterdam: Pergamon; 2000:83-101.

8. von Bertalanffy L. General systems theory. New
York: George Braziller; 1968.

9. Bogner MS. Stretching the search for the
“why” of error: the systems approach. Journal of
Clinical Engineering. 2002;27:110-115.

10. Bogner MS. Identifying error provoking
factors: the systems approach. Keynote address at: Conference
on minimizing the risk of medical errors: focus on the system;
November 2002; Toronto, Canada.

11. Bogner MS. The systems approach analysis of
error. Plenary address at: Workshop on the investigation and
reporting of incidents and accidents; July 2002; Glasgow,
Scotland.

12. Moray N. Error reduction as a systems
problem. In: Bogner MS, ed. Human error in medicine. Mahwah, NJ:
Lawrence Erlbaum Associates; 1994:67-91.

13. Bogner MS. Understanding human error. In:
Bogner MS, ed. Misadventures in health care: inside stories.
Mahwah, NJ: Lawrence Erlbaum Associates; 2003.

14. Fuchs VR. Who shall live? Health, economics,
and social choice. New York: Basic Books; 1983.

Figure

Figure. The Systems Approach: Artichoke
Model


The leaves of the systems approach artichoke are
the constituent systems of the context of care, and the basic
care-providing systems are the heart of the artichoke.