Techno Trip

Richard I. Cook, MD | March 1, 2005

The Case

A 70-year-old woman was admitted to a community hospital after developing confusion and right-sided weakness. A CT scan of her brain showed an acute subdural hematoma. The hospital arranged a transfer to a large referral center for urgent neurosurgical evaluation. The radiology department at the community hospital had recently implemented an electronic picture archiving and communication system (PACS). Instead of printed films, the patient was sent with a compact disk (CD) containing copies of the relevant studies. On arrival at the referral center, the right-sided hemiparesis was confirmed on physical exam. The accepting surgeon inserted the CD into a local computer. The CT image that appeared on the screen showed some brain atrophy, small old strokes, and a large left-sided subdural hygroma, but no acute hemorrhage. The surgeon concluded that the patient had suffered a stroke, admitted her to the stroke unit, and consulted neurology.

The next day, a consulting neurologist found a set of more recent images while scrolling through the PACS disk. These demonstrated the acute subdural hemorrhage for which the patient had been transferred. The subdural was urgently evacuated, and the patient improved after a prolonged period of rehabilitation.

The Commentary

This case illustrates a common safety problem (1): the loss of continuity of care during transfer. Here, information technology (IT) intended to bridge the gap in continuity actually contributed to the failure. Health care IT, such as electronic medical records, computerized provider order entry (CPOE), and bar-coded medication administration (BCMA), is intended to reduce the incidence of accidents affecting patients. But new technology is also a potent source of new forms of failure, especially when the human–computer interaction characteristics of the technology are not carefully managed.(2,3)

The IT "Gap" Uncovered

Digital radiology (DR) represents a growing area of health care IT because of the emergence of numerous digital-format imaging modalities (eg, ultrasound, CT, MRI, and PET). The economies of scale for electronic digital systems make it convenient to capture and process virtually every medical image digitally. The proliferation of these images can become a problem in itself, as practitioners struggle to manage the many different images generated for a single patient. In health care facilities, captured digital images are stored, transferred, and viewed across a network of computers called a "picture archiving and communication system," or PACS.(4)

The PACS stores the data needed to reconstruct digital images rather than fixed, printed pictures. To view a particular imaging study, the practitioner selects the patient and the specific image on a terminal connected to the PACS. The PACS retrieves the corresponding data and draws the image on the terminal's viewing screen. To make it possible for practitioners to view a patient's images on computers not connected to the PACS network (eg, at a different hospital), a CD or DVD containing the image data for an individual patient is created. The data on this disk are formatted according to a standard known as DICOM.(5,6) The purpose of this standard is to make data interchangeable between different systems, but not all PACS adhere to the standard, which is revised frequently to accommodate new technology. For this reason, it is common to include a small program, a "DICOM reader," (7) on the disk. This program allows almost any computer to generate images from the data without using a PACS. It may be especially useful when a patient is transferred to an outside hospital. The disk may be created so that the DICOM reader program starts automatically and displays the images found on the disk, often defaulting to the oldest images first.
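To make the mechanics concrete, the sketch below shows how a program might enumerate the studies on such a disk. It is a minimal illustration only, written with the modern open-source pydicom library for Python, and it assumes a hypothetical mount point (/media/cdrom); a real reader would typically consult the DICOMDIR index file that the standard defines rather than scanning every file.

```python
# Minimal sketch: enumerate the imaging studies on a DICOM disk.
# Assumes the open-source pydicom library and a hypothetical mount
# point; a production reader would normally parse the disk's DICOMDIR
# index instead of walking every file.
from pathlib import Path

import pydicom

studies = {}  # StudyInstanceUID -> (StudyDate, StudyTime, StudyDescription)
for path in Path("/media/cdrom").rglob("*"):
    if not path.is_file():
        continue
    try:
        # Read metadata only; skip the (large) pixel data.
        ds = pydicom.dcmread(path, stop_before_pixels=True)
    except Exception:
        continue  # not a DICOM file (eg, the bundled viewer program)
    uid = getattr(ds, "StudyInstanceUID", None)
    if uid:
        studies[uid] = (
            getattr(ds, "StudyDate", ""),
            getattr(ds, "StudyTime", ""),
            getattr(ds, "StudyDescription", ""),
        )

for uid, (date, time, desc) in sorted(studies.items(), key=lambda kv: kv[1]):
    print(f"{date} {time}  {desc}")
```

Note that nothing in the disk format forces a viewer to present these studies in any particular order; which study appears first is purely a design decision of the reader program.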

In this case, the accepting surgeon likely fell victim to the technology described above. The DICOM reader apparently displayed older and, in this instance, less relevant images. We do not know what sort of directory of images was available or whether the image date or time was displayed, but we can infer that the surgeon viewed only one set of images.

Technology as a Source of New Failures

Just a few years ago, prior to the development of digital radiology, this type of event was not possible. The patient would probably have arrived at the receiving hospital either with physical copies of the CT films showing an acute subdural hematoma (8) or with no films at all. If she had arrived without films, it is likely that a new head CT would have been obtained. In the current era, however, the same technology that makes high-quality information available is simultaneously an unexpected source of new forms of failure.

The source of this event is not human error on the part of the surgeon. Instead, the failure occurred because the IT design was technology-centered rather than human-centered.(9) Although the designers intended the IT to support users in their work, obtaining that support requires users to know a great deal about how the IT works. Some would claim that, because the IT performed as designed, it did not fail. This case demonstrates clearly, however, that failure can (and does!) arise from IT design.

As IT use increases and more access to information is channeled through IT, these problems will likely increase in number and severity. The new forms of failure that result from IT use in health care share common features (10): the failures are (i) less frequent but more consequential, (ii) more challenging to detect, and (iii) more difficult to defend against.

The reason for these qualitative differences in failures with IT can be traced to the nature of IT itself, and the way it centralizes the design and interaction features of the workplace. Manual systems and their filing and record-handling methods allowed high rates of relatively low-consequence failures. Handwritten records could be illegible and films misplaced. Repeated exposure to these problems led workers to devise relatively straightforward solutions, such as making extra copies, maintaining "shadow" charts, or redoing lost studies. These adaptations were inefficient, but they were usually easy to apply and not unduly burdensome. By preventing these sorts of failures, IT applications reduce the need for such workarounds. Indeed, preventing such failures is one reason that IT is now being deployed. But the deployment introduces new dependencies and opportunities for accidents.

Unlike the relatively common and easily understood failures that can occur in manual systems, these IT-based failures are infrequent but potentially severe. Defenses against these failures are more difficult to devise and less likely to be robust.(11) More importantly, the extended reach and power that IT provides means that the sources of failure, such as poor design, are far removed from the failures themselves in ways that encourage people to view IT-based failures as instances of human ("operator") error.(12,13)

The view that technology is relatively infallible leads failures at the human–machine interface to be attributed to human error (eg, "the surgeon should look more carefully") rather than to poor design.(14) Creating high-quality, human-centered technology is difficult and requires substantial time and effort.(15) Few health care IT designs have received the study and refinement that characterizes fields in which IT design has been quite successful, such as aircraft cockpit design. Significantly, design deficiencies identified by accidents such as this one tend not to lead directly to better designs but to patches intended to make up for the deficiencies. One can envision a software fix to change this particular DICOM reader so that it shows the most recent scan first, or so that a display of all available scans appears when the disk is inserted into a computer. It is more difficult to envision a mechanism to make this an industry-wide practice.
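Such a patch would be technically trivial. Continuing the earlier pydicom sketch, the hypothetical function below presents an index of every study on the disk, most recent first. None of this reflects any actual vendor's reader; it only illustrates the design change just described, and the study UIDs and descriptions in the example are invented.

```python
# Hypothetical fix: show an index of all studies, newest first,
# instead of silently opening the oldest one.
def index_newest_first(studies):
    """studies maps StudyInstanceUID -> (StudyDate, StudyTime, description)."""
    # DICOM dates and times are zero-padded digit strings (YYYYMMDD,
    # HHMMSS), so lexicographic order is chronological order.
    ordered = sorted(studies.items(), key=lambda kv: (kv[1][0], kv[1][1]),
                     reverse=True)
    for n, (uid, (date, time, desc)) in enumerate(ordered, start=1):
        flag = "  <- most recent" if n == 1 else ""
        print(f"{n}. {date} {time}  {desc}{flag}")
    return ordered  # a viewer would open ordered[0] by default


# Example with the kind of data the earlier sketch collects
# (all values here are invented for illustration):
index_newest_first({
    "1.2.840.999.1": ("20050110", "083000", "CT HEAD W/O CONTRAST"),
    "1.2.840.999.2": ("20050301", "214500", "CT HEAD W/O CONTRAST"),
})
```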

The Syndrome of IT Dependence

The details of digital radiology and IT are only one aspect of this interesting case. Equally important is the apparently casual way in which a critical diagnosis was discarded. One can imagine a brief telephone conversation between the neurosurgeon and the transferring provider. "I don't see a subdural here," says the neurosurgeon. The sending physician says, incredulously, "Are you sure you're looking at the right scan? It's huge and shows up on at least four slices." This would have been an opportunity to catch a failure in the making. But there was no dispute about the diagnosis implied by the image because there was no further communication with the sending physician about the image. Why?

We can only speculate because the details are unavailable, but the critical issue is that the neurosurgeon clearly believed that there had been an error in diagnosis at the transferring hospital. CT images, like all clinical data, are seen in context. What exactly was the context for this case? Was there a history of receiving transfer patients with wrong diagnoses? How are such transfers usually handled? Is it normal for the sending physician to speak directly with the neurosurgeon or are there intermediaries involved? This case report implies that transfer is the end of a process and that further contact between the sending and receiving physician is not expected. Is this really the case? How are disagreements about diagnosis handled? What sort of scrutiny did the images receive when the expected subdural hematoma did not appear? Did the neurosurgeon show the image to an ER colleague and receive confirmation that the pattern was indeed more consistent with a stroke?

As we begin to inquire into the context that surrounds transfer of patients, we discover that this case is far from simple. In the final analysis, the IT intended to bridge what threatened to become a gap in the continuity of care actually contributed to it. As for countermeasures, our attention is initially drawn to the details of the computer systems and their operation. But effective defenses against future events like this one may depend on making direct physician-to-physician contacts more reliable. Improving the ability of practitioners to communicate across institutional and organizational boundaries, increasing the quality of opportunities for consultation and clarification, and reducing the disincentives to such contacts—although more difficult to achieve—are likely to make the system itself more robust.

The Future of IT

What does this case imply about the future? The Federal government insists that adding IT to hospitals will make patients safer.(3) Surprisingly, only a few studies of IT-related failures exist in the medical literature.(16,17) In part, this is due to the difficulty of reporting sporadic or "isolated" events when human error is readily available as a target of blame. What is clear, however, is that IT is a double-edged sword. According to the United States Pharmacopeia (USP), computer entry is already a leading cause of medication errors!(18)

Virtually all health care facilities are in the midst of adding clinical IT, often at great cost. The eagerness with which some groups have embraced IT is worrisome, and the failure rate of these projects is high. Mandates from groups such as the Leapfrog organization (19) stress the advantages of IT but do not acknowledge that IT itself is generating new forms of failure. Cases like this one demonstrate that IT in health care requires more than simply adding "computerization" to make things faster, more efficient, and safer. While the potential of having instant, easy access to patients' medical records is very attractive, creating and maintaining user-centered automation poses a major challenge.

Take-Home Points

  • Adding IT does not eliminate failure but, instead, changes the type, frequency, and severity of failures, often in unpredictable and surprising ways.
  • Creating robust IT depends on designing health care computer systems to be user-centered rather than technology-centered.
  • Although failures involving IT are often regarded as human (operator) error, these failures actually arise from poor human-computer interaction.

Richard I. Cook, MD
Associate Professor, Department of Anesthesia and Critical Care
Director, Cognitive Technologies Laboratory
University of Chicago

References

1. Cook RI, Render M, Woods DD. Gaps in the continuity of care and progress on patient safety. BMJ. 2000;320:791-4.

2. Hawryluk M. Government pushes for electronic medical record standards: legislation would help fund system purchases and ensure that users could share data. American Medical News Web site. February 9, 2004. Accessed February 4, 2005.

3. Bush GW. Executive order: incentives for the use of health information technology and establishing the position of the National Health Information Technology Coordinator. The White House Web site. April 27, 2004. Accessed February 4, 2005.

4. Rogers LF. PACS: radiology in the digital world. AJR Am J Roentgenol. 2001;177:499.

5. DICOM. The DICOM standard. Accessed February 4, 2005.

6. Horii SC. DICOM: a nontechnical introduction to DICOM. Radiological Society of North America Web site. Accessed February 4, 2005.

7. DicomWorks [computer program]. Inviweb; c2000-2005. Accessed February 4, 2005.

8. Subdural hematoma. Medical encyclopedia. MedlinePlus Web site. Accessed February 4, 2005.

9. Billings CE. Issues concerning human-centered intelligent systems: what's "human-centered" and what's the problem? Accessed February 4, 2005.

10. Cook RI, Woods DD. Adapting to new technology in the operating room. Hum Factors. 1996;38:593-613.

11. Cook RI. Observations on RISKS and Risks. Communications of the ACM. 1997;40:22. Accessed February 4, 2005.

12. Piccard D. Book review: Normal Accidents by Charles Perrow. Accessed February 4, 2005.

13. Greenfield MA. Normal accident theory: the changing face of NASA and aerospace. November 17, 1998. Accessed February 4, 2005.

14. Skelly A. Pain pump study says patients at death risk. The Medical Post Web site. April 8, 2003. Accessed February 4, 2005.

15. Billings CE. Aviation automation: the search for a human-centered approach. Mahwah, NJ: Lawrence Erlbaum; 1996.

16. Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc. 2004;11:104-12.

17. Patterson ES, Cook RI, Render ML. Improving patient safety by identifying side effects from introducing bar coding in medication administration. J Am Med Inform Assoc. 2002;9:540-53.

18. United States Pharmacopeia. Computer entry a leading cause of medication errors in U.S. health systems. December 20, 2004. Accessed February 4, 2005.

19. The Leapfrog Group. Computer physician order entry fact sheet. Accessed February 4, 2005.
