SPOTLIGHT CASE

Which Line: Ordering Provider or Proceduralist?

C. Craig Blackmore, MD, MPH | March 1, 2019

Case Objectives

  • Review the role of mistake-proofing to block errors from leading to adverse events.
  • Discuss different forms of mistake-proofing: inspection, self-check, successive check, and check with control.
  • Explain the use of checklists and huddles in preventing adverse events.
  • Understand the use of clinical decision support to prevent incorrect ordering of imaging procedures.

The Case

A 58-year-old woman with multiple myeloma required placement of a central venous catheter for apheresis, a blood-filtering procedure to lower the level of abnormal proteins in her blood.

In general, two types of central venous catheters may be used for apheresis: a tunneled central venous catheter or a nontunneled central venous catheter. (Tunneled catheters enter the skin and then pass through a short tunnel just beneath the skin before entering a large central vein. A nontunneled catheter goes directly through the skin into a large central vein [e.g., internal jugular and subclavian catheters].) Placing a tunneled catheter is a more specialized and higher-risk procedure; however, tunneled catheters are associated with fewer infections than nontunneled catheters and can be used for longer periods of time.

The outpatient hematology–oncology provider ordered the procedure via computerized provider order entry. The oncologist intended to order a nontunneled catheter, as a tunneled catheter was not necessary for this indication (although it would also work). But she accidentally ordered a tunneled central catheter to be placed by the interventional radiologist.

Although the interventional radiologist reviewed the order and thought it somewhat unusual that a tunneled catheter had been ordered for apheresis, she did not contact the oncologist. The patient was consented for the procedure, and a tunneled catheter was placed without complications.

When the patient presented for apheresis treatment, providers recognized that the wrong catheter had been placed. The oncologist and interventional radiologist discussed the case and decided it would be safest and most appropriate to remove the tunneled catheter and replace it with a nontunneled catheter. The error was disclosed to the patient, the tunneled catheter was removed, and the appropriate catheter was placed. There were no complications, but there was a slight delay in initiating apheresis. Moreover, the extra procedure placed the patient at risk for procedural complications.

The Commentary

by C. Craig Blackmore, MD, MPH

Humans make errors, regardless of effort, conscientiousness, level of training, or experience. Although data on errors in interventional radiology are limited, errors have been well documented in diagnostic radiology. In particular, communication errors are common, contributing to more than half of errors in one series (1) and present in up to 22% of radiology reports.(2) Communication errors occur in the ordering (as in this case), scheduling, and performance of examinations, as well as in the transmission of results. Communication errors between the ordering provider and radiology can lead to a range of adverse events, including the wrong patient or side, the wrong exam or procedure (e.g., CT versus MRI), and incorrect details of how the exam or procedure is performed (e.g., type of line, use of IV contrast). Fortunately, serious adverse events are rare in diagnostic radiology, with wrong-patient events reported in 4 per 100,000 examinations (3) and wrong-side events in 8.3 to 55 per 100,000 examinations in major health care institutions in the United States.(4-6)

Because error is inevitable in human (or electronic) processes, resilient systems are designed to identify and correct errors before they lead to adverse events. The Lean management approach provides a useful taxonomy for such prevention of error propagation, termed mistake-proofing. Lean includes (in increasing order of effectiveness) inspection, self-check, successive check, and check with control.(7)

With inspection, the least effective approach, adverse events are captured after the error or issue has occurred. In manufacturing, this step happens at the end of the assembly line, with rejection of defective products. In medicine, the error is identified through inspection and corrected before leading to further harm. In the current case, the patient was harmed by having the incorrect catheter placed, but the error was eventually discovered and further injury prevented.

A better system is one in which mistake-proofing captures errors before harm occurs. Under Lean, the simplest level of mistake-proofing is the individual self-check, in which workers check their own work for errors or accuracy. For this case, self-check would have occurred if the ordering physician had confirmed that the order was for the appropriate nontunneled catheter prior to submission. The efficacy of self-check is limited by time pressure and by asymmetric knowledge: individuals in the care chain may have different understanding and expertise regarding the patient's situation. For example, a provider might order a CT to evaluate for subacute stroke, even though MRI is more accurate and therefore preferred in the subacute setting. Self-check will not help because the provider does not know which imaging test is preferred. A radiologist might approve the study, knowing that CT is the preferred imaging approach for acute stroke, but lacking the clinical knowledge that the suspected stroke is subacute rather than acute. Both individuals have detailed knowledge, but it is asymmetric, involving different aspects of the patient's condition and care.

A higher level of mistake-proofing is successive checks, in which workers inspect the work of the previous worker before initiating their own work. In manufacturing, successive checks (and check with control) can "stop the line" to prevent errors from becoming defects. In the current case, the interventional radiologist could have been forced to reconfirm the appropriateness of the tunneled catheter before beginning the procedure. Successive checks in medicine can also be limited by asymmetric knowledge and by the explicit or implicit medical power hierarchy—individuals of lesser status in the medical hierarchy may be reluctant to check, let alone question, the work of physicians or more senior colleagues.

Successive checks in medicine can be supported by huddles, checklists, or individual consultations. In the preprocedural huddle, all team members are empowered to speak out, flattening the power hierarchy and lessening asymmetry in knowledge. Huddles are now endorsed by The Joint Commission as well as by major interventional radiology specialty societies.(8,9) A disadvantage of the preprocedural huddle is that it requires the presence of all parties. In this case, the procedural team was not going to be involved in using the catheter, raising the possibility of asymmetric knowledge.

Procedural checklists can also support successive checks and prevent the propagation of errors.(10,11) Checklists are associated with improved communication and teamwork in operating room (12) and interventional radiology procedures.(13,14) Checklists have been shown to decrease interventional radiology errors as well as procedural delays and cancellations.(15) Because checklists only function if acted upon, they are best used as part of a huddle with group accountability and transparency. Checklists suffer from the same limitation as huddles in that they are often focused on the technical aspects of the procedure and may not incorporate information from the clinical end user. In this case, a checklist item requiring direct consultation (in person or virtual) between the interventional radiologist and the oncologist prior to line placement might have brought the error to light.

The most effective mistake-proofing is check with control. Control means a system that captures when an error has occurred and blocks its propagation, preventing patient harm, all without active human input. In manufacturing, the "line" is effectively "stopped" automatically. A famous example from Toyota was the development of an automatic weaving loom that stopped as soon as a thread broke, whether or not an operator was in attendance. In medicine, such a system would have recognized the inappropriate order and prevented it from being placed. Such systems exist in medicine but are generally available for simpler binary decisions rather than complex ones. For example, anesthesia gas tubing has different connector configurations for different types of gases, preventing oxygen lines from attaching to nitrous oxide sources.

Clinical decision support (CDS) systems are another way to provide real-time, evidence-based information for a variety of uses in medicine, including diagnostic and interventional radiology procedures. Clinical decision support systems are algorithms that match specific clinical indications (central line for apheresis), disease processes (multiple myeloma), and patient characteristics (58-year-old woman). If the use, diagnosis, and patient type align with what the CDS has been programmed to identify as appropriate, then the order goes forward. Otherwise, the CDS can provide real-time feedback to redirect toward more appropriate care.
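To make this matching logic concrete, the sketch below (in Python) shows one way an order entry system could screen an order against a small table of appropriateness rules. It is a minimal illustration only; the rule table, the Order fields, and the screen_order function are hypothetical assumptions, not the design of any actual CDS product.

    from dataclasses import dataclass

    @dataclass
    class Order:
        procedure: str    # e.g., "tunneled CVC" or "nontunneled CVC"
        indication: str   # e.g., "apheresis"
        diagnosis: str    # e.g., "multiple myeloma"

    # Hypothetical appropriateness rules: which procedures are approved for each indication.
    APPROPRIATE_PROCEDURES = {
        "apheresis": {"nontunneled CVC"},
        "long-term chemotherapy": {"tunneled CVC"},
    }

    def screen_order(order):
        """Return (allowed, feedback) for an order at the time of entry."""
        approved = APPROPRIATE_PROCEDURES.get(order.indication)
        if approved is None:
            # Indication not covered by the rules: defer to human review rather than block.
            return True, "Indication not covered by CDS rules; no automated check applied."
        if order.procedure in approved:
            return True, "Order matches appropriateness criteria."
        return False, (f"{order.procedure} is not the preferred catheter for "
                       f"{order.indication}; consider: {', '.join(sorted(approved))}.")

    # The order in this case would have been flagged before it reached interventional radiology.
    allowed, feedback = screen_order(Order("tunneled CVC", "apheresis", "multiple myeloma"))
    print(allowed, feedback)  # False, with a suggestion to use a nontunneled CVC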

Clinical decision support systems can either be strictly informational, with the ordering provider able to override the system (self-check), or can actually block inappropriate care (check with control). In diagnostic radiology, the use of CDS as an informational self-check tool (with ordering providers able to override potential errors uncovered by the system) has seen limited success.(16) The largest study of CDS as an informational tool in radiology, the Medicare Imaging Demonstration Project, found no change in imaging ordering after implementation.(16) In contrast, when coupled with barriers to ordering of potentially inappropriate studies (check with control), CDS is associated with substantial decreases in inappropriate imaging.(17,18) As an example, we implemented a CDS system in which orders for lumbar MRI, sinus CT, and brain MRI were allowed to proceed in the computerized order entry system only under a restrictive set of indications. As a result, the 23% to 27% of imaging orders that lacked an appropriate indication were treated as mistakes and blocked by the system.(17)
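The practical difference between the two modes is what happens when the screening logic rejects an order. Continuing the hypothetical screen_order sketch above, an informational system surfaces an advisory but lets the provider proceed, whereas a check-with-control system blocks the order outright; the function names and override behavior here are illustrative assumptions.

    def place_order_informational(order, provider_overrides):
        """Informational CDS (self-check): advisory only; the provider may override."""
        allowed, feedback = screen_order(order)
        if allowed:
            return True
        print("Advisory: " + feedback)
        return provider_overrides  # the ordering provider can choose to proceed anyway

    def place_order_with_control(order):
        """Check with control: a rejected order is blocked regardless of user input."""
        allowed, feedback = screen_order(order)
        if not allowed:
            print("Blocked: " + feedback)  # provider must change the order or request review
        return allowed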

The published experience with CDS in interventional radiology is limited. However, one can envision a check-with-control system that would have prevented the incorrect ordering in this case through a series of questions for the oncologist regarding how the line would be used. Unfortunately, CDS presents significant challenges. First, CDS is less effective for high-complexity clinical decisions. The CDS system has to be programmed to consider all of the relevant clinical scenarios, meaning it must contain a decision tree with sufficient granularity to branch on the intended use of the catheter; this intended use may in turn depend on the disease entity in a specific type of patient. Further, the ordering physician would have to arrive at the appropriate decision relatively quickly through searching, mouse clicks, drop-down menus, or other functionality. Given the plethora of uses for central venous catheters, this functionality is not currently practical. In addition, the CDS system should be sensitive to practice variation and adaptable as new evidence becomes available. Finally, users of the CDS would have to trust the system, and that trust would rapidly erode after any algorithm-driven error.(19) In part because of these challenges, current CDS systems have achieved neither broad acceptance nor such comprehensive functionality. However, with the use of big data and natural language processing in the future, it is possible that the CDS would have identified the error (perhaps because it would have recognized that lines placed for the indication of apheresis were always of the nontunneled variety) and acted as a "control" on the system.
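As a rough illustration of the "series of questions" envisioned above, a check-with-control order flow might walk the ordering provider through a short decision tree before accepting the catheter order. The questions, branch points, and the recommend_catheter helper below are hypothetical, and real criteria would require far more granularity than this sketch suggests.

    def recommend_catheter(intended_use, long_term_use):
        """Hypothetical decision tree from intended use to a recommended catheter type."""
        if intended_use == "apheresis":
            # For apheresis, a nontunneled catheter suffices (as in this case).
            return "nontunneled CVC"
        # The commentary notes that tunneled catheters suit longer-term use and carry
        # fewer infections; the binary long_term_use flag is an oversimplification.
        return "tunneled CVC" if long_term_use else "nontunneled CVC"

    print(recommend_catheter("apheresis", long_term_use=False))  # nontunneled CVC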

For the current case, self-check, successive check, or check with control all could potentially have prevented the ordering error from causing patient harm. Simple checklists used at the time of ordering for self-check might have led the oncologist to identify the error. Checklists can include questions regarding an understanding of the indication for the procedure—or in this case, for the use of the line.

Successive check, with the interventional radiologist questioning the indication and leading to consultation with the oncologist, also might have prevented the ordering error from being passed along. At our institution, we require a successive check, with the interventional radiologist reviewing all complex and high-risk procedures before scheduling. However, it is not clear that a simple successive check of this type would have helped in this case, as the interventional radiologist may have deferred to the oncologist's greater knowledge of how the catheter would be used, without direct consultation.

The most promising current solution to prevent the adverse event in this case would be a successive check through a huddle that includes the interventional radiologist as well as the oncologist. Numerous tools now allow the virtual presence of geographically separated individuals, through text at a minimum but increasingly with audio and video. Even the crowded schedules of practicing physicians can accommodate brief virtual huddles to prevent patient harm and reduce further downstream work. Patients can also be empowered to speak during preprocedural huddles and contribute to the team's understanding of the objectives of care. This type of successive check might well have identified the error before the adverse event occurred.

In the future, clinical decision support at the time of ordering could function as check with control. Radiology CDS is evolving rapidly, and with new deep-learning algorithms holds potential as check with control for radiology ordering.(20) However, realistically, CDS systems of sufficient sophistication do not yet exist.

In summary, resilient health care systems should expect errors and intentionally design processes with mistake-proofing to prevent these errors from becoming adverse events. In this case, a successive check, in the form of a huddle (physical or virtual) involving all relevant parties, including the interventional radiologist and the oncologist, could likely have prevented the adverse event.

Take-Home Points

  • Health care organizations must have systems in place to prevent inevitable errors from leading to adverse events.
  • Mistake-proofing can be thought of in a hierarchy of increasing effectiveness from postevent inspection, to self-check, to successive check, to check with control.
  • Huddles can serve as effective successive checks to prevent error propagation.
  • Virtual communication tools can support huddles among geographically separated team members, such as ordering providers and proceduralists.
  • Preprocedural consultation can ameliorate the asymmetric knowledge between ordering physicians and proceduralists, helping to prevent wrong-procedure errors.

C. Craig Blackmore, MD, MPH, Director, Center for Health Care Improvement Science, Virginia Mason Medical Center, Seattle, WA

Faculty Disclosures: Dr. Blackmore has declared that neither he, nor any immediate member of his family, has a financial arrangement or other relationship with the manufacturers of any commercial products discussed in this continuing medical education activity. In addition, the commentary does not include information regarding investigational or off-label use of pharmaceutical products or medical devices.

References

1. Siewert B, Brook OR, Hochman M, Eisenberg RL. Impact of communication errors in radiology on patient care, customer satisfaction, and work-flow efficiency. AJR Am J Roentgenol. 2016;206:573-579. [go to PubMed]

2. Quint LE, Quint DJ, Myles JD. Frequency and spectrum of errors in final radiology reports generated with automatic speech recognition technology. J Am Coll Radiol. 2008;5:1196-1199. [go to PubMed]

3. Sadigh G, Loehfelm T, Applegate KE, Tridandapani S. Evaluation of near-miss wrong-patient events in radiology reports. AJR Am J Roentgenol. 2015;205:337-343. [go to PubMed]

4. Luetmer MT, Hunt CH, McDonald RJ, Bartholmai BJ, Kallmes DF. Laterality errors in radiology reports generated with and without voice recognition software: frequency and clinical significance. J Am Coll Radiol. 2013;10:538-543. [go to PubMed]

5. Sangwaiya MJ, Saini S, Blake MA, Dreyer KJ, Kalra MK. Errare humanum est: frequency of laterality errors in radiology reports. AJR Am J Roentgenol. 2009;192:W239-W244. [go to PubMed]

6. Lee YH, Yang J, Suh JS. Detection and correction of laterality errors in radiology reports. J Digit Imaging. 2015;28:412-416. [go to PubMed]

7. Shingo S. Zero Quality Control: Source Inspection and the Poka-Yoke System. 1st ed. Portland, OR: Productivity Press; 1986. ISBN: 9780915299072.

8. Angle JF, Nemcek AA Jr, Cohen AM, et al; SIR Standards Division. Quality improvement guidelines for preventing wrong site, wrong procedure, and wrong person errors: application of the Joint Commission "Universal Protocol for Preventing Wrong Site, Wrong Procedure, Wrong Person Surgery" to the practice of interventional radiology. J Vasc Interv Radiol. 2009;20(suppl 7):S256-S262. [go to PubMed]

9. Rafiei P, Walser EM, Duncan JR, et al; Society of Interventional Radiology Health and Safety Committee. Society of Interventional Radiology IR Pre-Procedure Patient Safety Checklist by the Safety and Health Committee. J Vasc Interv Radiol. 2016;27:695-699. [go to PubMed]

10. Haynes AB, Weiser TG, Berry WR, et al; Safe Surgery Saves Lives Study Group. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med. 2009;360:491-499. [go to PubMed]

11. Semel ME, Resch S, Haynes AB, et al. Adopting a surgical safety checklist could save money and improve the quality of care in U.S. hospitals. Health Aff (Millwood). 2010;29:1593-1599. [go to PubMed]

12. Russ S, Rout S, Sevdalis N, Moorthy K, Darzi A, Vincent C. Do safety checklists improve teamwork and communication in the operating room? A systematic review. Ann Surg. 2013;258:856-871. [go to PubMed]

13. Wong SSN, Cleverly S, Tan KT, Roche-Nagle G. Impact and culture change after the implementation of a preprocedural checklist in an interventional radiology department. J Patient Saf. 2015 Jul 31; [Epub ahead of print]. [go to PubMed]

14. Lee MJ, Fanelli F, Haage P, Hausegger K, Van Lienden KP. Patient safety in interventional radiology: a CIRSE IR checklist. Cardiovasc Intervent Radiol. 2012;35:244-246. [go to PubMed]

15. Koetser ICJ, de Vries EN, van Delden OM, Smorenburg SM, Boermeester MA, van Lienden KP. A checklist to improve patient safety in interventional radiology. Cardiovasc Intervent Radiol. 2013;36:312-319. [go to PubMed]

16. Timbie JW, Hussey PS, Burgette LF, et al. Medicare Imaging Demonstration Final Evaluation: Report to Congress. Santa Monica, CA: Rand Corporation; 2014. [Available at]

17. Blackmore CC, Mecklenburg RS, Kaplan GS. Effectiveness of clinical decision support in controlling inappropriate imaging. J Am Coll Radiol. 2011;8:19-25. [go to PubMed]

18. Vartanians VM, Sistrom CL, Weilburg JB, Rosenthal DI, Thrall JH. Increasing the appropriateness of outpatient imaging: effects of a barrier to ordering low-yield examinations. Radiology. 2010;255:842-849. [go to PubMed]

19. Khorasani R, Hentel K, Darer J, et al. Ten commandments for effective clinical decision support for imaging: enabling evidence based practice to improve quality and reduce waste. AJR Am J Roentgenol. 2014;203:945-951. [go to PubMed]

20. Shortliffe EH, Sepúlveda MJ. Clinical decision support in the era of artificial intelligence. JAMA. 2018;320:2199-2200. [go to PubMed]

This project was funded under contract number 75Q80119C00004 from the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services. The authors are solely responsible for this report's contents, findings, and conclusions, which do not necessarily represent the views of AHRQ. Readers should not interpret any statement in this report as an official position of AHRQ or of the U.S. Department of Health and Human Services. None of the authors has any affiliation or financial involvement that conflicts with the material presented in this report.