Aviation Safety Methods: Quickly Adopted but Questions Remain
On August 2, 2005, Air France flight 358 crashed while landing in Toronto. In less than 2 minutes, the crew evacuated 309 passengers. Several minutes later, the plane burst into flames.(1) Crashes like this are remarkably rare, yet the crew was prepared to perform the lifesaving task of evacuating more than 300 panicking individuals faster than I could ever evacuate 30 patients from the waiting room in my clinical office. This almost miraculous effort by the crew illustrates the stark contrast between the safety of other industries and the safety of health care.
As health care has taken note of aviation's safety record, we have also found similarities between the two industries that suggest we can apply aviation safety methods. Both industries are composed of highly trained professionals working in teams that use technology to manage hazardous processes where risk varies dramatically from moment to moment.(2) In addition, the stereotypical, if outdated, image of pilots and physicians is similar: confident and hard-working experts able to act in the heat of the moment to save lives. However, the health care system is more complex than aviation.(3) For example, more professionals are involved in health care than in aviation (pharmacists, physicians, different types of nurses, physical therapists, respiratory therapists, and more), and they often train and practice in their own professional "silos," making communication and cooperation challenging. These professionals also interact with a greater variety of devices than their counterparts in aviation, and the object of their work, the human body, is more complex than an airplane. Finally, regulation of health care is more fragmented than regulation of aviation: there are no health care agencies comparable to the National Transportation Safety Board, which investigates accidents, or the Federal Aviation Administration, which regulates airlines and pilots.
Despite these differences, the seductions of superficial similarities and quick solutions, coupled with the widespread pressure on health care to improve safety, have led health care to apply several aviation safety practices. In most cases, the application of these methods leads to questions about whether and how they improve patient safety. Here, I highlight the use of three widely adopted aviation safety methods in hospitals: non-punitive reporting of errors and adverse events, measuring and improving teamwork based upon concepts from Crew Resource Management (CRM), and use of surveys (often referred to as safety climate or safety culture surveys) to measure attitudes about teamwork, error reporting, organizational leadership, and other issues relevant to safety.
Aviation reporting systems have been heralded as examples for health care.(4) For example, the Aviation Safety Reporting System (ASRS) is a voluntary, confidential incident reporting system used to identify hazards. Pilots report incidents (any occurrence that could affect safety), and ASRS staff analyze the reports, may interview reporters, and issue alerts to the industry. ASRS operates independently of the Federal Aviation Administration and has no regulatory role. The Aviation Safety Action Program (ASAP) differs from ASRS in that it is airline-based; the airline knows whether reported events are related to its own operations.
Over the last 6 years, health care reporting systems have proliferated, and some have become well established.(5) Individual hospitals have improved their systems and now emphasize reporting. Some states encourage, or even require, reporting, and some adverse events must be reported to the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). On July 29, 2005, federal legislation was enacted to encourage the reporting of medical errors to Patient Safety Organizations and to protect information in the reports from use in lawsuits.(6)
Although health care organizations (mostly hospitals) have adopted reporting, the diverse systems, in contrast to ASRS, rarely communicate with one another. On the one hand, this heterogeneity may impede information sharing and learning; on the other, it may yield reporting systems that are more responsive and relevant to local providers, patients, and organizations (similar in this way to ASAP). For example, investigators in our University of Texas Center for Patient Safety have collaborated with the Texas State Board of Medical Examiners and three hospitals to create a nursing reporting system modeled after ASAP.
Although there are accounts of how error reports have facilitated the identification and correction of system errors, no studies clearly demonstrate that reporting systems reduce errors and adverse events. In addition to this lack of evidence, there are also questions regarding how reporting systems should be incorporated into an organization's quality and safety activities and how much emphasis they should receive. There are also more subtle, but equally important, questions. Some hospitals ask frontline providers to report near misses, but what do they perceive as a near miss? For example, some providers see near misses as signs of system breakdowns, but others see near misses as signs that the system is working.(7) And whereas some reporting systems struggle to get providers to report, others struggle with too many reports and cannot figure out how to analyze and prioritize information.(3)
The concept of teamwork is like motherhood and apple pie: everyone is in favor of it. However, what is good teamwork? How do we measure it? And what, exactly, should we be telling providers to do? Many readers will think these questions unnecessary as they look to aviation's CRM program (8) and the broader literature on teamwork.(9) And of course some CRM-based interventions have already been studied.(10) However, certain team behaviors may be useful for only certain medical processes and not others. For example, in aviation an important aspect of teamwork is giving junior team members the authority to question decisions of senior team members. However, the aviation team does not typically include trainees—everyone in the cockpit is a pilot. In health care, the junior team member may be a medical student. Under what circumstances and how is it useful for a medical student to question a surgeon's decision in the operating room? Understanding these issues is important because team training will be very resource intensive if it is to change behavior and improve safety. One-day courses may be useful for teaching basic concepts, but it is hard to imagine how they could permanently change behavior.
Safety culture surveys, most of which have been adapted from Helmreich's work in aviation (11,12), are also being widely endorsed and used.(13) For example, more than 100,000 providers in hundreds of hospitals and several countries have completed the surveys developed by our group.(14) The psychometric characteristics of the UT safety climate survey have been independently studied,(15) and we have found it useful for measuring the effects of interventions such as executive walk rounds.(16) It remains to be seen whether such surveys predict other measures of safety, such as errors and adverse events. Another limitation is that surveys measure only attitudes, one small component of an organization's culture; they cannot measure behavioral norms, values, or competencies. Fortunately, the well-established science of survey evaluation can help address some of these issues, and surveys are relatively easy to implement (especially compared with the other safety methods). Without understanding and improving provider attitudes, health care will not be able to implement (or at least will not see the full benefit of) more complicated interventions like reporting systems and team training.
Many other methods, most notably simulation (17), are being adapted from aviation. The use of CRM-based team training in simulators may be effective where the health care setting or processes most resemble a cockpit environment: the operating room or emergent resuscitations (eg, trauma, neonatal, or adult cardiopulmonary resuscitation). In these settings, a relatively small and well-defined group of providers come together to perform a specific and time-limited task, in contrast to more complicated areas like ambulatory care or a general inpatient ward.
In some respects, these methods drawn from aviation are analogous to the basic sciences of safety and quality.(18) We know they are important and might well be profitably "translocated" (19) to health care, but a great deal of research is needed to understand their mechanisms and determine how to effectively apply them so that health care may approach the level of safety seen in aviation.
Eric J. Thomas, MD, MPH
Associate Professor of Medicine
University of Texas Houston Medical School
Dr. Thomas is supported by Agency for Healthcare Research and Quality (AHRQ) grant number 1PO1HS1154401.
References
1. Air France jet overran runway. CBS News. August 5, 2005. Available at: http://www.cbsnews.com/stories/2005/08/05/world/main760728.shtml. Accessed December 5, 2005.
2. Thomas EJ, Helmreich RL. Will airline safety models work in medicine? In: Rosenthal MM, Sutcliffe KM, eds. Medical Error: What Do We Know? What Do We Do? San Francisco, CA: Jossey-Bass; 2002:217-234.
4. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 1999.
6. Patient Safety and Quality Improvement Act of 2005, Pub L No. 109-41, 119 Stat 424. Available at: http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=109_cong_public_laws&docid=f:publ041.109. Accessed December 5, 2005.
7. Tamuz M, Thomas EJ. Danger in the guise of safety: near miss as a system failure? A success? Or a good nurse at work? Presented at: 2005 Academy of Management Annual Meeting; August 9, 2005; Honolulu, HI.
8. Helmreich RL, Foushee HC. Why crew resource management: empirical and theoretical bases of human factors training in aviation. In: Wiener EL, Kanki BG, Helmreich RL, eds. Cockpit Resource Management. San Diego, CA: Academic Press; 1993.
9. Brannick MT, Salas E, Prince C, eds. Team Performance Assessment and Measurement: Theory, Methods, and Applications. Mahwah, NJ: Lawrence Erlbaum Associates; 1997.
10. Morey JC, Simon R, Jay GD, et al. Error reduction and performance improvement in the emergency department through formal teamwork training: evaluation results of the MedTeams project. Health Serv Res. 2002;37:1553-1581.
11. Helmreich RL, Merritt AC, Sherman PJ, Gregorich SE, Wiener EL. The Flight Management Attitudes Questionnaire (FMAQ). Austin, TX: The University of Texas; 1993. NASA/UT/FAA Technical Report 93-4.
12. Helmreich RL, Merritt AC. Culture at Work in Aviation and Medicine: National, Organizational, and Professional Influences. Aldershot, UK: Ashgate; 1998.
13. Sorra JS, Nieva VF. Hospital Survey on Patient Safety Culture (prepared by Westat, under Contract No. 290-96-0004). Rockville, MD: Agency for Healthcare Research and Quality; September 2004. AHRQ Publication No. 04-0041.
14. Sexton JB, Thomas EJ, Helmreich RL, et al. Frontline assessments of healthcare culture: Safety Attitudes Questionnaire norms and psychometric properties. Technical Report 04-01. The University of Texas Center of Excellence for Patient Safety Research and Practice; AHRQ grant 1PO1HS1154401. Available at: http://www.uth.tmc.edu/schools/med/imed/patient_safety/SAQ_Norms_and_Psychometric_Properties_for_Website.pdf. Accessed December 13, 2005.
16. Thomas EJ, Sexton JB, Neilands TB, Frankel A, Helmreich RL. The effect of executive walk rounds on nurse safety climate attitudes: a randomized trial of clinical units. BMC Health Serv Res. 2005;5:28.
19. Wachter RM. Playing well with others: "translocational research" in patient safety [Perspective]. AHRQ WebM&M [serial online]. September 2005. Available at: http://www.webmm.ahrq.gov/perspective.aspx?perspectiveID=9. Accessed December 5, 2005.