
Lessons from the VA's Team Training Program

Julia Neily, RN, MS, MPH; Peter D. Mills, PhD, MS; Lisa M. Mazzia, MD; and Douglas E. Paull, MD | November 1, 2011



The Veterans Health Administration (VHA) National Center for Patient Safety (NCPS) implemented the national Medical Team Training (MTT) Program to improve surgical patient care. While team training has been a recommended strategy to improve patient safety, there were few data proving its benefits in the health care setting. Therefore, our strategy included a robust evaluation program—hoping that our experience would produce generalizable knowledge for other health care systems. In this article, we will review our (largely positive) experience and highlight some of the success factors and challenges.

The Program

After a 2-month period of preparation and planning, a day-long learning session was held for VHA facilities that provide surgical services. The operating room was closed to facilitate attendance. Using crew resource management theory drawn from aviation, staff were trained to work together as a team, challenge each other when they identified safety risks, conduct checklist-guided preoperative briefings and postoperative debriefings, and implement other communication strategies. We conducted quarterly follow-up interviews with the implementation team for 1 year to coach and assess implementation of the MTT Program.(1-2) Participants completed the Safety Attitudes Questionnaire (SAQ) (3) at baseline and follow-up. VA Surgical Quality Improvement Program (VASQIP) data were assessed for outcomes.(1) A mobile, point-of-care, high-tech simulation component was added later in the program.


We found an association between implementation of the MTT program and surgical mortality: an 18% reduction in annual mortality for the 74 sites in the training program compared with a 7% reduction for the 34 sites that had not yet been trained.(1) The trained sites also reported improved communication, increased staff awareness, and improved equipment use during surgery.(1)

Lessons Learned

There were many lessons learned from the implementation of this program.


Leadership support

The first lesson is the value of leadership support.(4) The program was mandated by VHA top leadership. Planning calls were held with senior facility and surgical leaders. Among potential predictors measured at the time of the learning sessions, we found that facility leadership was the strongest predictor of future implementation of preoperative briefings and postoperative debriefings.(4)

Local champions and microsystem implementation

One component of the program that we carried over from conducting the national Breakthrough Series (5-7) was engaging the teams in extensive "preparation and planning." Rather than simply receiving training, each facility participated in at least two calls to help them form a team, engage leadership support, and develop a plan for change. It was also helpful to start with early adopters and willing volunteers to help implement the changes.(8) These local champions jump-started the process for their site. The pivotal role of the operating room nurse manager in the success of the briefing and debriefing process cannot be overemphasized.(9)

MTT faculty helped sites initiate changes, using the Plan-Do-Study-Act cycle, before standardizing the process.(10) We encouraged teams to start small, such as with one surgical specialty, and then to finish big by implementing the program for all surgical cases. Sites were allowed to customize the briefing and debriefing tools and processes for their local environment, which helped with buy-in. Examples of briefing tools (checklists) were posted on the NCPS Web site, and teams were encouraged to adapt and modify these tools (see Figure 2 in Neily [2]). Sites could also customize the process and tools for different surgical specialties, for example, eye or orthopedic cases.

While allowing this flexibility in implementation of tools, certain parts of the program were inviolable. For example, we required that all facilities conduct briefings and debriefings as part of their program. Many participants later told us that having briefings and debriefings as a required component gave them a focus for implementation. So, requiring certain key actions while remaining flexible about the way the actions were implemented was a good strategy for change.

Follow-up support

We also learned the value of follow-up support. One of our major concerns was that sites would close the operating room, train the clinicians, and then go back to "business as usual" the next day. To help prevent this, we coached teams (by conference call) through site-specific quarterly interviews. During these calls, we helped teams plan their next steps or discuss how to remove obstacles to change. This was helpful in keeping sites on track with continued implementation. In addition, sites knew we would be following them and that we had an expectation for change—this helped to move teams past their inertia to make changes at the front line. We involved site-based leadership in the program at the learning session and encouraged them to remain involved to promote continued improvement.

Peer-to-peer communication

Most professional groups more easily take in new information from a teacher of the same profession.(8) Knowing this, we had the training delivered by a nurse-and-surgeon team. This modeled the idea that different professions can collaborate as a team (including working through interprofessional communication breakdowns), while also allowing each professional group in the training to hear from one of its own.


Simulation

While our initial program plan did not call for the use of simulation, we were persuaded by emerging evidence that simulation is an important adjunct for teaching crew resource management (11-13), and we added it later in the program. We found that it is crucial to select the type of simulation equipment and the scenarios most likely to accomplish the learning objectives of a given MTT session. We use Gaba's dimensions of simulation applications to guide these choices.(11)

Adding simulation also adds complexity to team training. Providing routine maintenance and dealing with unexpected equipment failures can be demanding, so we cannot overstate the value of a dedicated logistical team. Using existing simulation centers, especially at larger, university-affiliated medical centers, has provided a welcome alternative when available. It is also important to ensure that the simulation scenarios align with the didactic material.

Another lesson we learned is the importance of measurement.(12) Measurement during simulation differs from the overall program evaluation because it provides immediate feedback to multidisciplinary teams. Feedback domains included overall teamwork, communication, situational awareness, leadership, and decision-making; these domains gave structure to the debriefing discussion that follows every simulation scenario. Teams take a moment to recognize their strengths and to focus on areas for improvement. Preliminary results included improved self-reported confidence in teamwork and communication, as well as better performance as measured by a validated observational tool.(14) We currently use a global rating scale, the Clinical Teamwork Scale.(15) Survey data were captured by adapting the Self-Efficacy of Teamwork Competencies Scale.(16) The combination of a survey and an observational tool allows for a comprehensive evaluation of the patient safety skills acquired.

The final lesson learned from MTT simulation is the importance of debriefing. MTT instructors were trained in and use the model described by Fanning and Gaba.(17) Instructors serve as facilitators to guide the discussion. Questions are largely open-ended. Our version of MTT currently utilizes an oral debriefing without video playback. There are advantages to video playback that need to be balanced against privacy and confidentiality concerns for each team training program.(17)

Evaluation of the program

Evaluation of the overall MTT Program was somewhat challenging because sites were neither trained all at once nor randomized. Nevertheless, we chose a combination of paper-and-pencil surveys, telephone interviews, and surgical outcome data to evaluate the overall program. This combination allowed us to track the outcomes of the program nationally as well as capture local stories of patient safety improvements—including potential adverse events that were avoided because of the program. We found that a simple story of a patient who was spared a surgical adverse event because the team conducted a briefing could be a more powerful description of the program's effect—and a stronger motivator for adoption or further training—than any statistical analysis.

Pitfalls to avoid

As with any large training effort, it is very important to have your ducks in a row. Each component of the program builds on the one before it, so no step can be left out. It is critical to engage leadership so that everyone knows the program is a priority. Preparation and planning must be done so that the change team is ready to implement change the day after the training. There must be a proper space for training and the right support so that the trainers can teach and the students can learn, and there must be good, consistent follow-through so that every team knows it has support and that there is an ongoing expectation for change. When one of these pieces was left out of the training, teams did not fare as well.


MTT is an effective tool, moving health care toward high reliability, establishing a "fair and just culture," and improving patient safety.(18) The success of MTT programs will depend on engagement of leadership, local champions, and frontline providers; preparation and planning; a patient safety improvement project with metrics and goals; interactive, interprofessional communication and teamwork training; and follow-up.(19) Our experience demonstrates that, when these ingredients are present, MTT is associated with improved staff morale and better patient outcomes.

Julia Neily, RN, MS, MPH
Associate Director, Field Office
VHA National Center for Patient Safety

Peter D. Mills, PhD, MS
Director, Field Office
VHA National Center for Patient Safety

Lisa M. Mazzia, MD
VHA National Center for Patient Safety

Douglas E. Paull, MD
Director, Patient Safety Curriculum
VHA National Center for Patient Safety



References

1. Neily J, Mills PD, Young-Xu Y, et al. Association between implementation of a medical team training program and surgical mortality. JAMA. 2010;304:1693-1700.

2. Neily J, Mills PD, Lee P, et al. Medical team training and coaching in the Veterans Health Administration: assessment and impact on the first 32 facilities in the programme. Qual Saf Health Care. 2010;19:360-364.

3. Sexton JB, Helmreich RL, Neilands TB, et al. The Safety Attitudes Questionnaire: psychometric properties, benchmarking data, and emerging research. BMC Health Serv Res. 2006;6:44.

4. Paull DE, Mazzia LM, Izu BS, Neily J, Mills PD, Bagian JP. Predictors of successful implementation of preoperative briefings and postoperative debriefings after medical team training. Am J Surg. 2009;198:675-678.

5. Mills PD, Weeks WB. Characteristics of successful quality improvement teams: lessons from five collaborative projects in the VHA. Jt Comm J Qual Saf. 2004;30:152-162.

6. Volicer L, Mills PD, Hurley AC, Warden V. Home care for patients with dementia. Fed Pract. 2004;21:13-28.

7. Mills PD, Waldron J, Quigley PA, Stalhandske E, Weeks WB. Reducing falls and fall-related injuries in the VA system. J Healthc Saf Q. 2003;1:25-33.

8. Rogers EM. Diffusion of Innovations. New York, NY: The Free Press; 1995. ISBN: 9780743222099.

9. Robinson LD, Paull DE, Mazzia L, et al. The role of the operating room nurse manager in the successful implementation of preoperative briefings and postoperative debriefings in the VHA Medical Team Training Program. J Perianesth Nurs. 2010;25:302-306.

10. Langley GJ, Nolan KM, Norman CL, Provost LP, Nolan TW. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. San Francisco, CA: Jossey-Bass Publishers; 1996. ISBN: 9780787902575.

11. Gaba DM. The future vision of simulation in health care. Qual Saf Health Care. 2004;13(suppl 1):i2-i10.

12. Rosen MA, Salas E, Wilson KA, et al. Measuring team performance in simulation-based training: adopting best practices for healthcare. Simul Healthc. 2008;3:33-41.

13. Fox-Robichaud AE, Nimmo GR. Education and simulation techniques for improving reliability of care. Curr Opin Crit Care. 2007;13:737-741.

14. Wolk S, Paull DE, Mazzia LM, et al. Team training simulation pilot for VHA surgical services. Presented at: Association of VA Surgeons 35th Annual Meeting; April 2011; Irvine, CA.

15. Guise JM, Deering SH, Kanki BG, et al. Validation of a tool to measure and promote clinical teamwork. Simul Healthc. 2008;3:217-223.

16. Paige JT, Kozmenko V, Yang T, et al. High-fidelity, simulation-based, interdisciplinary operating room team training at the point of care. Surgery. 2009;145:138-146.

17. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc. 2007;2:115-125.

18. Frankel AS, Leonard MW, Denham CR. Fair and just culture, team behavior, and leadership engagement: the tools to achieve high reliability. Health Serv Res. 2006;41:1690-1709.

19. Salas E, Almeida SA, Salisbury M, et al. What are the critical success factors for team training in health care? Jt Comm J Qual Patient Saf. 2009;35:398-405.

This project was funded under contract number 75Q80119C00004 from the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services. The authors are solely responsible for this report's contents, findings, and conclusions, which do not necessarily represent the views of AHRQ. Readers should not interpret any statement in this report as an official position of AHRQ or of the U.S. Department of Health and Human Services. None of the authors has any affiliation or financial involvement that conflicts with the material presented in this report.