
In Conversation with... James P. Bagian, MD


Dr. Robert Wachter, Editor, AHRQ WebM&M: Where did your interest in safety come from?

Dr. James Bagian: I don't know if it's really an interest in safety, quite frankly. It's an interest in trying to do things the most effective way. Safety is one of those things. If you're unsafe, that disrupts your ability to get the job done effectively and efficiently. I've been interested in that since I was a little kid. I mean, at the time, I'd calculate how long it would take to drive to my grandmother's farm, 35 miles away, and what our average speed was. My father always used to tease me about it.

My subspecialty in mechanical engineering was systems engineering. That is, how do you design systems and understand the various influences so you can achieve the desired effect in the most efficient and economical way? My time at NASA gave me more experience not only in the technical areas of how to do it, but also in leadership areas about why things work and why they sometimes don't, which was probably even more important.

RW: As you came over from NASA to the health care world, what did you find were the most transferable elements of your astronaut experience?

JB: Well, it wasn't just what I learned at NASA. Even when I was a resident and a student, I used to really chafe at the way medicine was run. I can remember being in the operating room and watching surgeons do things that made no real sense rationally or mechanically. Sometimes I was successful in interceding with the professors, explaining that's not really how to use that device, even though they'd been doing it that way forever. I would say, this isn't a matter of technique, this is a matter of how it's engineered and the metallurgical properties of stainless steel—you can't do what you're trying to do. That's why you're having trouble. Everybody learned from a particular person or an institution—which led to, you know, this is the Penn way of doing things, this is the Stanford way of doing things. Well, why are there different ways of trying to achieve the same goal? They can't all be equal. Or are they? And in many cases it's just people's personal preference, unleavened by reality.

RW: What did you learn about the issue of authority gradients, when you as an intern saw something that a senior professor was doing that seemed silly and you decided to bring it up? How was that kind of input accepted?

JB: I learned that most people were reluctant to bring things up because they would often get treated poorly. They worried it would affect their career—being accepted into a fellowship or recommended for a residency. But it didn't really stop me a whole lot. I can remember one particular case when they were applying a compression device to stabilize a knee. They had removed the knee joint, and they were having difficulty installing the device that was intended to stabilize the leg while it healed. The surgeon had already destroyed two of these devices, and there was only one left in the whole OR suite. If this one didn't work, he would have to leave it as a flail knee. That would have been terrible for the patient, because they wouldn't have been able to stabilize the leg except by a plaster cast, which was really unsatisfactory. And I said, "Can I make a suggestion?" and the chief resident just looks at me and shakes his head (the attending couldn't see him), like, don't do this. Because this attending had already used quite a bit of profanity, throwing instruments around the OR—I mean a real tantrum. And the attending says to me, "Well, what do you want?" And I said, "What's happening is not that these are defective pieces of equipment, it's that you're not using them appropriately...stainless steel will always act like this. It's called galling. It essentially welds itself when you spin a K wire through it. It's not bad luck—that's a property of this material. You shouldn't do it that way." And he says, "Is that right? Well, why don't you show me?" So, I was a third-year student at the time, and I said, "Well, okay," and I did. And it worked.

RW: And he didn't throw anything?

JB: No, and then after we were done he took me out to lunch. And he asked what other things I had seen, and I pointed out a number of things that I would do differently. And he ended up trying them, using them, and adopting them.

So I realized that you can do the right thing. Now, was I nervous during this? Yes, but I couldn't stand to see him doing something that was going to actually end up hurting this patient, so I spoke up. But I can still remember that chief resident looking at me, just shaking his head, almost saying, idiot, just stay out of this. But I think that if you actually have evidence—it isn't just opinion; these are the physical laws of nature and you're trying to violate them—then you have an obligation to speak up. But medicine doesn't do that well. It still doesn't do it well.

RW: What did you hope to accomplish when you began the VA program?

JB: Well, the biggest thing was the culture. In other industries that are more reliable, the culture is totally different. Having worked in health care and in engineering and aviation—which are polar opposites—I can say it's not that the people in health care aren't as smart; on average, they're probably smarter. But the way in which they do things is based more on personal preference, which is in stark contrast to what generally is done in aviation or space flight. It's a matter of how do you get people to understand this? My goal when I came here was to change the culture.

We did our first cultural survey back in 1998 to set the benchmark. We had a number of assumptions before the survey. One was that there would be steep authority gradients, and there were. No surprise there, but at least we had data to show it. But we found out some other things that surprised me. Most particularly, we asked a very simple question, 'Do you think patient safety is important to good patient care?', which to me was a no-brainer. On the one-to-five scale, 27% of the people said yes, absolutely important, a five. But we had 24% say one. I was flabbergasted. It shows my naiveté. Then we actually started talking to people about why they would say that patient safety was unimportant. They said they gave it a "one" because, in their view, it wasn't important for them personally, because they were safe. It was this inappropriate confidence—the feeling that they were safe. It was that other physician, that other nurse, the other floor, the hospital down the street, yeah, they have problems, but I don't have problems. Well, I always say that the person who thinks it can never happen to them is the most dangerous person in the room, because they're in denial. In industries like aviation, space flight, and nuclear power, everybody knows that given the right set of circumstances it could happen to any of us.

RW: When you set up the program what was your vision? How did you think about structuring it and what did you do?

JB: There were several essential elements. First, I felt that we needed people who could analyze problems and come up with feasible solutions, and these solutions had to live at the frontline, not at a central place. The program had to be easy and user-friendly, and yet produce tangible results. Second, we wanted to create a culture in which people felt safe. There had to be a guarantee of some privilege of confidentiality for some of these investigations, because otherwise it wouldn't get results. People wouldn't tell you the truth. They just wouldn't tell you anything. And third, we wanted to educate the next generation of health professionals.

RW: What do you think is unique about the VA, for example, the electronic records or the fact that the doctors work for the VA? What lessons are generalizable for the rest of the system?

JB: I think our lessons are uniformly generalizable, I'll say that at the outset. I get asked that question all the time. Certainly the electronic medical record helps, but not so much for safety. It makes our patients safer, but it doesn't directly impinge on what we do for patient safety. I think the reason the quality of care we deliver is so good is that it is built on the foundation of a robust electronic medical record that follows you from outpatient to inpatient, etc. But that's not so much the safety thing, the way I would define it. Some of the doctors work for us, but a large number of our doctors are part time. They come over a couple days a week from the university. Even among the internal VA doctors, to think it is command and control from my lips to their ears would be wrong. It's about leadership; it's not about management style. How do you convince people what the right thing to do is? Because you want them to believe in their heart this is the way to do it. And that takes evidence, it takes stories, it takes a multi-faceted approach. It's very labor-intensive, but it pays off. It's not sending e-mails, I'll tell you that.

So, it's not that much different from doing safety work outside the VA. One of the differences that does make it easier for us is that our needs and incentives are somewhat aligned. That is, we have a set amount of money, and we have to take care of all the patients who walk in the door. So the motivation to be efficient is really important, because if we're wasting money, we're taking care of fewer patients, which means there are other patients who aren't cared for. Whereas in the private sector, that isn't always the case. You know, you can do bad work, and you get to redo it and get paid again. We don't get paid again. Now some HMOs are like us in that way, and other countries' health care systems run much like the VA. Denmark, Sweden, the Netherlands, the UK, they're just like we are, so it's very transportable for them. But except for that part, even in the private sector everything else is the same, and I think the proof is that safety folks from more than 100 non-VA hospital systems in the US have come to us to be trained and have implemented our approach.

RW: In a practical sense, let's say I'm in a VA hospital in Des Moines and I'm newly hired to run the patient safety program. What are my linkages to you in Ann Arbor? How does that work?

JB: Every one of our hospitals has a patient safety manager whose job is to run the patient safety program. We structure it so that they report to their facility director. That way they're not looked at as a spy—they're part of that team. But they've got a dotted-line relationship to us, and we have tremendous influence over them. We try to be reasonable with this influence. People generally do want to do the right thing; they really do, as corny as that sounds. So they know we supply them with tons of tools, that we train them exhaustively, that we're available by phone any time they call, and that people don't get put on hold forever. They know we'll step in and help, whether by phone or by jumping on a plane and coming right to their facility, if something is beyond their skill set.

RW: So let's say, at my place in Des Moines, my second day on the job somebody operates on the wrong patient. How does that information make it to Ann Arbor, if it does, and what's the crosswalk there?

JB: As soon as the report comes in, they would put it into the information system for patient safety and answer seven questions. And then we have a system that prioritizes whether this rises to the level that requires further analysis.

After they go through the categorization, the system tells them whether further action is required. And I must emphasize, we do root cause analysis on close calls as well as actual events—they get the same treatment. So you can have something where no one was hurt, and it gets just as avid and thorough an analysis and action plan as if you had just killed a patient. That's a big sea change for people to think about, but it makes sense. So they'll do their report, and it comes to us. We'll look at it and then give them feedback if we find things that could enhance the value of their RCA. In addition to these efforts to support the local site, we roll the data up centrally and analyze that information. That enabled us, for instance, to learn that 44% of incorrect surgeries were right-left mix-ups, versus 36% involving the wrong patient. Had we not had our central patient safety information system, we never could have figured that out, because no one else had that information.
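To make the triage flow concrete, here is a minimal sketch of what a report-and-prioritize step like the one Dr. Bagian describes might look like in code. The field names, the severity and probability scales, and the scoring threshold are all illustrative assumptions rather than the VA's actual seven-question schema or scoring algorithm; the one property taken directly from the interview is that close calls are triaged exactly like actual events.

```python
# Hypothetical sketch of the triage step described above: a report is
# entered, categorized, and a prioritization matrix decides whether it
# requires further analysis (an RCA). All field names, scales, and the
# threshold are illustrative assumptions, not the VA's actual schema.
from dataclasses import dataclass

SEVERITY = {"minor": 1, "moderate": 2, "major": 3, "catastrophic": 4}
PROBABILITY = {"remote": 1, "uncommon": 2, "occasional": 3, "frequent": 4}

@dataclass
class EventReport:
    description: str
    severity: str      # key into SEVERITY
    probability: str   # key into PROBABILITY
    close_call: bool   # True if no one was actually harmed

def requires_rca(report: EventReport, threshold: int = 8) -> bool:
    """Score the report; at or above the threshold, an RCA is required.

    Note that close_call never enters the decision: per the interview, a
    close call is scored on what could have happened, so it is triaged
    exactly like an actual adverse event.
    """
    score = SEVERITY[report.severity] * PROBABILITY[report.probability]
    return score >= threshold

# A wrong-patient close call still triggers a full RCA even though
# no one was hurt.
report = EventReport(
    description="wrong patient taken to the OR; caught before incision",
    severity="catastrophic",
    probability="uncommon",
    close_call=True,
)
print(requires_rca(report))  # True: 4 * 2 = 8
```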

I will say though—and this might sound strange—that most of the benefit to patients from patient safety comes from what's done at the individual facility, not what we do here at the national center. We set up the structure, give them the tools, and train them. But then they do the lion's share of the work. It's only for certain events, those with broad-reaching impact or of a technical nature beyond the local expertise, that we add the extra little bit. Off the top of my head, I would say that 90% of the benefit is from what's done locally; 10% is what we add. On a day-to-day basis, they're the ones doing it, and that's as it should be.

RW: I'm sure that 90% did not begin that way—you had to create the culture and the structure to do that, and that's a tremendous success story.

JB: Well, this is the most satisfying job I've ever had in my life, and I've had some good jobs.

RW: What are your overall feelings about reporting?

JB: It's not about reporting. The reports are important, but past a certain point, you already have enough to know where the problems lie. We look at the problems, and I think this is the important thing: the reports are not counting exercises—our database identifies where vulnerabilities lie. If you really want to know the true numbers, it takes more prospective analysis and investigation. We see some cases where we have large numbers, and maybe the sites are reporting almost everything that happens. We see others where we have just a few, and the events may be hugely underreported. So you cannot rely on the numbers in the system to tell you the prevalence or incidence of safety problems. They only tell you the vulnerabilities, and it's for us, through our prioritization system and further analysis, to decide whether something is important and worthy of further work. The reports are just a starting point. It's the action you take based on the reports, and then showing that your actions worked—that's really where the payoff is.

This project was funded under contract number 75Q80119C00004 from the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services. The authors are solely responsible for this report's contents, findings, and conclusions, which do not necessarily represent the views of AHRQ. Readers should not interpret any statement in this report as an official position of AHRQ or of the U.S. Department of Health and Human Services. None of the authors has any affiliation or financial involvement that conflicts with the material presented in this report.