Editor's note: Dr. Hollnagel is Senior Professor of Patient Safety at the University of Jönköping (Sweden) as well as Visiting Professorial Fellow at Macquarie University in Sydney (Australia). We spoke with him about his work studying safety in health care and the differences between designing safety improvements in health care versus other industries.
Dr. Robert M. Wachter: What led you to get interested in safety in health care?
Erik Hollnagel: I've worked with safety-related problems in a wide variety of industries since the late 1980s. When I was a Professor of Computer Science in Sweden, I was contacted by people at the university hospital who were interested in upgrading their risk and safety training, and that's how I started to get into it.
RW: When you migrated into looking at health care from other industries, what struck you as different and what struck you as similar to how other industries think about safety?
EH: What is similar is that they all think somebody else is doing it better than they are and has the solutions, but every industry thinks that. Everybody tends to look at what they do themselves and see all the problems, while they look at what others do and see only the nice things. And they're all mistaken, of course. What's characteristic of health care is that it's complicated, unpredictable, and very difficult to regulate or standardize, more so than other fields. Health care likes to compare itself with aviation, but I think there's very little basis for comparing the two. It's actually more like air traffic management than aviation. But for some reason they haven't looked very much at air traffic management.
Health care is under enormous public, political, and economic pressure; other industries are not challenged in the same way. Citizens are concerned about their health, so the political pressure is constant, and other industries don't have the public eye on them in the way health care does.
RW: You might think that would lead health care to apply additional attention, resources, and time to safety. Yet when you compare health care with other industries, what is the relative emphasis on safety?
EH: I think it's difficult, and to some extent unfair, to compare, because the other industries are completely different by their very nature. They are far more regular, and events are more recurrent and uniform than they are in health care. What characterizes health care is that the professionals who work there are under so many different types of pressure that they feel they have to respond. But they don't have time to really reflect on what they should do, so sometimes they do things that perhaps were not the smartest. They have to consider so many things and satisfy so many stakeholders at the same time that it's nearly impossible not to.
RW: As opposed to the life of a pilot or a nuclear power plant operator?
EH: The work of a nuclear power plant operator is often characterized as 99% boredom and 1% panic. Health care is perhaps not the opposite, but it's certainly a lot less boredom and a lot more panic.
RW: It does strike me that part of it is the complexity, part of it is the time pressure, and part of it is that, while you're trying to do this complicated thing safely, you're also handling customer interactions in real time. It feels pretty different from the plight of the pilot behind the locked door or the nuclear power plant controller.
EH: It's also the rate at which technological solutions have been adopted in health care; I'm thinking particularly of IT. Developers promise the earth and never deliver, which just makes things even more complicated. If the IT system doesn't work, then it's patched, and the patches are imperfect and inaccurate, which creates even more problems that require even more patches, and so on. Whereas if you look at IT systems in other domains, such as car manufacturing, robotics, air traffic management, or even banking (which in some sense has an environment that's just as chaotic), they have less pressure to innovate all the time than health care does.
RW: Other industries have standard business pressures to innovate.
EH: Business pressures and certain safety concerns, customer satisfaction concerns, and so on. But they're more stable and you can take a longer perspective there. Very few other industries have politicians interfering in the same way that health care has.
RW: In the last 10 years, health care has gone from a paper-based industry to an electronic health record–based industry. What does that do to the overall challenge of improving safety?
EH: It shouldn't necessarily affect safety, but it changes the working conditions for the safety professionals in ways that have not been anticipated, either by the system developers or by the people in management who purchase and install these systems. That means people face even more complexity—and in my view, unnecessary complexity.
RW: When I sit on the Patient Safety Committee and we hear about cases, it strikes me that to change anything now almost always involves a change in the way the electronic health record either works or interacts with humans. Now you have to loop in a part of the bureaucracy that was not involved when you were just dealing with paper. In the old days, the solution to a safety problem was much more about process or communication. Now the electronic health record provides the opportunity to hardwire a fix. But it means you have to get a whole IT department involved, and that adds a level of complexity that we didn't have.
EH: Not only do you need to have a whole IT department involved, but it also affects things in ways that people don't anticipate. One case, which was a fairly big scandal here in Denmark, was that the health care management purchased a very expensive system and thought that, as a result, doctors were going to dictate their records of interviews with patients, and therefore they could save on secretaries. So they started by firing nearly all the secretaries. Of course, the system didn't work as it should, and it didn't improve efficiency. About 2 years later, they had to hire many secretaries back to get things working. They were promised too much by developers who don't really understand how health care is done in practice. The developers sell standardized systems that don't really fit the particular hospital, and that creates problems that people have to run around trying to solve.
RW: Talk about resilience engineering. What do you mean by that term?
EH: The idea of resilience engineering, and of resilient health care, which applies the same principles to health care settings, is that we need to understand whatever happens, regardless of the outcome, because it happens in basically the same way. As one of my colleagues, David Woods, put it, "failures are the flip side of successes." The basic idea is that even when something goes wrong, people were trying to do what they thought was right at the moment, based on how they understood the situation and the information, time, and resources they had available. So we shouldn't look for failures; we should look at how things go well and try to understand why. On that basis, we can better understand the situations where things don't go well.
RW: Can you give an example in health care where you've seen that approach applied and it has led to a real improvement?
EH: We see that applied in many ways. In most cases, we're not looking at things that fail in the sense that there's an accident or an incident or an awful outcome. We're trying to understand and improve how things work in the operating room and on the ward. Some studies in Australia have looked at nurses who have equipment and computers on trolleys that they're supposed to bring everywhere. But they cannot take them into the rooms of patients under sterile conditions, so they had to find workarounds. The important thing is to see how they develop these workarounds, so that you can better manage the system and make changes or improvements that would make it easier for them to do their work. You cannot just insist that they bring the trolley in, because in some cases they simply know they cannot. In fact, sometimes there are contradictory regulations.
RW: So, resilience in health care is looking at the work-as-done and then seeing how people do workarounds and how you learn from those experiences and observations.
EH: Yes, you used the terms that we introduced at the beginning and use quite often: work-as-imagined and work-as-done. They seem to be very useful. How we think work should be done and how it's actually done. You cannot manage something based on what you think is going on; you actually need to know what happens on the floor. In health care in particular that is very important, because of the many patients and the many interests. Many of the people in charge of managing health care from a distance don't have a good understanding of what's going on. Even though they manage with every good intention and try to do things well, you cannot really manage it from a distance.
RW: As you've looked at organizations in health care that are good at this, are there common traits or common strategies? Is it executive walkarounds, where they get out on the floor? Is it embedding an executive on the floor? Is it training the people to look at their work through a new lens, whether it's with Lean or something else? How do you get better at this?
EH: It's not as simple as putting an executive on the floor once every fortnight or so. The important thing is to make people realize that there are things they don't know and things they take for granted. And maybe every now and then, they should consider whether what they take for granted actually fits with what happens. That goes for all levels. The idea behind work-as-imagined and work-as-done is that we all know how our own work is done and we all imagine how other people work, whether top-down or bottom-up. In each case, we need to put some effort into reconciling the two and checking whether our assumptions are appropriate. And for that, you need to exchange information. You need to be able to talk to each other. We can do it by helping people talk about their work, not about what has gone wrong but simply about the work, and share their experiences. We've just completed a pilot study in Sweden [to be presented at the forthcoming Resilient Health Care meeting in Japan in August 2019], which seemed to work very well. And that's something they do in other industries as well. So the ideas of resilient health care are, of course, applicable across all industries.
RW: In the organization you worked with in Sweden or in others like it, have you been successful in making the argument that time needs to be allocated from all the members of the team to have these conversations? I can imagine an organization stressed for resources finds it difficult to allocate this time where there's not an obvious return-on-investment and people are waiting in the emergency room to come upstairs.
EH: Sure. But the point is that the return-on-investment is not immediate; the return is in the longer term. In Sweden, people were allowed to meet briefly every day and talk, and we helped them with a structured way of doing it and a structured way of recording the conversations. There's a return on that, but it's not measurable within the next 2 weeks or the next 2 months. In the longer term it will be there, because people come to understand what they do, they understand their colleagues better, they're able to work together better, and therefore they become more efficient. And I'm sure patients also become healthier because of that.
RW: As you've been looking at health care for many years and other industries for even longer, have you seen trends that are either hopeful or make you pessimistic in terms of how things have changed?
EH: Well, I'm pretty sure that if I were pessimistic, I would have done something else. The trends, unfortunately, are not happy ones: the onslaught of technology and apps and, worst of all, the Internet of Things and developments like that. Not that any of it is bad in itself. What is bad is that people become overenthusiastic, and buy and apply these things without realizing the potential drawbacks, or even all the potential advantages. They're too quick to apply them.
We have a principle called efficiency–thoroughness trade-off or ETTO, which means that people in everyday work situations usually feel that they have to trade off thoroughness in order to be efficient. But it also works the other way around. So it's the thoroughness–efficiency trade-off. The argument is that in order to be efficient later on, you have to be thorough now. You have to spend time now thinking about what you're going to do later, otherwise you won't be able to do it efficiently.
I'm still not pessimistic, but what I'm worried about is that people do not spend enough time thinking about the consequences of the changes and improvements that they propose and promote and introduce. They all do it with the best of intentions, but sometimes they should reflect a little more on their intentions. Regrettably, I don't see an easy way out of that except talking to people and explaining to them that it's important to pay attention to what you do and understand what you do. I mean in the sense of Safety-II—understanding what you do when things go well—rather than in the sense of Safety-I, finding the causes for what went wrong.
What makes me a little optimistic is that people actually listen to that, adopt it, and are willing to try it out in practice. I'm working with a couple of organizations that are trying to do this in practice. The problems are not easily solved because they are not of our making; they lie in the way industrialized societies develop at breakneck speed. It's not enough to say "hold your horses"; we need to explain to people that it pays to be a little patient and think about what you do. My experience, and that's why I'm optimistic, is that people understand that and are willing to do it.