
In Conversation with...David Marx, JD

October 1, 2007 

Editor's Note: An engineer and an attorney by training, David Marx, JD, is president of Outcome Engineering, a risk management firm. After a career in aviation safety assessment and improvement, he has spent the last decade working at the interface between systems engineering, human factors, and the law. In 2001, he wrote a seminal paper describing the concept of just culture, which became a focal point for efforts to reconcile notions of "no blame" and "accountability." He has since formed the "Just Culture Community" to address these issues at health care institutions around the country.

Dr. Robert Wachter, Editor, AHRQ WebM&M: Is "just culture" a concept that you brought into health care, or was it a known concept in other industries?

David Marx: It was known in other industries. The civil aviation authority in New Zealand has been doing this for 10 years. Still, aviation has been on a slightly different path—early on, its entire effort was focused on how to create a safe haven that would allow the reporter to come forward, but it has been relatively slow to embrace holding people accountable on a day-to-day basis. Health care has always recognized the tension between holding the system accountable and holding the individual accountable, so it was ripe to find this middle ground: first, how do we create a safe haven around reporting, and second, once we know the risks, how do we hold people accountable for following best practice? For example, when hand hygiene compliance rates are just 60%, how do we get people to comply? There's a lot of system contribution, but there's also the issue of the individual's choices. Just culture provides some of the answers by controlling the system design and making it safe. But individuals have to engage in safe practices and be accountable for their choices as they proceed through the workday.

RW: For people who aren't familiar with the just culture concept, can you briefly describe the tension between the early thrust in patient safety toward no blame and the belief that sometimes blame is appropriate? How does the just culture concept reconcile or deal with that tension?

DM: Early on, those who wore the safety hats in the patient safety or aviation safety movement thought that people should come forward so that the precursors to errors could be better understood and the system issues fixed. The tension was that individuals ultimately played a role: even before an event occurred, they contributed through the quality of the choices they made. So early on, it was all about getting the report. But ultimately, errors are going to occur. There are two things managers can control: the system they design around their employees, and the behavioral choices those employees make within that system.

In the just culture model, you identify three classes of human fallibility. The first is human error, which is inadvertent—for example, going faster than the speed limit without meaning to. Once in a while, we catch ourselves going faster than we want to. That's human error. The second category is at-risk behavior—taking shortcuts that ultimately lead to increased risk. It might be doing a task from memory, not confirming two patient identifiers, or skipping hand hygiene because of time constraints. People convince themselves that this is the right path to take: can this corner be cut to get the job done? In some cases it is not the right corner to cut, so it's called at-risk behavior. The third is reckless behavior, where someone chooses to put others in harm's way. It's the notion of the drunk driver who thinks he is not choosing to harm the people in the minivan next to him, but who is putting them at risk by being intoxicated on the road. While recklessness in health care is rare, it does occur. But is a punitive response appropriate for reckless behavior? Our answer is yes.

If you go into any hospital and ask which of these three classes of human fallibility is the biggest contributor to harm or potential harm in the clinical area, staff will not say it is reckless behavior. Although the IOM Report implied that error was the major contributor to harm, nurse managers and charge nurses will say that at-risk behavior is the biggest contributor. It's nurses who think they don't need two patient identifiers because they know who their patient is. And at-risk behavior is not only about the system—it's also about our propensity as humans to drift. We have to help individuals through good coaching and mentoring, and also through system design. But ultimately, there must be an accountability system that does not allow someone to stay in the system if they choose to put patients in an unsafe place.

RW: So, these are subtle distinctions. Let's take a couple of examples of what might be at-risk behavior and what might be reckless behavior. For instance, suppose I don't wash my hands before seeing a patient because I'm too busy, and then I keep failing to do it, even though I've read the literature and know it's putting my patients at risk. Or suppose I, as a surgeon, don't believe in doing a time out because I've never had a wrong site surgery, and then something bad happens. Or a nurse skips the two patient identifiers once or twice because he or she is too busy, but then keeps omitting them despite counseling. How do we draw these lines?

DM: That's a great question, because they are subtle distinctions. The easy way to make them not subtle is to make it clear that if someone knowingly violates a rule, there will be repercussions. But the reality is that some policy violations are going to be the right thing to do. There has to be an exception that says the rules were not meant for every circumstance; you have to support noncompliance in some cases. Health care is more complex than any industry I've worked in, from nuclear power to shuttle systems to aviation. In this complexity, there are overlapping demands, and choices have to be made. So there's that nurse or pharmacist or physician making choices they perceive to be in the best interest of the patient. Some of those choices are going to be knowing violations of policy, but they're going to perceive that this is the right path to take.

Let's go back to hand hygiene. If a physician walks into the OR and chooses not to scrub in, or breaks the sterile field, everybody would universally say that's reckless behavior. You just don't tolerate that. But the physician who is going patient to patient doing rounds after surgery probably has a compliance rate close to 50% when it comes to CDC hand hygiene standards. That noncompliance is accepted because the culture has accepted it. The physician's privileges won't be pulled. The nurse won't even be coached around the behavior. And all of a sudden you have this embedded norm that, to people on the outside, looks like reckless behavior.

The same thing is happening with time outs. Initially, we said time outs are really important, everyone has to comply, and not to do so would be considered reckless behavior. What has happened, I believe, is that we're seeing some drift: compliance with time outs is going down. The issue is, does the organization take a stand and say that this is an unacceptable and reckless breach, or does the organization begin to tolerate it, which ultimately makes it at-risk behavior because a blind eye is turned to noncompliance? It's not easy. What we say is that you cannot simply proclaim what reckless behavior is. Organizations must educate the professionals in the workforce to understand the risks, and ultimately it is the local culture that deems an activity reckless. You see the same thing on the road, where speeding turns into reckless driving, and societies to some extent define these boundaries.

RW: You seem to be implying that an individual hospital could decide whether the failure to do a time out will be deemed reckless. Yet there's a role for larger units, whether it's the state or the federal government or the Joint Commission. Just on that example, where do you draw the line?

DM: I think a hospital—or even a state or a collection of hospitals—could say that not doing a time out is in fact a reckless choice. The issue is that they then have to follow up; they cannot just declare it reckless and turn a blind eye. The hospital needs to come back and stand behind that position, because it cannot tolerate a reckless group or a reckless individual working on patients. So I think it can be done, but the hospital has to stand behind it, because the employees will call its bluff. If the hospital comes out with a letter saying that this is really important but then doesn't take action, practice will drift right back to at-risk behavior.

RW: In terms of enforcing these rules against reckless behaviors, when do you draw a line and say there will be action? How have you seen that play out in terms of uniform versus disparate enforcement for doctors versus nurses?

DM: On the doctor versus nurse issue—we have a gap analysis tool that looks at staff perceptions. One of its scenarios is hand hygiene noncompliance; in one case it's a nurse, and in the other it's a doctor. And we find radically different views within the hospital about what accountability looks like. In health care, there's a very strong physician versus nurse bias, with a different sense of accountability for each, and a very strong severity bias—a feeling that if there has been no harm, then no action will be taken against the doctor. In contrast, in our just culture model, if a person is reckless, the response should be the same whether they caused harm or not, because we're holding people accountable for the risks they choose to take, not the outcome. That is the substantial shift in culture that we're talking about. You'll even hear from some of the Minnesota hospitals that the culture has really been "no harm, no foul." If a patient has been harmed, we look for the last person to touch the patient and go after that person. If a person caused no harm, there may be a tendency to turn a blind eye. But in the just culture model, reckless behavior is reckless behavior whether it caused harm or not. And that's the big shift an organization goes through: to judge people not on how bad the outcome was and who was to blame, but on the quality of their choices.

RW: That, of course, is incredibly important because many of the harms are relatively rare, and you can do a lot of surgeries without a time out and not do a wrong site surgery.

DM: You'll meet a surgeon who will argue that he or she has never had an event. But a person's choices dictate the likelihood of a future event, and it's those choices that we're going to hold people accountable for. Nobody is suggesting that the surgeon is accountable for the harm to the patient—that's an organizational accountability. But the surgeon is accountable for the quality of the choices he or she makes, irrespective of any future event.

RW: We began in patient safety with this no blame concept. Then, over the first few years, we began to recognize that it left out the critical notion of accountability. It seems to me that we're now flipping strongly toward accountability and that, as with most complex issues, the pendulum tends to swing back and forth until it settles somewhere in the middle. Where do you think the pendulum will settle?

DM: For a long time, the culture was perceived as overly punitive. And you're right—it did swing to this notion of blame free. I think we're seeing from health care regulators and hospitals a pretty rapid movement to find that center ground, because we know where some of the risks are. It's about really dealing with the system issues, but individuals have to be accountable for their choices. That being said, states that have gone down this path always worry that if they have a big event, like a wrong site surgery, the tendency of the media, prosecutors, and perhaps even governmental leaders will be to find somebody to blame. The issue is, how are we going to resist that? The good news is that the Joint Commission is supportive of this path around just culture. AHRQ is supportive. State departments of health and credentialing boards are supportive. So we're gaining enough momentum that it hopefully won't be derailed by a big event where the press and parts of society are clamoring to, as James Reason says, put a carcass up on the wall to show that we've done something. It's tenuous, though, because we really need more and more people in the system supporting just culture so that it doesn't get derailed by the aftereffects of a big event.

RW: Could you comment on the malpractice system and how its way of judging errors and bad outcomes plays into efforts to create a just culture?

DM: Consider sports, a high-risk system. If you make a mistake and break the ankle of the second baseman sliding into second base, the person who is harmed has to deal with it. It's the cost of participating in the high-risk endeavor of sport, and it's a reason most of us carry first-party insurance to protect us from unanticipated injury. But if the player is reckless, the sport says you have a cause of action against that person—that is, you should pay for my broken ankle because you were reckless. Health care is analogous to sport in the sense that it is filled with fallible human beings and is a very high-risk endeavor. Physicians are going to make mistakes; they are fallible human beings. But that hasn't changed the tort system, which says that if you are harmed by the medical system, you sue. I think the underlying logic is flawed. These are high-risk industries and, to a certain extent, there's an assumption of risk: when I come into that system, I know there's a risk of a mistake. Workers' compensation systems are another example—you don't sue your employer, but there is a system to provide a remedy if you're hurt on the job. Some physician leaders, in particular, believe that the just culture model will ultimately lead to tort reform. I think we have to accept that health care institutions are going to produce bad outcomes—they're going to be fallible—and we need a social insurance system that doesn't rely on having to sue your physician. In New Zealand today, you don't sue your physician; if there's a medical error, you go to the Accident Compensation Scheme. But if you believe the doctor was reckless, you go to the regulator and say, there's a reckless doctor, deal with him. If the doctor was merely fallible, that doesn't lead to a lawsuit.

This project was funded under contract number 75Q80119C00004 from the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services. The authors are solely responsible for this report's contents, findings, and conclusions, which do not necessarily represent the views of AHRQ. Readers should not interpret any statement in this report as an official position of AHRQ or of the U.S. Department of Health and Human Services. None of the authors has any affiliation or financial involvement that conflicts with the material presented in this report.