In Conversation with…Pat Croskerry, MD, PhD

June 1, 2010 

Editor's note: Pat Croskerry, MD, PhD, is a professor of emergency medicine at Dalhousie University in Halifax, Nova Scotia, Canada. Trained as an experimental psychologist, Dr. Croskerry went on to become an emergency medicine physician and was surprised by the relatively scant attention given to cognitive errors. He has since become one of the world's foremost experts on safety in emergency medicine and on diagnostic error. We spoke to him about both.

Dr. Robert Wachter, Editor, AHRQ WebM&M: What got you interested in patient safety?

Dr. Pat Croskerry: The simple answer is that I really wasn't aware of the issue until I became the head of an emergency department. It says something about the covert nature of error in medicine that I really wasn't aware of what was going on in the department until suddenly everything started coming across my desk. What struck me was that I never ran into a case where a mistake was malevolent or egregious in any way. The errors always involved well-intentioned efforts by hard-working people, but these people were working with imperfect systems and flawed cognition.

Thinking critically and clearly, especially in an environment like emergency medicine, is not an easy thing to do, even at the best of times. An emergency department—which some people have described as a natural laboratory for error—is a chaotic environment. Once I became aware of things that were going wrong, I began to look outside of medicine and to other industries, like the airline industry and to the people who had been talking about human error, for answers.

RW: You ended up focusing on cognitive errors and diagnostic errors. What about you and your background caused that to happen?

PC: Before I went into medicine, I trained as a psychologist. Not a cognitive psychologist, but an experimental psychologist. I started seeing these repeated thinking errors that very hard-working people were making. With the help of the cognitive psychology literature, I was able to put together various explanations for how people actually got into trouble with their thinking. I want to emphasize that I don't think anybody was being casual or sloppy in their thinking. It's just that they were vulnerable to biases and distortions of their reasoning, especially in emergency medicine, which is a fertile ground for things going wrong. In the end, I decided to spend my time thinking about how doctors think.

RW: You mentioned the complexity and pace of the emergency department. Are there other attributes of emergency departments or the practice of emergency medicine that make thinking about patient safety different?

PC: I think so. The primary problem is the level of uncertainty. If you look at diagnostic error, for example, it's highest in the disciplines with the most uncertainty, which are emergency medicine, family practice, and internal medicine. By the time you get to an orthopedics clinic or a plastics clinic, a lot of the uncertainty has been removed. In the emergency department, you usually don't know the patient, you don't necessarily have access to their whole history, and for all intents and purposes, the patients are strangers—it's quite a challenge. And on top of that, we're interrupted and distracted on a regular basis. We often cannot predict what the flow will be like; there are lots of surges—suddenly you have six ambulances at the door—and, I'm afraid, very few ebbs. I think about when I go into a bank and stand in an orderly line and each teller has one person to deal with. Bank tellers wouldn't dream of trying to deal with eight or ten customers at one time. But that is what we do in the emergency department. It's like spinning plates: you keep up to about a dozen patients going at one time.

Psychologists tell us that when your attention is interrupted, you have to refocus on something else and then find your way back to where you were before, and that's very costly in terms of cognitive effort. Add to that the fact that most emergency departments operate around the clock, which brings the complications of fatigue and sleep deprivation. It's now fairly clear that in the last 3 or 4 hours of the night shift, the emergency physician is probably functioning at about 70% of his or her capability. So when you add all those things up, you realize that expecting high-quality decision-making is somewhat unrealistic.

RW: One of the attributes of emergency medicine is how complex and undifferentiated patients are. So take me through the brain of Pat Croskerry when a patient with chest pain comes to see you, versus a physician who hasn't thought about the cognitive aspects of decision-making and diagnostic errors. What's going through your brain that's not going through that person's brain?

PC: I have the benefit of having analyzed a number of cases that went wrong, so I'm aware of the cognitive biases. They're not just thinking errors; they're also affective errors—errors that arise when physicians' or nurses' feelings get involved in the decision-making process. And you can watch them happen: you can just stand back sometimes and admire the cognitive choreography in the emergency department.

RW: I love that.

PC: Just the way that people get set up for errors. For example, a nurse or a colleague comes to you and says, "Oh, so-and-so is here. She's always here. It never amounts to anything; she's just a frequent flyer." For me, those are red flags. If anybody offers me a diagnostic opinion without a thorough history and examination of the patient, I immediately discount it in my thinking. I try to follow some of the recommendations in the psychology literature about how to avoid these cognitive traps. Most of us were trained on prototypes: for example, this is what chest pain looks like. But in fact, typical presentations of chest pain are in the minority; the majority of patients who come in with an acute coronary syndrome won't be typical. If you start from the position that you're looking for aberrant presentations, and if you're aware of patients labeling themselves, or of colleagues, triage nurses, and even paramedics labeling patients, then I think you've got your guard up. That's the difference. I make mistakes just like the next guy, but hopefully I'm making fewer because I see them coming. I don't want to sound superior, though; I'm very humble about the whole setting and one's vulnerability in it.

RW: I was interested in the notion of affective errors. So when you're having a bad day or you're angry with a patient or you're overwhelmed, how do you defend against that? Does it help that you're aware of the possible holes that you may fall into, or are there more specific strategies that you undertake to prevent errors from flowing from those different affects?

PC: I think awareness is number one. Physicians tend to think of themselves as cold, objective decision-makers, and we know that isn't so. Take, for example, a borderline patient or a patient who's being obstructive: they create a negative atmosphere. The psychology literature tells us that hot emotions—emotions experienced when you're in a state of visceral arousal—are dangerous. If I find myself becoming emotionally polarized toward the patient, there are certain strategies I follow to try to defuse the situation. One thing you can do is just take a time out: excuse yourself and say, "I just have to attend to something else and I'll be right back." Then take a moment of reflection, identify your emotional arousal, and try to get past it. The critical thing for me is to provide the best care and not allow my emotions to intrude. Again, the psychology literature says that, not just in medicine but throughout your life, no decision is made without some emotional polarization in it. If the patient is arousing negative emotions in you, your decisions won't be as good as they would be otherwise.

By the same token, but to a lesser extent, you can get into similar trouble when your emotions are positively aroused. Some work has been done on this, but if you feel very positively toward the patient, sometimes there is a covert avoidance of finding the stuff that you don't want to know about. What I'm suggesting is that physicians would do well to develop skills in emotional intelligence.

RW: You've talked mostly about things that are going on between your ears that might help prevent certain errors in charged situations with a lot of uncertainty. Systematic solutions have also been proposed, such as computerized decision support or others. What's your sense of the utility of those kinds of approaches?

PC: Well, my starting point is that we will take whatever help we can get. A number of initiatives have been proposed that help us in our decision-making. Decision-making, which arguably is the most important skill a physician has, breaks down into two types of reasoning. We reason intuitively—the fast, reflexive, shoot-from-the-hip stuff that all of us do, and that you do more of as you get older. That's in contrast to a slower, analytical, deductive method that's much more precise and often yields fewer errors. Given that we spend most of our time in that fast intuitive mode, in emergency medicine at least, the question becomes: how do you make people function better in the intuitive mode? I was delighted to see a paper by Gordon Schiff and David Bates published in the New England Journal of Medicine about improving electronic documentation to avoid diagnostic error. They listed a dozen or so features that might help people improve their performance; interestingly, they match up very well with what the literature says. There is an excellent book by [Robin M.] Hogarth called Educating Intuition, and he makes exactly these points. It's been shown very clearly that better environments make for better decisions. Improving the feedback that people receive, building systems that prompt and remind you, using checklists, and so on: there is a variety of strategies aimed at improving one's performance in that intuitive mode. So for my money, that approach needs encouraging.

RW: What will the practice of emergency medicine look like in 5 or 10 years, particularly vis-à-vis computer systems and decision support?

PC: Well, it's very clear that computerized decision support is a good idea, but it hasn't had a very good track record. Work on this started about 40 years ago. Part of the problem is the overconfidence of physicians, who think that they can outperform computers when, a lot of the time, they probably cannot. People are really challenged when they have to make more and more decisions. But suppose you were distracted or didn't have time to take everything into consideration, and a computer interface notices something you missed and says, "You are about to discharge a patient with an elevated heart rate." Those little prompts are often enough to jolt you out of that intuitive mode, take a moment of reflection, and perhaps make a better decision. The medication information in computerized order entry systems is excellent. If you plug in the wrong dose, or the dose is too high for somebody in renal failure, it lets you know right away, and that's the cognitive support we need. The more you can provide software that functions more reliably than your own brain, the better position you're in.
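The kind of point-of-discharge prompt Dr. Croskerry describes can be made concrete with a simple rule check. The sketch below is a minimal, hypothetical illustration, not any real order-entry system's interface; the patient fields, thresholds, and function names are assumptions chosen for the example.

```python
# A minimal sketch of a rule-based discharge prompt, as described above.
# All names and thresholds here are hypothetical illustrations, not any
# real CPOE or decision-support product's API.
from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    heart_rate: int      # most recent reading, beats per minute
    creatinine: float    # mg/dL, a crude proxy for renal function

def discharge_warnings(patient: Patient) -> list:
    """Return prompts that should interrupt a discharge order."""
    warnings = []
    if patient.heart_rate > 100:  # hypothetical tachycardia threshold
        warnings.append(
            "You are about to discharge a patient with an elevated "
            f"heart rate ({patient.heart_rate} bpm)."
        )
    if patient.creatinine > 2.0:  # hypothetical renal-impairment flag
        warnings.append(
            "Renal function is impaired; re-check doses of renally "
            "cleared medications before discharge."
        )
    return warnings

# Usage: surface the prompts at the moment the discharge order is signed.
for message in discharge_warnings(Patient("J. Doe", heart_rate=112, creatinine=1.2)):
    print(message)
```

The design point is the one made in the interview: the check does not replace the physician's judgment, it interrupts the fast intuitive mode at exactly the moment a slower, reflective look is most valuable.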

And it's not cheating. Some physicians think that algorithms and clinical decision rules decrease their autonomy, or that using them is a way of escaping some of your responsibility. It isn't. The literature shows very clearly that decision rules and algorithms will match or outperform the physician about 90% of the time. Yet the uptake of clinical decision rules is abysmally low. People don't like the interference with their autonomy, the patient in front of them is always special, and so on. But at the end of the day, the clinical decision rule will outperform you, so why not use it? Why not relieve some of the cognitive load? Say I've got a patient with a suspected TIA [transient ischemic attack]: if I can default into an algorithm that says the best management of TIA is this, this, and this, then that's where I go, because I know that those clinical decision rules have been developed by well-rested, well-fed people in the cold light of day who've looked at huge populations of patients. And they will outperform my decision-making, especially in the environment in which I'm working.
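Dr. Croskerry doesn't name a specific rule here, but a well-known example of the kind of clinical decision rule he describes for suspected TIA is the ABCD2 score for early stroke risk. The sketch below encodes its published point assignments; the function and parameter names are illustrative, not a standard library.

```python
# A sketch of the ABCD2 clinical decision rule, which stratifies early
# stroke risk after a suspected TIA. Point values follow the published
# score; the function and parameter names are illustrative.
def abcd2_score(age: int, systolic: int, diastolic: int,
                unilateral_weakness: bool, speech_disturbance: bool,
                duration_minutes: int, diabetes: bool) -> int:
    score = 0
    if age >= 60:                              # Age
        score += 1
    if systolic >= 140 or diastolic >= 90:     # Blood pressure
        score += 1
    if unilateral_weakness:                    # Clinical features
        score += 2
    elif speech_disturbance:                   # speech impairment without weakness
        score += 1
    if duration_minutes >= 60:                 # Duration of symptoms
        score += 2
    elif duration_minutes >= 10:
        score += 1
    if diabetes:                               # Diabetes
        score += 1
    return score  # 0-3 low, 4-5 moderate, 6-7 high early stroke risk

# Example: a 72-year-old with BP 150/85, isolated speech disturbance
# lasting 45 minutes, and diabetes scores 5 (moderate risk).
print(abcd2_score(72, 150, 85, False, True, 45, True))
```

Encoding the rule this way is precisely what lets it relieve cognitive load: the well-rested, well-fed reasoning is done once, in advance, and applied consistently at 4:00 in the morning.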

RW: Can you give us a couple of examples of things that you built into your emergency departments that reflect your interest in safety and that you're particularly proud of?

PC: The major thing that we did was to change the nature of our M&M rounds. When I first inherited the department, we would have people presenting cases on their diagnostic triumphs or on some interesting esoteric case, but we weren't looking critically at what we were doing. So we turned our M&M rounds around, focusing on cognitive errors, affective errors, biases, distortions of reasoning, and so on. It's been helped by the patient safety movement, of course. But there is now an openness and an honesty in the way that people review their cases. That was one of our major gains.

The other one was that we put a concerted effort into improving feedback. When a system operates without feedback, as emergency departments often do, complex patients just disappear into the ICU or into the morgue and you haven't really learned anything. We implemented a number of strategies that have significantly improved our feedback.

RW: In both of those circumstances, part of the theme is getting your colleagues and yourself to be comfortable learning about and hearing about your failures unblinkingly. How did you get the culture to accept that?

PC: It wasn't easy. We were inheriting a very long tradition of secrecy in medicine. Diagnostic acumen, for example, is the one thing that physicians hold very dear; it's the most important thing to them. To actually stand up and say "I got this wrong" takes a bit of guts. That works best if you can get senior physician leaders to stand up, admit that they've made mistakes, and show by example that it's okay to say you're not perfect. I certainly did that, and I didn't suffer by it. The department generally became more honest and more willing to discuss our shortcomings. At the same time, you remind people that we have to deal with a level of uncertainty that will never allow us to become perfect decision-makers; there is always a huge residue of uncertainty that we must learn to live with. When nurses ask me the discharge diagnosis on a patient and I say it's "chest pain not yet diagnosed" or "abdominal pain not yet diagnosed," I haven't made the diagnosis; I didn't reach that final point. But I've left it open so that the people who come after me won't inherit a wrong diagnosis.

RW: That takes a lot of courage.

PC: You must have a bit of a thick skin to start that. But once you do it, other people will follow, until eventually it becomes the departmental standard. I go to M&M rounds in other departments and see how they conduct them; some of them are still suffering from that secrecy and covering things up. Especially with the younger people, you can work with them to bring it out: say what you were thinking; say what you think you were doing wrong. If anything, physicians have a tendency to be overcritical of themselves. Then you can say, look, if you understand the error process and the biases and the obstacles put in your way, you can feel better about some of the bad decisions you've made.

RW: Any other recommendations you would make to emergency medicine physicians or people managing emergency departments?

PC: When I came into emergency medicine, nobody sat me down and said, this is what your life is going to be like. I think that kind of frankness would be good in any area of medicine. Being frank with people, letting them know what's coming and the sorts of failures they're going to experience, just makes things more realistic. In my career, I've had a number of close colleagues leave emergency medicine because they couldn't live with the consequences of what they perceived to be a mistake on their part. So my advice to anybody coming in would be to talk to some of the older physicians and ask them what it's going to be like. Hopefully they will get a realistic appraisal, and they won't see emergency medicine as a place where they must demonstrate perfection.

RW: Anything else that you want to talk about that we didn't cover?

PC: I do think that the whole business of shift work is devastating to people, and we have to find better ways of scheduling people in emergency departments. We've developed a system here called casino shifts, where we actually change over at 3:00 in the morning. That sounds counterintuitive, but if you set it up right, you will actually improve people's longevity in the discipline. The number one reason physicians give for leaving emergency medicine is the shift work. It's extremely difficult to make good decisions in the last 3 or 4 hours of the night shift, and we need to do more work on sorting that out. There are ways of identifying people with different chronotypes: you have morning people and evening people, and as you get older you tend to become more of a morning person. So you can set up individual shift scheduling that accommodates both age and physiological chronotype. I think that's extremely important. Shift work is a necessary evil, but it's an evil that we can dilute to some extent, and get more out of people and make them happier.

This project was funded under contract number 75Q80119C00004 from the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services. The authors are solely responsible for this report's contents, findings, and conclusions, which do not necessarily represent the views of AHRQ. Readers should not interpret any statement in this report as an official position of AHRQ or of the U.S. Department of Health and Human Services. None of the authors has any affiliation or financial involvement that conflicts with the material presented in this report.