
The Context Is the Intervention

Dr. John Øvretveit | October 1, 2011 

Perspective

Introduction

What we say, do, and feel are facts. We live and work in groups, in a society, and are influenced by this context. We do not do what we are told to do, for many different reasons. These observations lie at the center of the social sciences and have implications for patient safety—understanding how and why adverse events occur, and how to change behavior and social systems to prevent them. The theories and methods used by these sciences are not well understood in medical research and health care—indeed, their status as sciences is questioned by some. This short piece aims to give a flavor of what they can contribute.

Consider rapid response teams (RRTs). The challenge is not to find the resources to staff the team or to organize work schedules so that the team can respond to a call. The challenge is getting and sustaining the calls: persuading people and their leaders that they can go "outside the tribe" when patients' needs require an urgent response that they cannot give at the time. Faster improvement may come from good training in recognizing clearly detectable signs of patient deterioration, which can then be dealt with without activating an RRT. Social sciences have illuminated these aspects of the RRT, which were not revealed by traditional experimental medical research methods.(1)

Other ways of seeing

Social sciences bring to patient safety a way of seeing the influences on what we do and think of which we are not aware—influences outside of our bodies, but equally, if not more, powerful at times. In addition to theoretical perspectives, the social sciences also bring methods for systematically collecting and analyzing data about people's perceptions, feelings, and behavior—the facts with which those trying to bring about change have to deal.

There are examples of the contribution these theories and methods have made to understanding the incidence and severity of adverse events and patients' experience, as well as the effect on caregivers.(2) Taking responsibility for an unsafe act causes guilt and shame, and psychological mechanisms, often unconscious, come into play to protect us from painful feelings.(3) Causing injury, however inadvertently, may be more difficult for health professionals to accept than for staff in other industries, where reducing suffering is not the aim or the primary source of meaning for employees. Many of us entered our profession or organizations with a wish to help others and relieve needless suffering. We often continue the stressful work in difficult circumstances, and for less pay than we could get elsewhere, because we believe we are helping people. Learning that we are harming patients is not only demotivating but a fundamental threat to identity and self-image, causing pain and disorientation.

Modern approaches to understanding the system causes of error tend to reduce individual responsibility for causing error, which might decrease the psychological toll on providers but can also suggest that it is someone else's task to deal with the issues. This can be problematic when it is unclear whose responsibility it is to address complex system issues. Multi-factor or multi-level frameworks are used in research and in workplace analysis of accidents. However, the cultural, psychological, and social factors in these frameworks are poorly understood and appreciated in health care, and the social sciences have much to contribute in elucidating how these factors contribute to different adverse events in different settings.

Where all are above average

Psychodynamic and cognitive psychology both provide useful ways of understanding selective perceptions in patient safety events. Empirical studies show self-bias when individuals are deeply engaged in an activity, when they feel responsible for the outcome, and when they are visible in their activity.(4-6) Self-serving bias can operate in contradictory ways. First, individuals have been found to overestimate their contribution to a positive outcome (e.g., good patient recovery is due to our excellent care, poor recovery is due to something else ["attribution error"]). Research finds that most people think they are above average in driving ability, even after accidents (7), and in performance at work.(8,9) Second, individuals also overestimate their contributions to negative outcomes: health personnel untrained in patient safety theories often assume avoidable patient injuries are due to their own or others' actions, rather than to a combination of complex factors (termed "merit" or "just deserts" attribution).

Psychological processes are influenced by the immediate work and peer group, as well as by the wider culture of the organization and of the society around a health facility. Studies are showing how social context can make adverse events more likely and how it shapes the implementation of safety solutions. Researchers have discovered how different aspects of the context of patient care affect behavior and thinking related to patient safety.(10) The aspects of context first to be studied were those that are visible and more easily measured, such as physical layout, equipment design, and task structure.(11) Apart from studies of safety culture, there have been fewer studies of the contribution of the social context to patient safety, and less understanding of its relative importance, in part because of the measurement and research challenges.

The power of social situations over individual actions is perhaps most clearly demonstrated in two landmark simulation studies, both of which yielded important insights but appear ethically questionable by today's standards. Haney and colleagues (12) set out to investigate how personality was related to behavior in a prison simulation study. After six days the experiment was stopped early: the "prison guards" were committing sadistic acts and putting the lives of the "prisoners" in danger. The findings were that the interpretation of roles, the social situation, and a legitimating ideology led to obedience in the "prisoners" and brought out behaviors in the "guards" that they later regretted. The second was the famous Milgram (13) "obedience to authority" experiment. Ordinary people, placed in the role of a "teacher," were instructed by an experimenter to administer electric shocks to a "victim" (an actor) in an adjacent room. The overwhelming majority continued to "follow orders" and increase the intensity of the shocks despite the increasing screams of agony of the "victim." These two studies show that even conscientious and ethical individuals are profoundly influenced by others and by social expectations, sometimes without being aware of this influence. Many similar studies have confirmed this and other predispositions and influences, such as groupthink and cognitive biases.

One of the first studies in health care to investigate the interaction between individual psychological factors and organizational arrangements was a study of nursing in a London hospital. It showed that organizational routines that distanced nurses from the everyday pain of their patients served an important function in defending staff against anxiety, but also dehumanized care to some extent.(14,15)

Professionals' training and socialization instill a high sense of individual responsibility and belief in competence. Work cultures also often support individual pride and a sense of self-worth, both of which are related to resourcefulness and the ability to make quick decisions. Tucker and Edmondson (16) found that failures such as missing information, missing people or supplies, or defective equipment lead to the common practice of using quick fixes or work-arounds rather than reporting the failures, and that these practices themselves increase risks for patients.

From "intervention implementation" to "assisted inno-volution" for spreading effective safety solutions

Social sciences tend to be better at diagnosis and critical analysis than at solutions. In part this is because of a recognition that generalizations are more difficult and that recipe-like prescriptions for change will not have the predictable impact of surgery or drugs: what gets decided and how change is carried through depends. "Depends on what?" you ask. This is the central question of some social science safety research.

Current thinking is that, at a broad level of generality, there are common factors that are important for most types of safety changes: leadership, resources, belief in the solution, and measurement. But the specifics (for example, which leaders and which actions) are likely to depend on the type of change, the setting, and the wider context. The next frontier may lie in grouping safety changes in terms of the aspects of context that are most helpful for successful implementation and improved outcomes. Much of this investigation will require qualitative and mixed-method approaches, informed by social science theories. Some safety changes, especially those that are more like concepts than specific prescriptions, are better understood as evolving innovations: the idea is translated to a particular setting, and the change agents regularly assess progress using independent data and revise the change accordingly. For example, elements of the checklist in the central line infection prevention program in Michigan were fixed, but some could be modified to suit the local ICU. Some research is moving into the arena occupied by quality improvement project teams and provides change agents with data for their continual adaptation. These collaborative research approaches are able to study the "inno-volution" (17) as it unfolds in interaction with its environment.

Conclusion

"The context is the intervention" is a way of jolting the reader into seeing beyond traditional medical and research categories. It helps us appreciate the boundaries between the change and the surrounding influences as permeable, and that a change needs fertile preconditions and an environment for success. "All models are wrong, but some are useful," applies also to scientific perspectives and research methods. Adverse events and changes to improve safety can be understood using psychological, social, economic, cultural, and political perspectives, and using methods that have proven their worth in these and other social science disciplines has real value in patient safety. We cannot afford not to use these approaches to speed the discovery and use of safety solutions.

Dr. John Øvretveit
Director of Research, Professor of Health Improvement, Implementation and Evaluation
Medical Management Centre, The Karolinska Institutet, Stockholm

References

 

1. Øvretveit J, Suffoletto J. Improving rapid response systems: progress, issues, and future directions. Jt Comm J Qual Patient Saf. 2007;33:512-519.

2. Scott SD, Hirschinger LE, Cox KR, et al. Caring for our own: deploying a systemwide second victim rapid response team. Jt Comm J Qual Patient Saf. 2010;36:233-240.

3. Hyde P, Thomas AB. Organisational defences revisited: systems and contexts. J Managerial Psychol. 2002;17:408-421.

4. Weary G. Self-serving biases in the attribution process: a reexamination of the fact or fiction question. J Pers Soc Psychol. 1978;36:56-71.

5. Weary G. Examination of affect and egotism as mediators of bias in causal attributions. J Pers Soc Psychol. 1980;38:348-357.

6. Weary G, Harvey J, Schwieger P, Olson CT, Perloff R, Pritchard S. Self-presentation and the moderation of self-serving attributional biases. Soc Cogn. 1982;1:140-159.

7. Guerin B. What do people think about the risks of driving? Implications for traffic safety interventions. J Appl Soc Psychol. 1994;24:994-1021.

8. Brenner SN, Molander EA. Is the ethics of business changing? Harv Bus Rev. 1977;55:57-71.

9. Headey B, Wearing A. The sense of relative superiority: central to well-being. Soc Indic Res. 1988;20:497-516.

10. Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998;316:1154-1157.

11. Gosbee J. Human factors engineering and patient safety. Qual Saf Health Care. 2002;11:352-354.

12. Haney C, Banks W, Zimbardo P. Interpersonal dynamics in a simulated prison. Int J Criminol Penol. 1973;1:69-97.

13. Milgram S. Behavioral study of obedience. J Abnorm Soc Psychol. 1963;67:371-378.

14. Lyth IM. Containing Anxiety in Institutions. Portland, OR: Free Association Books; 1988. ISBN: 9781853430015.

15. McDonald R. Everything you wanted to know about anxiety but were afraid to ask. J Health Serv Res Policy. 2008;13:249-250.

16. Tucker A, Edmondson A. Why hospitals don't learn from failures: organizational and psychological dynamics that inhibit system change. Calif Manage Rev. 2003;45:55-72.

17. Øvretveit JC, Shekelle PG, Dy SM, et al. How does context affect interventions to improve patient safety? An assessment of evidence from studies of five patient safety practices and proposals for research. BMJ Qual Saf. 2011;20:604-610.
