One of the core challenges in patient safety is learning about our mistakes and addressing them unblinkingly. Doing so requires that we are aware when we err, and sometimes this is not easy to do. For example, while surgical errors are usually obvious, the vast majority—probably more than 95%—of medication errors are unknown. How is this possible? Patients often have several co-morbidities, and many are taking a dozen or so medications. Most medications interact in complex ways with the patients' co-morbidities, underlying conditions, other treatments, and current illnesses. Patients often become sicker despite our best efforts, and better even if we've made errors. Medications and treatments change over the course of hospitalization. Also, sometimes when physicians order exactly the right medication, dose, schedule, and route for the patient's diagnosis, the diagnosis is wrong.(1,2)
The challenge in patient safety is not only to learn from our mistakes in clinical care (prescribing, surgery, diagnosis), but also in implementing solutions and creating policies that influence the speed and shape of such implementations. In this commentary, I reflect on the interface between patient safety and health information technology (IT)—electronic medical records (EMRs) or electronic health records (EHRs), computerized provider (or physician) order entry (CPOE), the electronic medication administration record (e-MAR), and electronic prescribing (eRx)—and examine how we have failed to face our mistakes in the development, implementation, and regulation of health IT. Until we do, the vast potential for health IT to improve safety will not be reached.
Most of us believed health IT would automatically improve safety. After all, with health IT, all orders look neat and tidy. The dose, schedule, route, time, and patient are clearly specified. There are no legibility issues, no misspellings, no unwanted leading or trailing zeros; lab results are instantly sent to EHRs, medication orders are sped to the pharmacy, and warnings about drug–drug interactions appear immediately. Moreover, because we so want health IT to be that long-promised patient safety panacea, we often overlook problems that the technology exacerbates or obscures.(3)
While many of health IT's promised benefits are real, its problems are also very real. Lab results are often buried in screens full of irrelevant data or require lengthy scrolling to find information that should be contiguous. Drop-down menus offer irrational choices or are missing essential options. Computer decision support alerts are mostly irrelevant and annoying—with override rates as high as 97% in some systems. Order sets often lead to plentiful but less-than-thoughtful clicking. Usability is generally lacking: each system has different ways of presenting information; sign-ons and sign-offs vary by sub-menu within the same system (i.e., "end," "finish," "submit," "quit," "done," and "next" can all do very different things, some with dire consequences for patients); the same icon (say, an image of a pen) can represent seven different actions; and finding a patient's information may require her full name, last name and first initial, name of the attending physician, patient record number, or even the patient's current hospital room number. Physicians ignore the patient in front of them while they are obliged to check innumerable boxes on the screen to fulfill requirements that are promoted to ensure "quality care" but often distract from care quality.(4) Too many health IT systems are clunky, user-unfriendly, unnecessarily idiosyncratic, and workflow-incompatible. Physicians who voice such observations are labeled technophobic, resistant, and uncooperative.(4-6)
The essential question is: why has the promise of health IT—now 40 years old—not been achieved despite the hundreds of billions of dollars the US government and providers have spent on it? While many papers and reports have enumerated those difficulties, I suggest one fundamental but seldom voiced barrier to our reaching health IT's potential is our systematic refusal to acknowledge health IT's problems, and, most important, to learn from them. In the clinical domain, we've made progress with patient safety precisely by learning from our errors. But with health IT, we respond to disconcerting reports with denials and defensiveness; authorities and vendor associations ruthlessly attack researchers or data that do not support the syllogism of health IT equals patient safety, and more health IT equals more patient safety.(7) Paradoxically, we've learned from patient safety that accepting criticism reflects an optimism that we can do better—and we have improved. Rejecting criticism of health IT reflects a pessimism about the technology—that it is so fragile it cannot withstand even constructive suggestions.
On some level, the defensiveness is understandable.(8) Problems of patient safety are so devastating that we have freighted health IT with outsized expectations, and this has created a reluctance to acknowledge its problems. Our current policy is based on the belief that anything that encourages the purchase and use of health IT enhances patient safety and efficiency.
But our eagerness to make health IT ubiquitous has hobbled its utility by preventing us from requiring data format standards and user interfaces that would make the systems work, and work together. Each health IT system (and even some health IT systems within the same hospital unit) frequently has unique ways of displaying and recording information. The formats and interfaces are not compatible with systems across town, or even across the hallway. The promise of interoperability is thwarted by a digital tower of Babel that keeps information siloed and mutually incomprehensible. Vendors, following standard economic logic, balkanize the market to obligate providers to buy suites of products that have the potential of at least communicating within the brand; interoperability would limit each vendor's suite sales. Thus, vendors are acting perfectly rationally by resisting data format standards and interoperability; yet the result is a tragic limitation of health IT's value to any and all.
This belief that health IT, by itself, improves care and reduces costs has not only diminished government responsibility to set data format standards, it has also caused us to set aside concerns of usability, interoperability, patient safety, and data integrity (keeping data accountable and reliable). It is only within the past year or two that federal authorities have begun to tackle these issues—although many vendors still object on the grounds that, for example, usability cannot be examined independent of its implementation, or that usability is subjective and not amenable to definition or measurement, or that usability has already been achieved within their own systems, or that usability has been carefully measured within their own systems and would be hindered by regulation.(9) Thus far, these arguments, bolstered by a political environment that distrusts regulation and central control, have won the day.
There are many additional reasons for our current situation. I enumerate only four:
- Health IT was promoted on the basis of a survey of hospital IT needs that was profoundly (though unintentionally) biased.(10) The survey found that the barriers to health IT adoption were: technophobic physicians, funding to buy health IT, lack of incentives to use health IT, lack of certification programs, lack of evaluation data, and lack of hospital technical expertise. Analysis of the survey's questions and answer options, however, reveals the survey could only generate the findings it produced: it included no questions or answer options about what appear to be the real problems of health IT, i.e., poor usability, lack of data standards, interoperability, data integrity, or even patient safety. While that bias was undoubtedly unintentional, it shaped policy in ways that required medical facilities to purchase health IT now, rather than promoting the creation of data infrastructures that would benefit health IT for years to come. The result is that we now find ourselves installing cumbersome systems insufficiently responsive to patients' and clinicians' needs.
- Very lumpy capital: Health IT costs hundreds of millions of dollars. A full software package from a top firm for a large hospital costs over $180 million, and implementation, training, configuration, cross-covering of staff, and so on can cost five times that figure.(11,12) Because illness, accidents, and pregnancies cannot be scheduled around health IT training and implementation needs, the hospital must continue to operate while its core information systems are developed and installed. This investment of time and money means the hospital is committed for a decade or more. It also reduces incentives for health IT vendors to be responsive to the needs of current customers.(13,14)
- Health IT has been without any meaningful regulation since 1997, when the vendors, accompanied by many clinicians and medical informaticists, convinced the FDA that regulation would dampen creativity and limit an infant industry.(15) In economics this is called regulatory capture. Generous government subsidies and incentives (plus regulations restricting revenue to medical practices and hospitals that do not buy the equipment) further weaken the market position of health care providers when buying health IT. When government incentives and punishments (reduced reimbursements) compel health care providers to purchase health IT within the next 2 years (as is the case under the federal HITECH Act), providers can no longer negotiate purchases using threats of refusal or delay. It's a classic seller's market in which everyone must buy a product, and in which the complexity of the systems, the outsized cost and time of implementation, and the ties of legacy software make purchasing comparisons unwieldy or impossible.
- Collectively, EHRs contain bounties of information that will save lives and money. Because each system's data, however, is incompatible with those of others, our ability to combine and harvest that information is crippled. We've created a digital "locked in" syndrome when we could be swimming in knowledge already collected and coded for our use.
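The "locked in" problem above can be made concrete with a toy sketch. All field names, records, and code mappings here are hypothetical inventions for illustration, not any vendor's actual export format: two systems record the identical medication order so differently that the records cannot be pooled until someone writes a normalizer for each format, which is precisely the cost a shared data standard would eliminate.

```python
# Hypothetical exports of the SAME medication order from two EHR systems.
# Field names, units, and frequency vocabularies differ arbitrarily.
export_a = {"drug": "metoprolol", "dose_mg": 50, "freq": "BID"}
export_b = {"medication_name": "Metoprolol", "dose": "50",
            "dose_unit": "mg", "schedule": "twice daily"}

COMMON_FREQ_A = {"QD": 1, "BID": 2, "TID": 3}
COMMON_FREQ_B = {"once daily": 1, "twice daily": 2, "three times daily": 3}

def normalize_a(rec):
    """Map system A's export onto a common schema."""
    return {"name": rec["drug"].lower(),
            "dose_mg": float(rec["dose_mg"]),
            "frequency_per_day": COMMON_FREQ_A[rec["freq"]]}

def normalize_b(rec):
    """Map system B's export onto the same common schema."""
    dose = float(rec["dose"])
    if rec["dose_unit"] == "g":   # convert grams to milligrams
        dose *= 1000
    return {"name": rec["medication_name"].lower(),
            "dose_mg": dose,
            "frequency_per_day": COMMON_FREQ_B[rec["schedule"]]}

# Only after normalization do the two records become comparable at all.
assert normalize_a(export_a) == normalize_b(export_b)
```

Multiply this hand-written translation effort by every data element, every vendor pair, and every software version, and the scale of the interoperability loss becomes apparent.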
Forty years ago, health IT promised to make health care faster, better, safer, universally available, and more clinician-friendly. Since that time, we've achieved marketing overdrive but only halting user enthusiasm. Health IT still holds extraordinary opportunities if we confront its difficulties and honestly accept data standards, interoperability rules, and a true focus on usability and patient safety. The US faced such problems before: we had dozens of railroad gauges, hundreds of time zones, and even areas with both left- and right-hand driving rules. In all cases, the federal government established standards, and the people, the economy, and especially the resistant industries flourished. Industry claims that such standards would restrict innovation were turned on their heads.
We will have greater success with health IT when we learn from constructive criticism rather than deny or attack it. We must encourage industry to create a digital infrastructure that allows data liquidity. Failing that, we must impose a common format that permits medical research, enhances efficiency, and improves patient safety. Paradoxically, perhaps, not only will such an environment keep patients safer, I believe that it will ultimately promote health IT sales because the systems will be truly useful.
Ross Koppel, PhD
Professor, Sociology Department and School of Medicine
University of Pennsylvania
1. Shojania KG, Burton EC, McDonald KM, Goldman L. Changes in rates of autopsy-detected diagnostic errors over time: a systematic review. JAMA. 2003;289:2849-2856.
2. Newman-Toker DE, Pronovost PJ. Diagnostic errors—the next frontier for patient safety. JAMA. 2009;301:1060-1062.
3. Koppel R. Defending computerized physician order entry from its supporters. Am J Manag Care. 2006;12:369-370.
4. Health IT and Patient Safety: Building Safer Systems for Better Care. Committee on Patient Safety and Health Information Technology, Board on Health Care Services, Institute of Medicine. Washington, DC: National Academies Press; 2011. ISBN: 9780309221122.
5. Cresswell K, Sheikh A. Electronic health record technology [letter]. JAMA. 2012;307:2255.
6. Nahass TA, Nahass RG. Electronic health record technology [letter]. JAMA. 2012;307:2255.
7. McCormick D, Bor DH, Woolhandler S, Himmelstein DU. The effect of physicians' electronic access to tests: a response to Farzad Mostashari. Health Affairs Blog; March 12, 2012.
8. Leviss J. H.I.T. or Miss: Lessons Learned from Health Information Technology Implementations. Chicago, IL: AHIMA Press; 2010. ISBN: 9781584262404.
9. Statement of Commitment to Patient Safety and a Learning Healthcare System. Chicago, IL: Electronic Health Record Association; February 21, 2012.
10. Jha AK, DesRoches CM, Campbell EG, et al. The use of electronic health records in U.S. hospitals. N Engl J Med. 2009;360:1628-1638.
11. Shekelle PG, Morton SC, Keeler EB. Costs and Benefits of Health Information Technology. Evidence Report/Technology Assessment No. 132. (Prepared by the Southern California Evidence-based Practice Center under Contract No. 290-02-0003). Rockville, MD: Agency for Healthcare Research and Quality; April 2006. AHRQ Publication No. 06-E006.
12. Jones SS, Koppel R, Ridgely MS, Palen TE, Wu S, Harrison MI. Guide to Reducing Unintended Consequences of Electronic Health Records. (Prepared by RAND Corporation under Contract No. HHSA290200600017I, Task Order #5). Rockville, MD: Agency for Healthcare Research and Quality; August 2011.
13. Koppel R, Kreda D. Health care information technology vendors' "hold harmless" clause: implications for patients and clinicians. JAMA. 2009;301:1276-1278.
14. Goodman KW, Berner ES, Dent MA, et al. Challenges in ethics, safety, best practices, and oversight regarding HIT vendors, their customers, and patients: a report of an AMIA special task force. J Am Med Inform Assoc. 2011;18:77-81.
15. Miller RA, Gardner RM. Recommendations for responsible monitoring and regulation of clinical software systems. J Am Med Inform Assoc. 1997;4:442-457.