Because information technology has so quickly transformed people's daily lives, we tend to forget how much things have changed from the not-so-distant past. Today, millions of people around the world regularly shop online; download entire movies, books and other media onto wireless devices; bank at ATMs wherever they choose; and self-book travel while checking themselves in at airports electronically.
But there is one sector of our lives where adoption of information technology has lagged conspicuously: health care.
Some parts of the world are doing better than others in this respect. Researchers from the Commonwealth Fund recently reported that some high-income countries, including the United Kingdom, Australia and New Zealand, have made great strides in the use of electronic medical records among primary-care physicians. Indeed, in those countries, the practice is now nearly universal.
Yet some other high-income countries, such as the United States and Canada, are not keeping up. Usage of electronic medical records in America, the home of Apple and Google, stands at only 69 percent -- and most of those systems have little to do with patient care.
The situation in the United States is particularly glaring, given that health care accounts for a bigger share of GDP than manufacturing, retail, finance or insurance. Moreover, most health IT systems in America today are designed primarily to facilitate efficient billing, rather than efficient care, putting the business interests of hospitals and clinics ahead of the needs of doctors and patients. That is why many Americans can easily go online and check the health of their bank account but cannot check the results of their most recent lab work.
Another difference between IT in U.S. health care and other industries is "interoperability." A hospital's IT system, for instance, often cannot "talk" to others. Even hospitals that are part of the same system sometimes struggle to share patient information.
As a result, today's health IT systems act more like a frequent-flyer card designed to enforce customer loyalty to a particular hospital than like an ATM card that could enable you and your doctor to access your health information whenever and wherever needed. Ordinarily, lack of interoperability is an irritating inconvenience. In a medical emergency, it can impose life-threatening delays in care.
A third way that health IT in America differs from consumer IT is usability. The design of most consumer websites is so obvious that one needs no instructions to use them. Within minutes, a 7-year-old can teach herself to play a complex game on an iPad.
But a newly hired neurosurgeon with 27 years of education may have to read a thick user manual, attend tedious classes and accept periodic tutoring from a "change champion" to master his hospital's IT system. Not surprisingly, despite its theoretical benefits, health IT has few fans among health care providers. In fact, many complain that it slows them down.
Does this mean that health IT is a waste of time and money?
Absolutely not. In 2005, colleagues of ours at the RAND Corp. projected that America could save more than $80 billion a year if health care could replicate the IT-driven productivity gains observed in other industries. The fact that the United States has not gotten there yet is not a problem of vision but of implementation.
Other industries, including banking and retail trade, struggled with IT until they got it right. The gap between what IT promised and what it delivered in the early days was so stark that experts called it the "IT productivity paradox." Once these industries figured out how to make their IT systems more efficient, interoperable and user-friendly, and then realigned their processes to leverage technology's capabilities, productivity soared.
In America, as in much of the world, health care is late to the IT game, and is experiencing these growing pains only now. But health care providers can shorten the transformation by learning from other industries.
The U.S. government is trying to help. In 2009, Congress passed the Health Information Technology for Economic and Clinical Health (HITECH) Act. HITECH has undeniably accelerated IT adoption, yet the problems of usability and interoperability persist.
Globally, the health IT industry should not wait to be forced by government regulators into doing a better job. Developers can boost the pace of adoption by creating more standardized systems that are easier to use, truly interoperable and that afford patients greater access to and control over their personal health data. Health care providers and hospital systems can dramatically boost the impact of health IT by re-engineering traditional practices to take full advantage of its capabilities.
The sky is the limit when it comes to potential gains from health IT. According to the Institute of Medicine, the United States wastes more than $750 billion per year on unnecessary or inefficient health care services, excessive administrative costs, high prices, medical fraud and missed opportunities for prevention. Health IT can improve health care in all of these dimensions.
The payoff will be worth it. Indeed, as with the adoption of IT elsewhere, we may soon wonder how health care could have been delivered any other way.
Art Kellermann holds a chair in policy analysis, and Spencer Jones is an information scientist, at the RAND Corp. Copyright: Project Syndicate, 2013.