by Xiaoming Zeng, MD, PhD; Rebecca Reynolds, EdD, RHIA; and Marcia Sharp, MBA, RHIA
Health information technology (HIT) is viewed as one of the key elements in streamlining the delivery of healthcare to improve quality and contain costs. It is hoped that HIT will lead to a more cost-efficient healthcare system than the current one. Surprisingly, there is no agreed-upon definition of HIT in the academic literature or government documentation. The Health Information Technology for Economic and Clinical Health (HITECH) Act, a provision of the American Recovery and Reinvestment Act of 2009 (ARRA), defines health information technology as “hardware, software, integrated technologies or related licenses, intellectual property, upgrades, or packaged solutions sold as services that are designed for or support the use by health care entities or patients for the electronic creation, maintenance, access, or exchange of health information.” The term could refer to a broad range of information technologies used in healthcare, from robotic surgery to chronic disease home monitoring devices.1 However, there is a consensus on the purpose of HIT: the use of devices to manage information so that it is available to the right person at the right time and place.2–4 HIT is the basis for more patient-centered and evidence-based medicine through the real-time availability of high-quality information.5,6

Despite the various interpretations of the scope of HIT, all healthcare stakeholders agree that it is the premise on which a 21st-century healthcare system in the United States must be based.7 HIT experts concur that the U.S. healthcare system must widely adopt interoperable electronic health records (EHRs), with important components such as computerized physician/provider order entry (CPOE) and e-prescriptions, to build a cost-efficient healthcare system.