Today, more than 95% of health systems in the United States have an Electronic Medical Record (EMR) system. This is largely due to the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009, which incentivized health systems to move their paper medical records to electronic versions. The act also elevated the role of the Chief Information Officer, who from that point held the keys — and the budget — to massive technology investments.
For the better part of a decade, those large software investments stayed largely within the EMR realm. This was costly: investing in an EMR meant spending millions — or billions — of dollars on a platform that could take years to implement. It also consumed a great deal of attention: keeping patient data private, getting providers up to speed on how to use the platforms, and more.
Today, things are different.
When HIPAA was passed in 1996, mobile devices and search engines like Google essentially didn't exist, and neither did modern software platforms. When EMRs were adopted over a decade later, organizations simply assumed that investing in this "new technology" meant they were "one and done." But now that Moore's Law has firmly taken hold of our technological development, treating the EMR as the sole focus on the CIO's desk — and as the single most important piece of technology within a health system — is a losing battle. The patient journey is no longer linear, digital platforms have proliferated, and it's impossible to think that one monolithic piece of technology can solve all the problems for a health system — let alone drive a better patient experience.
It's time for IT teams to break down the data silos and think outside the EMR box. Here's why.