A few years into the widespread adoption of electronic health records (EHRs), health data is starting to mount. But the data is often incomplete, and access is limited. If system leaders use what’s readily available, they’ll likely be comparing bits and pieces, apples and oranges.
In its current state, “meaningful” isn’t the most accurate word to describe that data.
Recognizing that the time has come to harness big data to make changes that will ultimately improve patient care, the National Quality Forum released a white paper this summer examining where the industry stands, where it would like to go, and what steps could get it there.
“Data to measure progress is fundamental to improving care provided to patients and their outcomes, but the healthcare industry has yet to fully capture the value of big data to engineer large-scale change,” says Christine K. Cassel, MD, president and CEO of National Quality Forum. “This report outlines critical strategies to help make data more accessible and useful, for meaningful system-wide improvement.”
A lack of data isn’t the problem. As electronic health records gain traction, the industry is gathering oodles of information. Unfortunately, much of it is scattered among different sources, and each source comes with its own variables.
“A patient’s data are often fragmented across multiple electronic health records, depending on their insurance and the providers they visit,” the white paper explains, “and limited interoperability among those record systems prevents many providers from having a full picture of a patient’s healthcare.”
What’s more, depending on which source you look at (say, electronic health records vs. claims records), the data often conflict with one another. Collected and input by humans, the data are affected by varying documentation and coding practices, which end up distorting the performance measures the data seek to represent. This erodes trust in the data we currently have available. And without trust in their relevance, the data won’t hold much power to bring about action or change.
The healthcare industry would like access to accurate, relevant, and timely data that it can analyze and act on to improve patient care through system-level changes. The white paper includes a smaller-scale but noteworthy example: the sterile processing facility at Seattle-based Virginia Mason Medical Center used data it collected manually to identify defects and their causes, improve its processes, and slash its defect rate from 3 percent to 0.12 percent.
Central to meaningful use of data are healthcare experts skilled in interpreting and applying it toward system improvement. Ultimately, the industry would like to have a workforce with the skills to analyze and implement data-inspired approaches to better patient care. Buy-in by organizational leaders as well as frontline clinicians will be crucial for improvements to last.
The white paper provides more than a dozen steps that EHR vendors, healthcare organizations, and others can take to help transition the industry from “here” to “there.”
EHR vendors can help by promoting true interoperability among their systems, for starters, and also by doing away with high, recurring fees for data access. Insurers, including Medicaid and Medicare, could help by making their data more broadly available and in a more timely manner, the white paper recommends, and also by contributing to databases that integrate data across populations and payers.
The white paper directs the most recommendations toward healthcare organizations, which it charges with four challenges:
The journey will take cooperation and intention, but the end result of improving patient care is a worthy one—and central to the industry’s mission. As the white paper notes as one of its key themes, “Healthcare needs to improve, and it can.”