Improving the Quality of Quality Measures: Pilots Show Promising Results from Clinically Enhanced Administrative Data

by Gina Rollins


EHRs will improve the accuracy of quality measurement...one day. Until then, blending limited clinical data with administrative data may offer a solution.


Administrative data based on discharge abstracts or claims have widely acknowledged limitations, but they remain the primary means for reporting quality metrics and calculating risk adjustment and pay-for-performance rates. Clinical data, which hold the key to the best measures, are still largely locked up in paper records.

Electronic health records will free up clinical data, offering the possibility of new reporting and measurement paradigms, and a few organizations have begun using clinical parameters from the systems for internal and external quality reporting. Even so, the implementation of EHRs nationally remains very much a work in progress, and quality reporting initiatives are not waiting for the systems to catch up. One improvement in the short term is the potential to enhance administrative data with limited clinical data.

Reporting the Data at Hand

The Agency for Healthcare Research and Quality (AHRQ) is currently sponsoring several research and pilot projects expected to illuminate methods for using clinically enhanced administrative data to improve quality reporting and risk adjustments. At the same time, the National Quality Forum (NQF) has just begun a process to identify and endorse ambulatory care measures suitable for both public accountability and quality improvement that can be derived primarily from clinically enhanced administrative data.

These initiatives, and the experiences of organizations that have already begun incorporating EHR-based clinical information in their performance reporting, underscore the potential of these data as well as the effort involved in employing them.

“You have to go through quite a long process to have all the data pulled and validated,” explains Jacqueline Matthews, MS, RN, director of quality at the Cleveland Clinic. “Just because the computer pulls out data and loads it into an abstraction program doesn’t mean that’s all that’s involved. It has to have human validation to make sure you’re capturing the right thing.”

The Cleveland Clinic used discrete fields from its EHR surgical history and physical module to pilot quality reporting as part of the National Surgical Quality Improvement Program. It is using that experience to develop EHR-based reporting of Medicare core measures for heart attack, heart failure, pneumonia, and surgical infection prevention, a process that is in the early stages. “It will probably be one year before we have clean data,” Matthews says.

The limitations of administrative data in publicly reporting quality measures, calculating risk adjustment, and determining pay-for-performance rates have been recognized for some time. Among other things, data from discharge abstracts or claims alone do not distinguish between comorbidities and complications. They limit the number of secondary diagnoses that can be reported, may not fully utilize ICD-9-CM codes because of existing coding regulations, and do not provide information from laboratory values that are important in calculating risk.

Even so, administrative data continue to be the basis of performance assessment for multiple payers and quality review organizations. As a result, providers have legitimate concerns that they have been and continue to be penalized for caring for sicker patients simply because of inadequate reporting mechanisms. Additionally, this reporting—with slightly different measures, definitions, and reporting periods for each payer—requires substantial resources.

EHRs promise better processes and more precise information for providers, payers, and quality review organizations alike. Individual providers may have the opportunity to identify and report discrete values of consequence directly from EHR databases, including comorbidities, complications, and diagnostic testing values. Meanwhile, payers, regulators, and patient safety organizations ultimately may be able to develop improved risk adjustment and performance measures that rely less on administrative data. Despite these purported benefits, the systems remain an elusive goal for many organizations.

Bridging the Gap with POA, Lab Data

To bridge the gap between complete EHR systems and the patchwork of disparate paper and electronic systems in place today, AHRQ sponsored key research in assessing the incremental value of adding certain clinical elements to administrative data. It also awarded three pilot grants to explore the feasibility of implementing data models based on the agency’s research and a planning grant to assess interest in and perceived challenges and opportunities associated with establishing clinically enhanced administrative databases. AHRQ plans to share the results of the projects to assist the industry in improving quality reporting and risk adjustment without adding undue administrative burdens for providers before there is nationwide EHR implementation.

“We wanted to explore what the most efficient way would be to enhance the value of administrative data used to report quality,” says Anne Elixhauser, PhD, senior research scientist at AHRQ. “Administrative data is used more and more for measuring quality and public reporting, but there are significant limitations to the data for those purposes. Our goal with these grants is to establish the feasibility of linking clinical and administrative data and developing an approach that can be exported to other sites.”

In a series of studies, AHRQ-sponsored researchers tested incrementally more complex models of risk adjustment for five medical conditions and three surgical procedures using data from the Pennsylvania Health Care Cost Containment Council. The models included one standard model based on claims data available today, three that incorporated increasing levels of present on admission (POA) data, and two that included numerical laboratory data and POA modifiers with different levels of simulated coding completeness.

The researchers used the mean c-statistic of each model to assess how well it discriminated in predicting the outcome in question, mortality. On this scale, 0.5 represents a random guess and 1.0 perfect discrimination.
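The c-statistic is equivalent to the probability that a randomly chosen patient who died was assigned a higher predicted risk than a randomly chosen survivor, with ties counting as half. A minimal sketch of the calculation, using illustrative risk values rather than study data:

```python
from itertools import product

def c_statistic(risks_died, risks_survived):
    """Probability that a patient who died received a higher predicted
    risk than a survivor; tied predictions count as half-concordant."""
    pairs = list(product(risks_died, risks_survived))
    concordant = sum(1.0 for d, s in pairs if d > s)
    tied = sum(0.5 for d, s in pairs if d == s)
    return (concordant + tied) / len(pairs)

# Hypothetical predicted mortality risks for two outcome groups
died = [0.9, 0.6, 0.7]
survived = [0.2, 0.4, 0.6, 0.1]
print(round(c_statistic(died, survived), 3))
```

A model assigning identical risks to both groups scores 0.5; one that ranks every death above every survivor scores 1.0, matching the scale the researchers used.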

They found that pure administrative data provided “reasonably good” discrimination at 0.791, according to Elixhauser. Adding POA codes with up to 24 secondary diagnoses improved discrimination to 0.841. Incorporating more complete ICD-9-CM coding and numerical laboratory data improved discrimination to 0.864. The most complex model, which amounted to a full clinical abstract of each record, boosted discrimination to 0.881.

“We found that the biggest additions came from [POA] and lab data. A full clinical abstract didn’t add that much to discrimination,” explains Elixhauser. “This showed relatively modest incremental additions of clinical information to administrative data [were] almost as good as a full clinical abstract of the medical record, yet it would cost considerably more to achieve that level of discrimination.”

Pilot Sites Try It Out

Based on these findings, AHRQ awarded three two-year pilot grants in September 2007 to the Florida Agency for Health Care Administration, Virginia Health Information, and the Minnesota Hospital Association. The organizations were to add hospital clinical information to administrative data and transmit the data for quality reporting purposes. An additional planning grant was awarded to the Washington State Department of Health.

The pilot projects have had enthusiastic involvement even though participation requires considerable extra effort. “These are very busy people, and the organizations are not being paid to do it, but they’re involved because they think it will make things better in the long run,” says Michael Lundberg, executive director of Virginia Health Information, which works with public agencies and private organizations to report healthcare data in the state.

The Virginia project involves 30 hospitals representing about half of annual discharges in the state. As of January 2009, participants had submitted three-quarters of the POA data and half of the laboratory data, coded using Logical Observation Identifiers Names and Codes (LOINC), in addition to the traditional administrative data that participants have submitted to Virginia Health Information for years for quality reporting purposes. The process has been a learning experience for all involved.

“Even though POA data was required by CMS as of fourth quarter 2007, all hospitals may not have done it the same way, so we had to standardize the process,” explains Lundberg. Also, there was additional effort in coding POA data for more than just Medicare patients.
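The POA indicator is what lets analysts separate comorbidities (conditions present on admission) from complications (conditions that arose during the stay), a distinction claims data alone cannot make. A minimal sketch of that classification, with hypothetical field names and the standard CMS indicator values:

```python
# POA indicator values per CMS reporting guidance:
# Y = yes, W = clinically undetermined (treated as present),
# N = no, U = documentation insufficient (treated as not present).
POA_PRESENT = {"Y", "W"}
POA_ABSENT = {"N", "U"}

def classify_secondary_dx(diagnoses):
    """Split secondary diagnoses into comorbidities, complications,
    and codes that are exempt or unreported."""
    comorbidities, complications, unclassified = [], [], []
    for dx in diagnoses:
        poa = dx.get("poa", "").upper()
        if poa in POA_PRESENT:
            comorbidities.append(dx["code"])
        elif poa in POA_ABSENT:
            complications.append(dx["code"])
        else:
            unclassified.append(dx["code"])
    return comorbidities, complications, unclassified

# Illustrative record: diabetes present on admission,
# postoperative infection acquired during the stay
record = [
    {"code": "250.00", "poa": "Y"},
    {"code": "998.59", "poa": "N"},
]
print(classify_secondary_dx(record))
```

Standardizing how each hospital populates the indicator, as Lundberg describes, is what makes a rule like this reliable across facilities.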

Similarly, determining the tests, values, and reporting format of laboratory results was “one of our greatest challenges,” Lundberg explains. Participants had to agree upon the test results that would be reported as well as a normal range for each test, no small feat considering that each facility might have different names for the same test and that there might be multiple subgroups under certain tests. The agreed-upon tests then had to be associated with LOINC codes and put in the appropriate format. “LOINC has been around for years, but it’s not necessarily used by all hospitals, and the process of linking to it was easier for some than others,” Lundberg notes.
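The harmonization work Lundberg describes amounts to mapping each facility's local test names onto shared LOINC codes before submission. A minimal sketch, with hypothetical local names; the LOINC codes shown are commonly cited ones for these tests, but any real implementation should verify them against the official LOINC database:

```python
# Hypothetical local-name-to-LOINC crosswalk. Two facilities may use
# different names ("CREAT" vs. "SCR") for the same test.
LOCAL_TO_LOINC = {
    "CREAT": "2160-0",  # Creatinine [Mass/volume] in Serum or Plasma
    "SCR": "2160-0",
    "NA": "2951-2",     # Sodium [Moles/volume] in Serum or Plasma
    "HGB": "718-7",     # Hemoglobin [Mass/volume] in Blood
}

def to_loinc(result):
    """Re-key a local lab result by LOINC code, or return None so
    unmapped tests can be flagged for manual mapping review."""
    code = LOCAL_TO_LOINC.get(result["test"].upper())
    if code is None:
        return None
    return {"loinc": code, "value": result["value"], "units": result["units"]}

print(to_loinc({"test": "scr", "value": 1.4, "units": "mg/dL"}))
```

Agreeing on which tests to report and their normal ranges happens before this step; the crosswalk only solves the naming problem.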

Hard Work, Promising Results

As a trade-off for their efforts, Virginia Health Information expects to provide participants with reports evaluating each hospital’s comparative risks of death and postoperative complications and risk-adjusted outcomes rates with and without enhancement of its administrative database. “The POA requirement is so new, no one really knows what the correct rate should be, so this will be an opportunity for them to understand their patient population and assess the quality of both their data and their care,” Lundberg says.

Similarly, the Florida Agency for Health Care Administration (AHCA) plans to provide a quality report of hospital-specific clinically enhanced data to each participant, along with aggregate data across all participating hospitals. Each hospital also will have its raw data returned for internal quality improvement purposes, according to Christopher Sullivan, PhD, administrator of the office of health information technology.

There are 22 hospitals, three hospital systems, and two pediatric facilities participating in the initiative. Hospitals have submitted POA and other administrative data for three quarters and are preparing to submit lab values.

Hospitals will submit POA, lab, and other administrative data to AHCA via secure file transfer protocol. AHCA staff in turn will load the clinical lab data into the database that already holds administrative data submitted by the hospitals. The clinical and administrative data sets then will be joined into a single combined file for each hospital, which will be sent securely to a contractor that will group the merged data into APR DRGs for analysis, according to Sullivan.
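The merge step described above comes down to attaching each lab result to its administrative discharge record by a shared record key. A minimal sketch; the field names are illustrative, not AHCA's actual file layout:

```python
def merge_clinical(admin_records, lab_results):
    """Join lab values onto administrative records by record_id."""
    labs_by_id = {}
    for lab in lab_results:
        labs_by_id.setdefault(lab["record_id"], []).append(
            {"loinc": lab["loinc"], "value": lab["value"]}
        )
    merged = []
    for rec in admin_records:
        combined = dict(rec)  # leave the administrative file untouched
        combined["labs"] = labs_by_id.get(rec["record_id"], [])
        merged.append(combined)
    return merged

# Illustrative rows: one discharge record, one matching lab result
admin = [{"record_id": "A1", "drg": "194", "poa_dx": ["486"]}]
labs = [{"record_id": "A1", "loinc": "2160-0", "value": 1.4}]
print(merge_clinical(admin, labs))
```

Records with no matching lab results keep an empty list rather than being dropped, so the combined file still covers every discharge in the administrative database.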

As in Virginia, Florida participants have been “enthusiastic even though it’s a resource drain,” says Sullivan. A common concern has been establishing the security of the data. Many of the participating chief information officers had “never heard” of LOINC, indicating that there is a need for education about this increasingly important data format, Sullivan notes.

As part of its AHRQ planning grant, the Washington Department of Health held two meetings in different parts of the state. Participants from facilities in the western, more urbanized region tended to be more enthusiastic about the possibility of using clinically enhanced administrative data in quality reporting. Hospitals in the eastern half of the state, which tended to be smaller and less likely to have EHR systems, were not as keen on the concept, according to Joe Campo, MPH, research section manager for the department’s Center for Health Statistics.

“They were unsure how the data would be collected since a number of them don’t have electronic lab reporting systems,” Campo explains. “They thought it would entail more hand-coding and didn’t think there would be a quick enough turnaround on reporting.” AHRQ researchers estimate that 80 percent of hospitals nationally have computerized laboratory systems.

Washington, Virginia, and Florida do not have any plans at the present time to require reporting of clinically enhanced administrative data, say Campo, Lundberg, and Sullivan. However, all believe participation in the AHRQ grants is giving those involved valuable insight and experience should such a system ever be implemented. “This is a new wave of holding hospitals accountable,” says Sullivan. “It’s a much more accurate assessment of hospital quality.”

Even as the AHRQ initiatives wind down and the NQF endorsement project begins, institutions already incorporating EHR data elements into quality reporting believe many of the issues important to their efforts would carry over to the reporting of clinically enhanced administrative data.

For instance, Lisa Brooks Taylor, RHIA, is part of the team that implements evidence-based guidelines at Resolution Health, headquartered in Columbia, MD. As a clinical terminology analyst, she identifies the diagnostic, procedural, and LOINC code sets for the algorithms associated with each guideline.

“You have to stay on top of regulatory and coding changes and understand coding systems and how the data is transmitted and should be used,” she says. HIM has a particularly important role in “explaining the limitations of data” in the context of quality reporting, she notes, so that there is a clear understanding of measures that reflect systemic rather than individual provider performance.

Transition to either clinically enhanced administrative data or pure EHR-based data for quality reporting does not change the essential dynamics involved, according to Cleveland Clinic’s Matthews. “Doctors still have to document properly, and coders have to be knowledgeable and work with physicians to make sure that we correctly capture the essence of the clinical experience.”

Gina Rollins (rollinswrites@verizon.net) is a freelance writer specializing in healthcare.

More information on the AHRQ and NQF projects is available online:

AHRQ: www.hcup-us.ahrq.gov/reports/clinicaldata.jsp

National Quality Forum: www.qualityforum.org/projects/ongoing/enriched-claims/index.asp

 


Article citation:
Rollins, Gina. "Improving the Quality of Quality Measures: Pilots Show Promising Results from Clinically Enhanced Administrative Data" Journal of AHIMA 80, no.4 (April 2009): 20-25.