Computer-Assisted Coding Reality Check

By Mary Butler

As text messaging has come to replace telephone calls as the primary means of interpersonal communication, apps such as SwiftKey have popped up to help those with less than nimble thumbs. SwiftKey and other “intelligent keyboards” quickly learn a user’s writing and typing style. They operate in the background of a smartphone and take note of frequently used expressions, punctuation, emojis, and slang in a user’s text messages, emails, and social media posts. Before long, they’re able to predict what a user is trying to say and autosuggest enough words to drastically reduce the time it takes to type out a message.

While these apps are arguably helpful and “intelligent,” they do require a human’s touch to succeed—and they aren’t without drawbacks. Before a user realizes it, “chicken noodle soup” can be autocorrected to “Chuck Norris soup.” The internet is full of enough #autocorrectfails that savvy users know to slow down a bit to avoid embarrassing typos. Human beings understand that texts sent to bosses and colleagues require more care than a quick note to friends or significant others.

Computer-assisted coding (CAC) occupies a similar function in the lives of coding and health information management (HIM) professionals. CAC software uses natural language processing (NLP) to extract and translate transcribed free-text data or computer-generated discrete data into information for billing and coding purposes. Over time, the software picks up on a coding professional’s frequently used codes—especially when used in a specialty hospital—and quickly becomes more precise, learning from instances when a coding professional overrides the CAC’s suggested code with one that’s more accurate.
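
To make that feedback loop concrete, the hypothetical sketch below treats a CAC engine as a phrase-to-code mapper that remembers coder overrides. It is a conceptual illustration only, not any vendor’s actual algorithm; the phrases and ICD-10-CM codes in it are placeholders chosen for clarity.

```python
# Conceptual sketch only: real CAC engines use full NLP pipelines,
# not keyword lookups. Phrases and codes below are illustrative placeholders.

suggestions = {
    "community acquired pneumonia": "J18.9",  # pneumonia, unspecified organism
    "type 2 diabetes": "E11.9",               # type 2 diabetes without complications
}

overrides: dict[str, str] = {}  # phrase -> code a coder chose instead of the suggestion


def suggest_code(phrase: str) -> str | None:
    """Return the coder-corrected code if one exists, else the default suggestion."""
    key = phrase.lower().strip()
    return overrides.get(key, suggestions.get(key))


def record_override(phrase: str, corrected_code: str) -> None:
    """When a coder replaces a suggested code, remember the correction for next time."""
    overrides[phrase.lower().strip()] = corrected_code


# A coder overrides the engine's first suggestion; later charts containing the
# same phrase now surface the corrected code instead.
record_override("community acquired pneumonia", "J15.9")  # unspecified bacterial pneumonia
print(suggest_code("community acquired pneumonia"))  # J15.9
```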

Like apps that make texting faster, CAC’s success is contingent on the reasoning, knowledge, and editing skills of the human beings who use it. Before ICD-10-CM/PCS went live in 2015, CAC was hailed by many in the industry as a miraculous tool for preventing massive coding slowdowns that some predicted the new code set would unleash.

Since that time, however, reality has set in and tempered the expectations of coding professionals and the many CAC vendors that promised life-changing results. With the ICD-10 transition in the rear-view mirror, it’s time to re-evaluate the following promises CAC initially offered: that it would improve coding accuracy and documentation quality; that it would increase productivity; that it would reduce the need for coders and transition others into coding auditors; that it would provide a positive return on investment; and that CAC could make intelligent, human-free decisions based on documentation.

Expectations Meet Reality

In the nearly four years that ICD-10 has been in place, there has been no evidence to suggest that CAC will replace the need for coding professionals any time soon. But that’s not to say the technology has been unhelpful. In fact, CAC has helped providers in expected ways. In 2013, the AHIMA Foundation worked on a study with the Cleveland Clinic, with funding from CAC vendor 3M, to predict how the use of CAC technology would impact accuracy and productivity with ICD-10.1

The AHIMA Foundation was able to validate that the time it took the study’s coding professionals to code inpatient records using CAC was significantly shorter than the time it took coding professionals who didn’t use the technology, resulting in a 22 percent reduction in time per record. Additionally, it found that Cleveland Clinic was able to reduce coding time without decreasing quality, as measured by recall and precision for both procedures and diagnoses.
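
For context, recall in this setting measures how many of the finally assigned codes the engine suggested, while precision measures how many of its suggestions survived coder review. A minimal sketch with made-up code sets shows the arithmetic:

```python
# Minimal sketch of recall and precision for suggested codes, using made-up code sets.
suggested = {"J18.9", "E11.9", "I10"}   # codes the CAC engine proposed for a record
final = {"J18.9", "I10", "N17.9"}       # codes the coding professional actually assigned

kept = suggested & final                 # suggestions the coder agreed with
precision = len(kept) / len(suggested)   # share of suggestions that were correct
recall = len(kept) / len(final)          # share of final codes the engine found

print(f"precision={precision:.2f}, recall={recall:.2f}")  # precision=0.67, recall=0.67
```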

For Monica Pinette, MBA, RHIA, CDIP, CCS, CPC, now the assistant vice president of HIM at UConn Health, the AHIMA Foundation’s findings weren’t all that different from what she found when she was preparing for the ICD-10 transition with CAC at a previous employer, St. Francis Hospital and Medical Center in Hartford, CT. While at St. Francis, Pinette led her coding team through extensive training with CAC prior to the ICD-10 transition in 2015 with the expectation that the new code set would slow them down, especially when coding procedures.

Pinette says the industry standard with ICD-10 was 2.5 charts coded per hour. However, her coding staff was easily able to code three or four charts per hour with CAC. “Even though we had the implementation of ICD-10 and it was predicted we’d slow down, CAC helped us avoid productivity losses. Coders were able to exceed their expectations,” Pinette says.

The CAC software also helped coding professionals familiarize themselves with the new code set more quickly. “With CAC it would actually highlight procedure codes and diagnosis codes and slate them for you. Then, coders could use the CAC’s evidence-based feature where you could go back and validate the procedures and diagnosis codes [suggested by the CAC engine]. And in a way it kind of helped to teach the coders by seeing those codes over and over again,” Pinette says.

Her facility used CAC for both outpatient and inpatient coding, but Pinette says it was most beneficial on the inpatient side because inpatient coding professionals have the additional challenge of assigning PCS codes and choosing DRGs.

Working with Limitations

Like many people, Pinette says her coding professionals were concerned, at first, that CAC would be so useful that it would replace them, but it quickly became clear to them this wouldn’t be the case.

“I think people with less knowledge of coding operations think ‘Oh, CAC does the coding for you’ but that’s not true at all. It does take human intervention because not every code that is given by the CAC is necessarily correct or needed for coding accuracy and ensuring the bill goes out on the claim appropriately. It does take human intervention and analysis on the outpatient side to look at edits and things like that in addition to using the CAC feature,” Pinette says.

Deanna Klure, RHIT, CCS, CDIP, director, coding education, nosology, CAC/clinical documentation improvement (CDI) business applications at Kaiser Permanente, stresses that it’s important that coding professionals and their managers remember that CAC is just a tool—a very effective one—but a tool that’s as fallible as the humans who use it.

For example, on a given chart the CAC may autosuggest 10 codes and the user may accept only eight because the other two are irrelevant. Perhaps the doctor dictated that the patient does not have pneumonia, but the CAC missed the word “not” and autosuggested pneumonia anyway. The user has to use the CAC’s “evidence-based” feature to determine why pneumonia is suggested before they can accept or override it.
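
Negation handling of this kind is a well-known weak point of clinical NLP (NegEx-style rules are a common published approach). The bare-bones sketch below illustrates the idea of checking for a negation cue near a finding; it is an assumption-laden simplification, not how any particular CAC engine works.

```python
import re

# Bare-bones negation check: flag a finding as negated when a negation cue
# appears within a few words before it. Real clinical NLP (e.g., NegEx-style
# rules) handles many more cues, scopes, and sentence structures.
NEGATION_CUES = {"no", "not", "denies", "without"}


def is_negated(text: str, finding: str, window: int = 5) -> bool:
    """Return True if a negation cue appears within `window` words before the finding."""
    words = re.findall(r"[a-z']+", text.lower())
    finding_words = finding.lower().split()
    n = len(finding_words)
    for i in range(len(words) - n + 1):
        if words[i:i + n] == finding_words:
            preceding = words[max(0, i - window):i]
            if NEGATION_CUES & set(preceding):
                return True
    return False


print(is_negated("The patient does not have pneumonia.", "pneumonia"))  # True
print(is_negated("Chest x-ray confirms pneumonia.", "pneumonia"))       # False
```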

“Sometimes what providers will do, they’ll make a template and it will have check boxes ‘yes’ ‘no’ ‘yes’ ‘no’ and the NLP can’t read the checkbox,” Klure says.

The longer a coding professional uses CAC, the more accurate the NLP engine becomes, but the technology still has a long way to go before artificial intelligence (AI) can replace a trained coding professional or render CAC obsolete. Sarah Goodman, MBA, CHCAF, COC, CCP, FCS, president, CEO, and principal consultant for SLG Consulting, believes AI has actually enhanced CAC as the two become more integrated. She also thinks that automated coding and AI-assisted audits are likely the wave of the future.

Klure agrees. “I don’t think it’ll [CAC] ever be obsolete [due to AI]. I do think there’s something it can get really good at but it also depends on templating. Some procedures, we call them candy, are just easy coding,” Klure says, using GI-related charts as an example. “You’re using five of the same ICD-10-CM codes and CPT codes a lot and it becomes very easy. If you can standardize the templates you can get much more precision. If the templates were standardized and readable in the CAC engine and NLP engine, certain procedures could be autosuggested at a high degree of precision.”

But even if providers really took the time to develop templates to improve precision, it would require massive industry-wide collaboration to get templates, CAC vendors, and electronic health record (EHR) system vendors to a place where CAC could replace people, Klure says.

There are some places, however, where CAC hasn’t been as seamless to incorporate as it has been for Klure and Pinette.

Robin Andrews, M.Ed, RHIA, CCS, director of HIM, coding, and CDI at Steward Health Care, has been an HIM professional for 43 years. She uses her facility’s CAC on surgical charts and feels that it has negatively impacted her productivity. In Andrews’ experience, there was a huge disconnect between the way the CAC was advertised to her facility and how it actually performs.

“I personally was under the impression that it was like magic—that you could just turn it on and it would read the document and find the codes and you’d go on your merry way. I’ve been working with CAC for three to five years now and it’s hard to build because it’s not always picking up the accuracy of the diagnosis,” Andrews says. “And if it sees things like abbreviations it’s going to put a code on the abbreviation but it could be just a title. It also doesn’t get to the finer details of a code. Now this CAC system will plop codes right beside words or diagnoses. And personally I don’t trust that it’s going to be as accurate as I can make it be because of my skillset.”

Andrews may not be alone in her assessment that CAC hasn’t resulted in the experience some feel they were promised. Some EHRs are not configured in a way that makes CAC easy to use. Older EHRs have had to incorporate lots of PDFs that are harder for CAC engines to read. And some providers have EHRs, encoders, and CACs from three different vendors that may not interface well together, resulting in a less than efficient CAC interface—and providers that made a big investment with a CAC vendor may be unwilling to look for one that works better.

Indeed, a report by KLAS Research2 found that providers who document with a hybrid of electronic and paper systems have seen that the return on investment for CAC is not as high as they’d like it to be—conversely, the more electronic a provider is, the more successful the CAC tends to be. KLAS advises providers to help members of their organization understand that the process of implementing CAC could be long and that they need to commit to robust training and onboarding.

CAC and the Future

Market research and vendors themselves anticipate a growing market for CAC products. In 2018, the global CAC market was valued at $2.8 billion—and it is expected to reach $5.1 billion by 2023, according to a report from WinterGreen Research.3 The report noted that a smooth transition to ICD-10 has helped an increasing number of providers decide to invest in software that can maximize the data created by the new code set.

Heather Gladden, CCS, CAC product specialist at Dolbey, says that prior to ICD-10 there was a huge upswing in the number of providers looking at CAC systems, which paused around the time ICD-10 was implemented. But she says now that many providers have realized they have a handle on ICD-10, interest is growing again.

“We’ve seen a huge uptick in people looking at CAC over the last year and a half, and also because of adoption of EMRs [EHRs],” Gladden says. “They had EMRs early on and now some organizations are switching out their EMRs. We saw a lot of organizations say ‘We’re in the middle of upgrading our EMRs and working on value-based purchasing (VBP),’ so they were working on quality measures. In the last year we’ve seen a lot of people looking.”

As providers become more comfortable with CAC systems, they’re finding the software helps organizations improve the quality of their coded data, which in turn helps them improve their case mix index, decrease payment denials, shorten accounts receivable days, participate in VBP and bundled payment initiatives, and even identify patients who are at risk for readmissions, according to Gladden. Having more accurate and reliable data also helps organizations when and if they need to defend their data against auditors. She says she has also seen organizations looking into CAC not just for coding, but also as a collaborative workspace for clinical documentation improvement, quality, ancillary departments, and internal auditors. CAC can also empower coding teams, whether they are coding the patient chart concurrently or at the time of discharge, by providing a comprehensive workspace and the tools coding professionals need to complete work in less time and with greater accuracy.

Beyond coding quality, CAC can help providers track hospital-acquired infections, patient safety indicators, and 30-day admits, according to Kristi Fahy, RHIA, an account executive at DVS, a premier partner of Dolbey. She says providers who aren’t using CAC are leaving money on the table. “Pay-for-performance and quality based on VBP… all those initiatives have to have good coding otherwise they’re not going to get reimbursed appropriately,” Fahy says.

She notes that one Dolbey client with nine hospitals went from coding 20 inpatient charts per day to 30 charts per day with CAC, a 50 percent increase. The client saw a similar gain in emergency department coding, which went from 100 charts per day to 175 charts per day with CAC. These productivity rates are based on an eight-hour work day.

“The data is really there to show that productivity. The same site had an external auditor come in and the auditors confirmed that the quality had really improved with the codes,” Fahy says.

Asked if facilities with CAC are better off than facilities without, SLG Consulting’s Goodman says it depends on how well CAC is implemented and monitored by credentialed coders.

“The reality is that while CAC is excellent at analyzing key words and suggesting codes, human intervention is still necessary, and as with any successful implementation, it always comes down to three things: people, process, and technology,” Goodman says. “If these are integrated appropriately, then CAC can work effectively.”

Notes

  1. Dougherty, Michelle; Sandra Seabold; and Susan E. White. “Study Reveals Hard Facts on CAC.” Journal of AHIMA 84, no. 7 (July 2013): 54-56. http://library.ahima.org/doc?oid=106668.
  2. KLAS Research. “Computer-Assisted Coding 2016: Who Is Delivering Promised Value in ICD-10?” August 16, 2016. https://klasresearch.com/report/computer-assisted-coding-2016/1111.
  3. WinterGreen Research. “Computer Assisted Coding: Market Shares, Strategy, and Forecasts, Worldwide, 2017 to 2023.” March 13, 2017. www.wintergreenresearch.com/computer-assisted-coding.

Mary Butler (mary.butler@ahima.org) is associate editor at the Journal of AHIMA.

Article citation:
Butler, Mary. “Computer-Assisted Coding Reality Check.” Journal of AHIMA 90, no. 6 (June 2019): 10-13.