Blog | Friday, May 4, 2018

Great news from CDC! Help me understand it


Today our colleagues at the CDC's Division of Healthcare Quality Promotion (DHQP) and the Tennessee and Iowa Departments of Public Health released an interesting MMWR Vital Signs report highlighting CDC's efforts to control emerging antibiotic resistance (AR) phenotypes, with a main focus on carbapenem resistance in Enterobacteriaceae (CRE) and Pseudomonas aeruginosa (CRPA). The release is accompanied by a helpful summary of CDC's efforts to better coordinate the detection and control of new and emerging resistance threats (including, of course, not only ESBL and CRE but also Candida auris, colistin resistance, glycopeptide resistance in S. aureus, etc.). Check it out!

My major takeaway is that investments in laboratory detection of AR are beginning to pay off. By improving their diagnostic capabilities, more regional and state labs are now able to help health care facilities confirm, characterize, and respond to pathogens that display unusual resistance phenotypes. This is unquestionably a good thing, though it still requires local facilities to be able to detect phenotypes of concern—for example, many hospitals don't have ready access to MALDI-TOF or sequencing, and many do not routinely identify Candida to the species level from non-sterile sources. How long might C. auris spread in such a facility before being detected? So although the AR Lab Network (ARLN) is a major step forward, we still need to emphasize that individual hospitals must invest in improved and timely AR diagnostics, and we ought to ensure they receive updated guidance on how best to detect new or emerging pathogens of concern.

Now for the data itself: the report presents NHSN CAUTI and CLABSI data from 2006-2015 on the percentage of all E. coli and K. pneumoniae isolates with ESBL and CRE phenotypes over time. The ESBL phenotype was consistently detected in 16-19% of isolates, with no major change noted. The percentage of CRE, however, declined steadily, from a peak of 10.6% in 2007 down to 3.1% in 2015. This is despite the fact that CLSI breakpoints for carbapenem susceptibility were lowered during the surveillance period, which should have caused more isolates to meet phenotypic criteria for resistance. And although the CDC published CRE-specific control guidance in 2009, the decline in percentage resistant preceded this guidance (which would obviously not have been implemented immediately across the country, likely delaying any impact by 1-2 years). The decline also coincides with the spread of carbapenemase-producing Enterobacteriaceae (CPE) across the U.S. So I'm left wondering: why is the percentage of CRE declining as a cause of CAUTI and CLABSI in acute care hospitals?

Is it related to the general improvements in infection prevention practices that have accompanied public reporting and pay-for-performance? Would such improvements differentially impact CRE over other pathogens (keep in mind this isn't a rate, it's a proportional decrease)? The epicenter of CRE/CPE seems not to be in acute care but rather “post-acute care,” as is also noted in the other part of the report (the results of CPE screening by ARLN labs for 9 months of 2017), but I'd still expect increases in post-acute care or other health care settings to eventually be reflected in the NHSN CAUTI/CLABSI data. Granted, the NHSN data represent a relatively small number of organisms (for example, Clare Rock and colleagues demonstrated that CLABSIs represent only 6% of all hospital-onset bacteremias).
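To make the rate-versus-proportion caveat concrete, here is a minimal sketch with entirely made-up numbers (not from the report or NHSN): the proportion of CRE among reported isolates can fall even when the CRE rate per patient-days is unchanged, simply because the denominator of total isolates grows.

```python
# Illustrative only -- hypothetical numbers, not NHSN data.
# Shows how a *proportion* can decline while the underlying *rate* is flat.

def proportion_resistant(resistant: int, total_isolates: int) -> float:
    """Fraction of all reported isolates showing the resistance phenotype."""
    return resistant / total_isolates

def rate_per_10k_patient_days(events: int, patient_days: int) -> float:
    """Incidence rate: resistant-organism events per 10,000 patient-days."""
    return events / patient_days * 10_000

# Year 1: 10 CRE among 100 isolates, over 200,000 patient-days.
# Year 2: still 10 CRE, but 250 total isolates, same patient-days.
p1 = proportion_resistant(10, 100)            # 0.10
p2 = proportion_resistant(10, 250)            # 0.04
r1 = rate_per_10k_patient_days(10, 200_000)   # 0.5
r2 = rate_per_10k_patient_days(10, 200_000)   # 0.5

print(f"proportion CRE: {p1:.2f} -> {p2:.2f}")  # falls 10% -> 4%
print(f"rate /10k pd:   {r1:.2f} -> {r2:.2f}")  # unchanged
```

The point is only that a proportional decrease, like the NHSN percentages here, is sensitive to shifts in the mix of susceptible isolates, so it cannot by itself distinguish "fewer CRE" from "more of everything else."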

I remain perplexed. And happy to hear any thoughts on this!

Daniel J. Diekema, MD, FACP, practices infectious diseases, clinical microbiology, and hospital epidemiology in Iowa City, Iowa, splitting time between seeing patients with infectious diseases, diagnosing infections in the microbiology laboratory, and trying to prevent infections in the hospital. This post originally appeared at the blog Controversies in Hospital Infection Prevention.