Quality improvement initiatives can increase echocardiography reproducibility

April 30, 2015 – Melissa Daubert, MD, and other Duke researchers developed an operational framework that included individual review and group-based sessions.

Use of a continuous quality improvement initiative can reduce variability and improve reproducibility in echocardiography, according to a new study by DCRI researchers.

Echocardiography is a valuable tool for diagnosing cardiovascular ailments, but wide variability in interpretation can limit its usefulness in a clinical setting. The American Society of Echocardiography has issued a set of general guidelines for echocardiography interpretation, but to date there are no well-defined, widely accepted standards for what constitutes acceptable reproducibility.

The study, published online this month in the Journal of the American Society of Echocardiography, employed a new operational framework for interpreting echocardiograms. Lead author Melissa Daubert, MD (pictured), and her colleagues hypothesized that this quality improvement initiative would reduce interpretive variability.

The researchers gathered a team of five echocardiogram readers: three registered diagnostic cardiac sonographers and two cardiologists with board certification in adult echocardiography. These readers were asked to independently interpret 10 transthoracic echocardiograms for the following parameters: left ventricular end-diastolic volume, biplane ejection fraction, mitral regurgitation, aortic regurgitation, left ventricular outflow tract diameter, peak and mean aortic valve gradients, and aortic valve area.

Reproducibility was evaluated by assessing the variability of each parameter across readers. If at least 80 percent of comparisons fell within a range of acceptable difference, defined for each parameter by a literature review and the results of prior reproducibility testing, reproducibility was considered acceptable.
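As a rough illustration of this criterion (not the study's actual analysis code), the sketch below checks whether at least 80 percent of pairwise reader comparisons for a single parameter fall within an acceptable difference. The measurement values, the 5-point tolerance, and the pairwise-comparison setup are hypothetical assumptions for the example.

    from itertools import combinations

    def acceptable_reproducibility(readings, tolerance, threshold=0.80):
        # Count how many pairwise reader comparisons differ by no more than
        # the acceptable difference (tolerance) for one parameter.
        pairs = list(combinations(readings, 2))
        within = sum(1 for a, b in pairs if abs(a - b) <= tolerance)
        # Reproducibility is deemed acceptable if at least 80 percent of
        # comparisons fall within the acceptable range of difference.
        return within / len(pairs) >= threshold

    # Hypothetical example: five readers' biplane ejection fraction estimates
    # (in percent) for one echocardiogram, with an assumed tolerance of 5 points.
    ef_readings = [55, 58, 54, 60, 57]
    print(acceptable_reproducibility(ef_readings, tolerance=5))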

After the initial interpretation session, the five readers underwent retraining. This process included individual review and group-based sessions led by a senior cardiologist and lead sonographer. The trainers used national guidelines and case studies to promote uniformity of interpretation, eliminate individual idiosyncrasies, and provide an open forum for questions.

Upon completion of the retraining program, the readers interpreted a different set of 10 test echocardiograms for parameters that had unacceptable reproducibility on the initial testing, and reproducibility was reevaluated. To assess the effectiveness of the retraining program, the process was repeated with the same readers one year later.

On the initial testing, all of the readers demonstrated acceptable reproducibility for biplane ejection fraction, mitral regurgitation, and peak and mean aortic valve gradients. Four of the original five readers achieved acceptable reproducibility for left ventricular end-diastolic volume, aortic regurgitation, and aortic valve area. None of the readers achieved acceptable reproducibility for left ventricular outflow tract diameter. After the retraining sessions, all of the readers demonstrated acceptable reproducibility on these parameters, and that performance was maintained on subsequent testing one year later. A second group of 10 readers was also evaluated, with similar results.

These findings, the researchers concluded, demonstrate that a quality improvement initiative for echocardiographic reproducibility can be successfully implemented with a minimal investment of time and money.

The study’s other authors were Eric Yow, MS; Huiman Barnhart, PhD; Dawn Rabineau, RDCS; Anna Lisa Crowley, MD; and Pamela Douglas, MD.