Title | Grahf, Jessica_MSRS_2020 |
Alternative Title | A Quality Improvement Study: Preliminary DXA Drafting by Radiology Physician Extenders |
Creator | Grahf, Jessica |
Collection Name | Master of Radiologic Sciences |
Description | Dual-energy x-ray absorptiometry (DXA) is a diagnostic tool used to measure bone mineral density (BMD), assess fracture risk, and diagnose musculoskeletal diseases such as osteoporosis. Unfortunately, DXA exams are prone to error, leading to incorrect diagnoses and unnecessary treatment regimens. The purpose of this quality assurance project was to determine whether two methods of DXA interpretation, radiologist drafting and preliminary drafting by physician extenders for radiologist approval, contribute to the reduction of DXA errors. Retrospective analysis of 400 exams, 200 for each interpretation method, found no statistically significant association between the number of categorical errors reported and the interpretation method. Error categories included studies with errors, BMD change errors, and analysis errors. Of the 200 exams interpreted by radiologists, 39.5% contained at least one error, 23.5% had BMD change errors, and 36% had analysis errors. Physician extender (PE) drafted reports contained fewer errors in all categories. A review of the 200-exam PE sample set found at least one mistake in 30% of exams, BMD change errors in 14.5% of studies, and analysis errors in 22.5%. In conclusion, physician extender DXA interpretation enables radiologists to focus on critical exams, decreases radiologist interruptions, preserves exam accuracy, and promotes potential quality improvement. |
Subject | Medical radiology |
Keywords | Dual-energy x-ray absorptiometry; DXA |
Digital Publisher | Stewart Library, Weber State University |
Date | 2020 |
Language | eng |
Rights | The author has granted Weber State University Archives a limited, non-exclusive, royalty-free license to reproduce their theses, in whole or in part, in electronic or paper form and to make it available to the general public at no charge. The author retains all other rights. |
Source | University Archives Electronic Records; Master of Science in Radiologic Science. Stewart Library, Weber State University |
OCR Text | DXA Drafting Quality Improvement

Acknowledgments

I want to acknowledge my husband, who has been my rock throughout this program and the crazy 2020 year. Mom and Dad, thank you for being my biggest cheerleaders! I love all of you and could not have accomplished this achievement without your love and support!

Table of Contents

Abstract
Literature Review
DXA History
DXA Related Errors
Old Solutions
New Possibilities
The Future of DXA Quality
Purpose
Theoretical Framework
Methods
Sample Overview
Instrumentation
Data Analysis
Results
Discussion
Recommendations
Limitations
Conclusion
References
Appendix A

List of Figures

Figure 1. Final Data Collection Tally Sheet Sample
Figure 2. Percentage of Errors According to Interpretation Methods
Figure 3. Comparison of Analysis Errors Categorized by ROI
Figure 4. Comparison of BMD Change Errors in 2015 and 2019 Data Sets
Figure 5. Percentage of 200 Reviewed Exams Drafted by Radiologists and PEs
Figure 6. Comparison of Hip Images

List of Tables

Table 1. Tally Sheet Summary of Categorical Errors Resulting from Two Interpretation Methods
Table 2. Analysis Errors Crosstabulation
Table 3. Analysis Errors Chi-Squared Tests
Table 4. BMD Change Errors Crosstabulation
Table 5. BMD Change Errors Chi-Squared Tests
Table 6. Studies with Errors Crosstabulation
Table 7. Studies with Errors Chi-Squared Tests
Table 8. Types of Analysis Errors and BMD Change Errors According to Interpretation Methods

Abstract

Dual-energy x-ray absorptiometry (DXA) is a diagnostic tool used to measure bone mineral density (BMD), assess fracture risk, and diagnose musculoskeletal diseases such as osteoporosis. Unfortunately, DXA exams are prone to error, leading to incorrect diagnoses and unnecessary treatment regimens. The purpose of this quality assurance project was to determine whether two methods of DXA interpretation, radiologist drafting and preliminary drafting by physician extenders for radiologist approval, contribute to the reduction of DXA errors. Retrospective analysis of 400 exams, 200 for each interpretation method, found no statistically significant association between the number of categorical errors reported and the interpretation method. Error categories included studies with errors, BMD change errors, and analysis errors. Of the 200 exams interpreted by radiologists, 39.5% contained at least one error, 23.5% had BMD change errors, and 36% had analysis errors. Physician extender (PE) drafted reports contained fewer errors in all categories. A review of the 200-exam PE sample set found at least one mistake in 30% of exams, BMD change errors in 14.5% of studies, and analysis errors in 22.5%. In conclusion, physician extender DXA interpretation enables radiologists to focus on critical exams, decreases radiologist interruptions, preserves exam accuracy, and promotes potential quality improvement.
Keywords: Dual-energy x-ray absorptiometry (DXA), densitometry, osteoporosis, musculoskeletal health, bone mineral density (BMD), interpretation, physician extenders (PE)

Literature Review

Dual-energy X-ray absorptiometry (DXA) is the imaging modality of choice for measuring bone mineral density. DXA reports are often interpreted by radiologists and used by referring physicians to treat low bone density and osteoporosis. Patients are likely unaware that imaging and interpretation errors related to DXA studies are pervasive and often overlooked. The following literature review summarizes quality control concerns that have plagued DXA for more than three decades and highlights potential solutions for achieving DXA report accuracy and proper treatment for patients with compromised bone density.

DXA History

Dual-energy X-ray absorptiometry has been evolving for over 30 years. Like many medical advances, DXA technology has faced challenges and critics. Still, it remains the "Gold Standard" for testing bone mineral density, despite the high error rate reported by experts in the field.1 Technological and interpretation errors have the potential to affect patient care and can lead to harmful, unnecessary treatments. The following literature review highlights DXA's error-prone areas and what little has been done to improve quality and accuracy. In an article commemorating 30 years of DXA, Lewiecki and Binkley (2017) reflected on the early pioneers of the 1800s and the single-photon absorptiometry invention in 1963 by John Cameron and James Sorenson. The authors credited David Ellenbogen and Jay Stein with founding Hologic, the company that introduced DXA to the industry. Lewiecki and Binkley's article acknowledged the milestone in 1988, when DXA technology first received FDA approval. Paul Miller (2017) provided additional insight into DXA history by documenting the 1990 creation of the Society of Clinical Densitometry (SCD).
The society guided bone densitometry performance and interpretation. Miller detailed SCD's accomplishments, such as the 1995 publication of the first manuscript on bone density clinical interpretation guidelines. Miller's article reflected on the implementation of DXA certification courses and educational lectures for physicians and on the first annual SCD meeting in Virginia, with 300 attendees, in 1996. According to Miller, in 1997 the SCD changed its name to the International Society for Clinical Densitometry (ISCD) after Brazilians embraced the society and sparked global expansion. Accreditation programs, implementation of the peer-reviewed Journal of Clinical Densitometry, and pediatric assessment programs are just a few of the ISCD accomplishments that Miller highlighted in his article.2 The World Health Organization (WHO) has also helped shape DXA's future on the global stage. In their article titled "The role of DXA bone density scans in the diagnosis and treatment of osteoporosis," Blake and Fogelman (2007) discussed the past guidelines set forth by the WHO and a new algorithm for patient treatment based on fracture risk assessment. The authors listed many advantages of central DXA, including low radiation dose, availability of reliable reference ranges, and good precision. They also provided a detailed explanation of T- and Z-score calculations and the creation of the NHANES III reference database. Blake and Fogelman referenced the National Institute for Health and Clinical Excellence's (NICE) guideline development. They predicted that hip BMD exams would be used for treatment determination and that spine BMD results could help monitor patient response to treatment.3 In 2010, negative media attention and misreported facts thrust DXA into the spotlight in the United States.
According to an article by Lewiecki, Binkley, and Bilezikian (2018), titled "Stop the War on DXA!", the fallout from adverse reports may have hurt the DXA industry while shedding light on some questionable truths. The authors suggested that misleading media reports led to fewer facilities offering DXA scan technology and resulted in an increasing number of hip fractures and related deaths between 2012 and 2015. Lewiecki, Binkley, and Bilezikian's article debunked media claims that DXA is "overutilized, too expensive and not helpful in patient management." The authors also touched on the quality challenges that lead to poor DXA results, stating that "inadequate instrument maintenance, incorrect patient positioning, substandard analyses, and failure to understand current standards for interpretation and reporting can all contribute to a poor test."4

DXA Related Errors

Similar quality challenges have been well documented in many DXA articles. Lewiecki and Lane (2008) provided a thorough overview of common mistakes encountered with bone mineral density (BMD) testing and suggested strategies to reduce mistake frequency. The categories of error discussed in this article included exam indication, quality control, acquisition, analysis, and interpretation. Lewiecki and Lane incorporated a well-organized summary table to document potential DXA pitfalls. The authors also described the consequences of poorly executed exams, such as unnecessary or contraindicated therapeutic recommendations for patients.
Although this article is more than a decade old, the information remains valid and should be used as an educational resource for all healthcare providers, especially those interpreting DXA exams, those prescribing treatment based on DXA results, and technologists performing DXA scans.5 Articles by Morgan and Prater (2017), Watts (2004), and El Maghraoui and Roux (2008) all warned that DXA related errors resulting from inattention to detail could affect patient diagnoses and result in potentially harmful therapies.6,7,8 The article by Watts (2004), "Fundamentals and pitfalls of bone densitometry using dual-energy X-ray absorptiometry (DXA)," narrowed DXA related errors into four categories: incorrect patient data entry, poor patient positioning, scan mode errors, and software analysis errors.7 Morgan and Prater examined the sources of error even further and summarized over a dozen studies and trials related to DXA errors conducted over 20 years.6 The articles by Morgan and Prater, Watts, and El Maghraoui and Roux provide images and descriptions of proper DXA positioning, positioning errors, and examples of analysis errors. Many case examples related to the lumbar spine involved internal or external artifacts, incorrect vertebral-level comparisons, degenerative changes, poor positioning, or the inclusion or exclusion of vertebral bodies.6,7,8 Hips and lumbar vertebral bodies are the primary focus of most DXA scans. Therefore, it is no surprise that several published DXA studies focus on errors related to lumbar spine analysis. Rand, Seidl, Kainberger, et al. (1997) studied the effects of degenerative lumbar spine changes on the bone mineral density values of 144 postmenopausal women between 55 and 70 years old. The authors found that degenerative changes in the form of osteophytes, vascular calcifications, and osteochondrosis all influenced BMD measurements. Degenerative changes also appeared to increase in prevalence and severity with age.
Rand, Seidl, Kainberger, et al. recommended using radiographic lumbar spine evaluations and lateral lumbar DXA images and placing more emphasis on femoral neck values in women over 60 with lumbar degenerative changes.9 Similar findings were reported in a 2013 study that evaluated bone density in a comparable group of women followed over ten years. Tenne, McGuigan, Besjakov, Gerdhem, and Akesson (2013) evaluated the bone mineral density of 1,044 women aged 75 and performed follow-up DXA testing at ages 80 and 85. The results demonstrated that lumbar spine degenerative changes increase BMD values, especially when measuring lower-level vertebral bodies. The 2013 study also showed that excluding degenerative vertebral bodies during DXA analysis increased osteoporosis diagnoses. In conclusion, the study found that BMD in older women increased or remained stable between ages 75 and 85 due to degenerative changes.10 Some of the published studies focused on DXA errors found alarmingly high error rates. A Turkish research project by Karahan, Kaya, Kuran, et al. (2016) focused on common DXA mistakes in four specific areas. The categories examined included "1) indication errors, 2) lack of quality control and calibration, 3) analysis and interpretation errors, and 4) inappropriate acquisition techniques." Inclusion criteria for the retrospective analysis required each study to have a lumbar spine and hip measurement. Pediatric DXA exams, forearm analyses, and total body measurements were excluded from the study. Karahan, Kaya, Kuran, et al. found that 31.8% of lumbar spine evaluations and 49.0% of proximal femur exams contained errors. The authors seemed to attribute the "seriously high error rates" to the lack of educational standards for DXA operators.11 Another study focused on DXA errors was performed at the University of Milan. Messina, Bandirali, Sconfienza, et al.'s (2015) European study examined baseline DXA exams of 38 men and 447 women between the ages of 59 and 77. The study's error categories included data analysis, patient positioning, demographics, and artifacts. The study found that more than 90% of the DXA exams evaluated had at least one error. More than a third of those errors were related to data analysis, and patient positioning was the second most common error category.12 A more recent study performed in the United States by Krueger, Shives, Sigilinski, et al. (2019) hypothesized that "DXA errors are common and of such magnitude that incorrect clinical decisions might result." The authors conducted a two-phase study to determine if a reporting template would lower the rate of DXA interpretation errors. In phase 1, error analysis was conducted on 345 DXA scans by ISCD-certified technologists and physicians. During phase 2, the same physicians reviewed 200 reports created using the reporting template. Phase 1 results found technical errors on 90% of the exams and interpretation errors on 80% of exams. Inaccurate reporting of BMD change accounted for 70% of interpretation errors, followed by the second most common error, an incorrect diagnosis, which accounted for 22% of exams. Krueger, Shives, Sigilinski, et al. reported that 13% of the technical errors contributed to interpretation errors in phase 1. They also found that the reporting template's implementation in phase 2 reduced interpretation errors by 66%. Krueger, Shives, Sigilinski, et al. admitted that more process improvement studies are needed to investigate methods for improving DXA acquisition and analysis.13

Old Solutions

Krueger, Shives, Sigilinski, et al.'s study stands out from the other research because the authors implemented a way to improve DXA quality and proved that it worked.
Another report, by Wachsmann, Blain, Thompson, Cherian, and Browning (2018), evaluated the benefits of electronic data transfer versus manual data entry in the field of DXA. The authors described the manual entry of numerical DXA values through voice recognition software as an error-prone practice. They examined 200 preliminary DXA reports: one hundred reports were created with manual data entry, and one hundred utilized electronically imported data. Wachsmann, Blain, Thompson, Cherian, and Browning concluded that electronic data transfer improved both the accuracy and efficiency of DXA reporting.14 The recent studies by Krueger, Shives, Sigilinski, et al. and by Wachsmann, Blain, Thompson, Cherian, and Browning suggest there is hope for quality improvement in the field of DXA. Interestingly, expert suggestions for improving DXA quality did not mention implementing a reporting template or electronic data transfer.13,14 For example, articles published by Lewiecki and Binkley; Lewiecki, Binkley, and Bilezikian; Lewiecki and Lane; Lewiecki, Binkley, Morgan, et al. (2016); and Carey and Delaney (2017) all emphasize the importance of quality DXA standards and the potential impact interpretation errors may have on patients.1,4,5,15,16 Lewiecki, Binkley, and Bilezikian's article explains that reliable DXA results are achievable if technologists and radiologists engage in International Society for Clinical Densitometry (ISCD) educational courses and follow ISCD guidelines during DXA acquisition and interpretation.1 Krueger, Shives, Sigilinski, et al.
refer to one study that demonstrated that training and continuing education positively impacted DXA analysis and acquisition.12 However, according to Lewiecki, Binkley, Morgan, et al., "US state and local regulations do not require any specific qualifications for DXA interpretation."15 Morgan and Prater's report addressed the lack of regulation by suggesting pay-for-performance DXA standards to improve DXA quality, much as the Mammography Quality Standards Act has elevated quality in the field of mammography.6

New Possibilities

As the experts continue to promote education, certification, and quality standards, Krueger, Shives, Sigilinski, et al. reported that DXA interpreters failed to recognize 15% of technical errors and said, "this likely demonstrates either a lack of education on the part of technologists and interpreters or a low allocation of time or resources to optimize quality."13 In her article titled "Radiologists Burnout: Mission Accomplished," radiologist Dr. Czum (2019) discussed how a gradually increasing workload eventually leads to increased interpretation speed, cutting corners, and longer hours for radiologists. Czum states, "As work intensifies, compounded by focus-distracting interruptions, you experience fatigue, make more mistakes, and have less time for teaching, research, multidisciplinary conferences, committee work, self-directed learning, exercise, family, hobbies, volunteering, sleep. If the quality of your work and your life suffers, patients pay the price."17 If time allocation or resources contribute to the plethora of DXA errors, preliminary DXA interpretation by physician extenders could save radiologists time and promote accuracy in DXA acquisition and analysis. Physician extenders' role in radiology is summarized beautifully in a literature review by Vicki Sanders and Jennifer Flanagan (2015).
The midlevel providers included in the article were Nurse Practitioners (NPs), Physician Assistants (PAs), Radiologist Assistants (RAs), and Nuclear Medicine Advanced Associates (NMAAs). Sanders and Flanagan attributed the need for physician extenders to the increased demand for healthcare services and the shortage of available physicians. The authors' literature review discusses the history of the four provider types and summarizes their roles in a table titled "Clinical Duties and Procedures Performed by Physician Extenders in Medical Imaging." The duty of preparing initial image observations and reports falls under the practice scope of all physician extenders. Sanders and Flanagan note that PAs, NPs, and RAs are being utilized in a variety of ways to increase radiologist efficiency and to provide care for patients.18 Physician extenders' broad scope of practice allows for flexibility and versatility in the world of radiology. Wright, Killian, Johnston, et al. (2008) conducted a study to document radiology physician extenders' value and impact on radiologist productivity. The literature review for this study predicted that the roles of Registered Radiologist Assistants (RRAs) would become increasingly important as fewer radiology residents joined the workforce to replace retiring radiologists. Wright, Killian, Johnston, et al. found that the implementation of RRAs saved the radiologists 100.27 minutes per day, which translated into 100.27 additional minutes of image interpretation and $240,242.50 in image interpretation revenue annually. Patient satisfaction within radiology departments also improved with the implementation of RRAs. Wright, Killian, Johnston, et al.
(2008) found that RRAs could spend more time with patients, enhancing communication and patient education, which reduced post-procedure complications.19 A separate study involving preliminary image interpretation by an experienced x-ray technologist, called a Radiology Extender (RE), was conducted by Borthakur, Kneeland, and Schnall (2018). The study found that prereads of musculoskeletal images by the RE improved the radiologist's workflow rate significantly. The authors reported that the RE's workflow in their study translated to an additional 85 minutes of free time for the radiologist during a workday. Borthakur, Kneeland, and Schnall also reported, "Anecdotal evidence from surveying all six radiologists that participated in this study revealed that there was reduced staff fatigue from reading mundane radiographs using a RE."20

The Future of DXA Quality

DXA education and ISCD certification for technologists, interpreters, and clinics have done little to improve DXA accuracy over the past 30 years. Studies involving reporting templates and electronic data importing have successfully reduced some DXA interpretation errors. However, as stated by Krueger, Shives, Sigilinski, et al., "additional studies evaluating ways to improve scan acquisition and analysis are warranted."13 The versatility and flexibility of radiology physician extenders, as intermediaries between technologists and radiologists, make them logical candidates to promote radiology-related quality assurance. Future studies may demonstrate that preliminary DXA interpretation by physician extenders can improve all areas of DXA quality while also reducing radiologists' workload.

Purpose

The proposed quality assurance project was designed to show that resource allocation and process improvements are as integral to DXA interpretation accuracy as education and training.
DXA exams were retrospectively evaluated to measure the effect of different interpretation methods on DXA report accuracy. The study's goal was to show that preliminary DXA drafting by radiology physician extenders promotes exam quality and report accuracy. Dual-energy X-ray absorptiometry (DXA) has remained the "Gold Standard" test for measuring bone mineral density for more than three decades.1 Accurate DXA scan results help clinicians treat and monitor fracture risks related to low bone density and osteoporosis.13 Unfortunately, the prevalence of errors associated with DXA acquisition and interpretation is alarmingly high and leads to potentially inappropriate or harmful treatments for patients.1,5-8,11-13,15 The International Society for Clinical Densitometry (ISCD) has published guidelines for DXA technologists and interpreters to ensure "high-quality musculoskeletal health assessments in the service of superior patient care." The ISCD also offers certification courses for DXA technologists (Certified Bone Densitometry Technologist, CBDT) and interpreters (Certified Clinical Densitometrist, CCD). The CCD exam is available to "physicians, certified nurse practitioners, certified physician assistants, fellows, residents, and Ph.D.'s performing research in the field of bone density."21 X-ray technologists, Radiology Practitioner Assistants (RPAs), and Registered Radiologist Assistants (RRAs) are not eligible to take the CCD course or exam. Regardless of this possible missed opportunity by the ISCD to promote quality, there are no regulations in the United States requiring DXA interpreters to meet specific qualifications.15 Some might view this lack of control as problematic, but Radiology Imaging Associates (RIA) embraced the regulatory gap as an opportunity to improve workflow, DXA quality, and accuracy.
In response to increasing DXA exam volumes and noticeable report drafting delays, Radiology Imaging Associates' (RIA) Clinical Sciences Specialist began training physician extenders to draft preliminary DXA reports in the fall of 2016. The training and drafting process started as an informal tutorial and slowly evolved as the drafting team grew. Today, RIA has twelve informally trained non-physician DXA interpreters drafting over 300 preliminary reports weekly for radiologist approval. One hundred percent of DXA exams are assigned to RIA radiologists upon draft completion by non-physician contributors, otherwise known as DXA drafters. The drafting team is responsible for detecting technologist errors related to exam acquisition and following up on error resolution. Preliminary DXA drafting by physician extenders has provided radiologists more time to focus on critical exams and has eliminated reporting delays.

Theoretical Framework

This study's theoretical framework incorporated the institutional model of organizational theory and the total quality management (TQM) approach. The institutional model of organizational theory suggests that organizational change stems from the desire to achieve legitimacy according to industry standards and regulations. As quality standards are embraced by regulating bodies and competing entities, interindustry pressure to conform rises.22 In the healthcare setting, TQM is interchangeable with continuous quality improvement (CQI) and similarly influences organizational change. According to Hughes, "CQI has been used as a means to develop clinical practice and is based on the principle that there is an opportunity for improvement in every process and on every occasion."23 As it relates to TQM and CQI, the plan-do-study-act (PDSA) model for quality improvement has been successfully implemented by the Institute for Healthcare Improvement and will serve as the model for this project.
The PDSA cycle begins with problem identification and continues with the implementation of a process change. Results of the process change are then studied, and actions are taken to build on the findings. The cycle then starts over with a new plan for continuous quality improvement.23 Research and prior quality improvement studies suggest that DXA errors are common and that patients could benefit from CQI.1,5-8,11-13,15 Quantitative analysis of retrospective DXA reports defined the study step of the PDSA cycle. The results of data analysis determined how different interpretation methods related to the prevalence of DXA interpretation errors.

Purpose Statement

The purpose of this study was to measure the number of analysis and interpretation errors that resulted from two different methods of DXA interpretation. Retrospective DXA reports drafted by radiologists in 2015 and by physician extenders in 2019 were reviewed to determine the number of studies with at least one error, analysis errors, or BMD change errors.

Proposed Research Questions (RQ)

RQ1: Does preliminary DXA drafting by physician extenders promote error reduction and interpretation accuracy?
RQ2: How many analysis errors were present in 2015 vs. 2019?
RQ3: How many BMD change errors were present in 2015 vs. 2019?

Methods

This quantitative study used a descriptive research design to examine the effect of different interpretation methods on error prevalence in DXA reports. According to Ravid, "descriptive research is aimed at studying a phenomenon as it is occurring naturally, without any manipulation or intervention."24 The retrospective nature of this research eliminated the possibility of variable manipulation. The study evaluated DXA reports generated by two different interpretation methods: radiologist interpretation and preliminary drafting by physician extenders for approval by a radiologist.
As part of a quality assurance strategy, project findings were used to determine if one interpretation method significantly changed report accuracy.

Sample Overview

Nuance mPower Clinical Analytics software was used to select retrospective DXA reports from 2015 and 2019. The filtered search capability of mPower Clinical Analytics allows users to examine large volumes of stored data to find phrases in DXA reports such as "follow up exam."25 Identical search filters were used to determine the population size for exams performed in 2015 and 2019. The active filters included random sorting, the phrase "bone density compared with" in the search field, a start date of January 1, an end date of December 31, a starting age of 19 years, the OT modality, and inclusion of 22 of 23 Invision Sally Jobe (ISJ) sites. DXA exams at the Golden/Lakewood location were excluded for consistency because it is the only ISJ site to utilize GE DXA technology; technologists use Hologic DXA technology at all other ISJ locations. Filtered Nuance mPower searches found 8,702 exams for 2019 and 6,027 exams for 2015. The formula used to calculate the sample size of an infinite population was adjusted to determine the sample sizes needed for 2015 and 2019 to ensure a confidence level of 95% and a margin of error of 5%.26 Sample size calculations determined that 362 exams from the 2015 population and 369 exams from the 2019 population should be included in the study to achieve a 95% confidence level and 5% margin of error. The number of reviewed exams for this study was reduced to 400 due to time constraints. Data sets consisted of 200 exams interpreted by nine different radiologists in 2015 and 200 exams interpreted by 11 different physician extenders in 2019. The total population of 400 exams is comparable to other DXA related studies. In the published article titled "Prevalence and type of errors in dual-energy x-ray absorptiometry," Messina et al.
entered 485 patients in their dataset.12 Krueger et al.'s study "DXA Errors are Common and Reduced by Use of a Reporting Template" included 498 patients in their research project.13 Rand et al. enrolled 144 postmenopausal women in their investigation of the "Impact of Spinal Degenerative Changes on the Evaluation of Bone Mineral Density with Dual Energy X-ray Absorptiometry (DXA)."9

Instrumentation

ISCD guidelines and common errors found during previous research studies were implemented into the design of this study's data collection tool. For example, Krueger et al.'s DXA study reported that 70% of exams contained errors related to "incorrect interpretation of BMD change."13 Messina et al.'s research discovered analysis errors in 79% of reviewed DXA exams.12 Inspired by the studies above, this study's error categories included analysis errors and BMD change errors. A third category was added to document the number of exams containing errors in each data set. Data were collected and organized on a tally sheet using a nominal measurement scale to score errors (Fig. 1). The horizontal axis of the tally sheet included two categories of potential DXA errors. The first category, analysis errors, consisted of any positioning, analysis, or comparison errors made by the DXA technologist during exam acquisition. Incorrect region of interest (ROI) mapping or patient positioning of the hip, spine, and forearm were included in the analysis error category. The second category, BMD change errors, included reporting a significant change despite the presence of an analysis error, reporting no significant change when a significant change occurred, or incorrectly documenting a significant difference in BMD. The number of exams with one or more errors was the third column on the tally sheet's horizontal axis. Study numbers defined the tally sheet's vertical axis, and each number corresponded to an exam in one of the original raw data sets.
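The sample-size calculation described in the Sample Overview can be sketched as follows. This is a minimal illustration of the standard Cochran formula with a finite-population correction; it reproduces the reported 362 and 369 figures when the infinite-population size is rounded up to a whole exam (385) before the correction is applied, an assumption about the rounding order rather than something the source states.

```python
import math

def sample_size(population, z=1.96, p=0.5, margin=0.05):
    """Cochran sample-size formula with finite-population correction.

    z: z-score for the confidence level (1.96 for 95%)
    p: assumed population proportion (0.5 is the most conservative choice)
    margin: acceptable margin of error (0.05 for 5%)
    """
    # Infinite-population sample size, rounded up to a whole exam (385 here).
    n0 = math.ceil(z**2 * p * (1 - p) / margin**2)
    # Finite-population correction shrinks the requirement for smaller populations.
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# 2015 population of 6,027 exams and 2019 population of 8,702 exams:
print(sample_size(6027), sample_size(8702))  # → 362 369
```

Note how little the correction helps at these population sizes: both results sit close to the infinite-population value of 385, which is why reviewing only 200 exams per year fell short of the target confidence level.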
Filtered mPower searches returned 1,000 random DXA exams per data set in spreadsheet format, accommodating a vertical column of study numbers matching the tally sheet. Study numbers made data deidentification possible and ensured patient privacy during the data collection process. A final expanded version of the tally sheet included columns for drafter identification and subcategories of analysis and BMD change errors. Analysis errors were sorted by ROI, and BMD change errors were divided into mixed trend errors and other BMD change errors.

Figure 1: Final Data Collection Tally Sheet Sample

Data Analysis

Categorical data from reports drafted by the two interpretation methods, radiologist drafting and preliminary drafting by radiology physician extenders, were analyzed using a chi-squared test for homogeneity. Wilhelm Kirch described the formulas for calculating the chi-squared tests of homogeneity and independence as identical. Kirch claims, "the difference between these two tests consists of stating the null hypothesis, the underlying logic, and the sampling procedures."27 This study's null hypothesis stated that DXA errors were equally prevalent in reports drafted by the two interpretation methods. Rejection of the null hypothesis would suggest that one interpretation method produces significantly fewer errors.

Results

A chi-squared test of independence with a significance level of 5% was used to determine if error prevalence in each category depended on the DXA interpretation method. Tally sheet data, summarized in Table 1 and Figure 2, were analyzed using IBM SPSS Statistics version 24 (IBM Corp., Armonk, NY, USA), and 2015 error categories were compared to the respective 2019 error categories. The independent variables in this study were the interpretation methods: 2015 radiologist interpretation and 2019 PE interpretation.
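As a rough illustration of how a 2x2 chi-squared statistic is computed from category counts like these, here is a minimal pure-Python sketch. The counts below are illustrative only; the study's own statistics were produced in SPSS from its crosstabulation tables.

```python
def chi2_2x2(a, b, c, d):
    """Chi-squared statistic for a 2x2 contingency table.

    Rows are the two interpretation methods; columns are
    (exams with the error, exams without the error).
    Uses the closed-form identity for 2x2 tables:
    chi2 = n(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).
    """
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Illustrative: 30 of 100 exams with errors under method A, 20 of 100 under method B.
stat = chi2_2x2(30, 70, 20, 80)
# Compare stat against the chi-squared critical value with 1 degree of
# freedom (3.841 at the 5% significance level) to decide significance.
```

The same statistic is what SPSS reports as the Pearson chi-square in its output table; with one degree of freedom, any value below 3.841 fails to reach significance at the 5% level.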
Interpretation methods were represented by years in crosstabulation charts (Table 2, Table 4, and Table 6).

Table 1: Tally Sheet Summary of Categorical Errors Resulting from Two Interpretation Methods

                                   Analysis Errors    BMD Change Errors    Studies with Errors
                                   n      %           n      %             n      %
Radiologist Interpretation (2015)  72     36          47     23.5          79     39.5
PE Interpretation (2019)           45     22.5        29     14.5          60     30

Figure 2: Percentage of Errors According to Interpretation Methods

The first (2x2) chi-squared test showed no significant association between the number of analysis errors produced by radiologist interpretation, 72/200 (36%), and analysis errors produced by PEs, 45/200 (22.5%), χ²(1, N = 200) = 0.98, P = .323 (Table 3). No significant relationship was found between BMD change errors in radiologist interpretations, 47/200 (23.5%), and PE interpretations, 29/200 (14.5%), χ²(1, N = 200) = 1.78, P = .182 (Table 5). The final chi-squared analysis of studies with errors by radiologists, 79/200 (39.5%), and PEs, 60/200 (30%), also found no significant association, χ²(1, N = 200) = 0.049, P = .825 (Table 7). These findings indicate that the null hypothesis cannot be rejected: there is no significant association between DXA interpretation method and the number of errors found in reports.
Table 2: Analysis Errors Crosstabulation
Table 3: Analysis Errors Chi-Squared Tests
Table 4: BMD Change Errors Crosstabulation
Table 5: BMD Change Errors Chi-Squared Tests
Table 6: Studies with Errors Crosstabulation
Table 7: Studies with Errors Chi-Squared Tests

According to the tally sheet summary (Table 1), 19/200 (9.5%) more studies contained errors in 2015 than in 2019, and analysis errors were more prevalent than BMD change errors in both years. Greater than 50% of analysis errors involved the hip ROI in both sample sets (Fig. 3). Lumbar spine ROI errors were the second most prevalent type of analysis error, followed by forearm ROI errors.
BMD change errors were divided into two categories: incorrectly reported mixed trend, and all other BMD change errors (Table 8, Fig. 4). Incorrectly interpreted mixed trend accounted for 20/30 (67%) of BMD change errors in 2019 and 11/47 (23%) of BMD change errors in 2015 (Fig. 4). Other BMD change errors included incorrectly reporting a BMD change when an analysis error was present and providing a statistically significant statement for comparison exams performed at an outside facility. Other BMD change errors were found in 10/30 (33%) of PE interpretations and 36/47 (77%) of radiologist interpretations.

Table 8: Types of Analysis Errors & BMD Change Errors According to Interpretation Methods

                                   Radiologist Interpretation (2015)   PE Interpretation (2019)
Analysis Error Tally               n       %                           n       %
ROI - Hip                          52      57.8                        36      65.5
ROI - Lumbar spine                 26      28.9                        11      20.0
ROI - Forearm                      12      13.3                        8       14.5
Total Analysis Errors              90                                  55
BMD Change Tally                   n       %                           n       %
Incorrectly Reported Mixed Trend   11      23.4                        20      66.7
All Other BMD Change Errors        36      76.6                        10      33.3
Total BMD Change Errors            47                                  30

Figure 3: Comparison of Analysis Errors Categorized by ROI

Figure 4: Comparison of BMD Change Errors in 2015 and 2019 Data Sets

Figure 5: Percentage of 200 Reviewed Exams Drafted by Radiologists and PEs

Exam distribution between interpreters is demonstrated in Figure 5. One radiologist dictated 127/200 (63.5%) of exams, and one PE drafted 55/200 (27.5%) of exams. The remaining 73/200 (36.5%) of radiologist-drafted exams were distributed in single-digit percentages among the other eight radiologists. Figure 5 shows a more balanced distribution of the remaining 145/200 (72.5%) exams interpreted by the other ten physician extenders.

Discussion

This study indicates that DXA errors are equally prevalent in reports interpreted by radiologists and physician extenders.
RIA radiologists would likely praise the physician extenders' level of accuracy because the implementation of preliminary DXA drafting has given radiologists more time to focus on critical exams. However, these time-saving efforts did not improve accuracy significantly. Analysis and BMD change errors were persistent and consistent with studies by Karahan et al.,11 Messina et al.,12 and Krueger et al.13 Findings also reinforced Krueger et al.'s sentiment that "additional studies evaluating ways to improve scan acquisition and analysis are warranted."13 In the analysis error category, hip ROI mapping errors were the most prevalent, followed by spine ROI errors. Hip ROI error frequency was possibly magnified by the study filter requirements ensuring comparison exams for all studies. For example, serial exams of the hip were frequently found to have been analyzed differently from one year to the next. ROI mapping dimensions matched prior exams for some studies, but occasionally the dimensions varied by more than 5 data pixels, especially the dimension corresponding to the femoral shaft length. DXA technologists were unable to provide information about ROI dimensions despite referencing the Hologic DXA manual. When asked about ROI matching of comparison exams, two RIA DXA technologists confirmed they were taught to "eyeball" the prior exam and try to match the ROI visually. Further inquiry about ROI dimensions led to email responses from Hologic Application Support Specialist Laura Grenier. According to Grenier, ROI dimensions correspond to data pixels. She explained that DXA technologists and interpreters are responsible for determining whether ROI mapping is correct and whether the exam should be auto-compared to the most recent exam or to the baseline exam. Unfortunately, this study's retrospective nature did not permit exam reanalysis to determine the significance of erroneous ROI mapping.
It seems reasonable to assume the inclusion of 16 additional data pixels on one of two comparison exams would lead to inaccurate BMD change findings, as likely demonstrated in Figure 6. ROI mapping errors were also present in spine and forearm exams, which also have data pixel dimensions provided with images. Perhaps data pixel matching could help technologists more accurately map ROI for comparison exams?

Scan Date   BMD     T-Score   BMD Change (g/cm2) vs Previous
2015        0.838   -0.9      0.036 (4.6%)
2012        0.801   -1.2      -0.021 (-2.6%)

Figure 6: Comparison hip images from the same patient obtained in 2015 and 2012. Blue arrows correlate with data pixels included in the vertical ROI dimension. Compared to 2012, 16 additional data pixels of length are included in the 2015 analysis. Results of the scan show the patient's BMD has increased by 0.036 g/cm2 (4.6%). For this facility, a change of 0.028 g/cm2 is statistically significant. This increase is likely artifactual due to the erroneous inclusion of 16 data pixels of femoral shaft length in the 2015 ROI mapping.

Incorrect use of the mixed trend macro was a running theme in exams interpreted by radiologists and PEs. Four DXA interpreters, two radiologists and two PEs, were responsible for all of the mixed trend errors found during this study. One radiologist made 9 of the 11 mixed trend mistakes, and one PE was responsible for 19 of the 20 similar errors. This error type can be quickly resolved by simply informing the drafters that a BMD change is only reported when it exceeds the established least significant change (LSC) value for the specific ROI.15 No particular pattern was identified among radiologists or PEs for the occurrence of BMD change errors.

Recommendations

Future studies are recommended in the area of DXA analysis and data pixel ROI matching. For starters, a research project is needed to test the threshold of data pixel error.
Five data pixels was arbitrarily chosen as the error threshold for this study; however, testing is required for accuracy purposes. Hip ROI errors for this study may have been inflated due to unknown data pixel ROI error ranges. Therefore, an analysis of inconsistent data pixel comparison exams is recommended before a version of this project is repeated.

Limitations

The critical limitations of this study were time and human resources. Data review help from experienced DXA drafters, or additional time, would have allowed for a review of the intended 731 DXA exams. At a review rate of approximately ten exams an hour, one researcher would need around 73 hours to review the sample size required to achieve a 95% confidence level and 5% margin of error for this study. A future version of this project should involve a team of DXA drafters to assist with data review. Additional research time and drafting assistance would also have made instrument validation a realization rather than a far-reaching goal. Therefore, the inability to prove instrument validity and reliability is another limitation of this project.

Conclusion

Research has shown that DXA errors are common, and very few studies have been done to test improvement methods.13 This study provided insight into a process improvement method that had yet to be investigated or measured. Different interpretation methods do not significantly reduce DXA report errors. However, this study demonstrated that PE interpretations contained fewer errors in all reported categories than reports drafted by radiologists. This finding suggests that the ISCD should consider credentialing pathways for advanced radiologic technologists to become certified DXA interpreters. The benefits of DXA drafting by physician extenders have not been fully realized. Additional training and regular quality assurance would benefit the entire team of RIA DXA drafters and DXA technologists.
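The LSC rule cited in the Discussion is simple enough that a quality-assurance review could check it automatically. Below is a minimal sketch; the hip LSC of 0.028 g/cm2 is the facility value quoted with Figure 6, while the function name and the other per-ROI values are illustrative placeholders a real facility would replace with its own precision-assessment results.

```python
# Per-ROI least significant change (LSC) values in g/cm2. Only the hip
# value comes from the source (Figure 6); the others are hypothetical.
LSC = {"hip": 0.028, "lumbar spine": 0.030, "forearm": 0.025}

def bmd_change_report(roi, current_bmd, prior_bmd):
    """Return the BMD change and whether it exceeds the ROI's LSC.

    Per ISCD guidance, a change should only be reported as significant
    when its magnitude exceeds the least significant change for that ROI.
    """
    change = current_bmd - prior_bmd
    return change, abs(change) > LSC[roi]

# Figure 6 example: hip BMD 0.838 in 2015 vs 0.801 in 2012. The roughly
# 0.037 g/cm2 increase exceeds the 0.028 g/cm2 LSC, so it would be flagged
# as significant, which is exactly why the upstream ROI mapping error in
# Figure 6 matters.
change, significant = bmd_change_report("hip", 0.838, 0.801)
```

A check like this would not catch the ROI mapping error itself, but it would prevent the mixed trend macro misuse described above, where sub-LSC changes were reported as significant.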
Periodic review of retrospective DXA exams is necessary to pinpoint patterns of error that can easily be corrected. PEs can also improve DXA quality by calling technologists and requesting corrections when ROI mapping errors are discovered. PEs should not settle for being as accurate as radiologists when they can eventually be statistically better.

References

1. Lewiecki EM, Binkley N. DXA: 30 years and counting: Introduction to the 30th anniversary issue. Bone. 2017;104:1-3. doi:10.1016/j.bone.2016.12.013
2. Miller PD. The history of bone densitometry. Bone. 2017;104:4-6. doi:10.1016/j.bone.2017.06.002
3. Blake GM, Fogelman I. The role of DXA bone density scans in the diagnosis and treatment of osteoporosis. Postgrad Med J. 2007;83(982):509-517. doi:10.1136/pgmj.2007.057505
4. Lewiecki EM, Binkley N, Bilezikian JP. Stop the war on DXA! Ann N Y Acad Sci. 2018;1433:12-17. doi:10.1111/nyas.13707
5. Lewiecki EM, Lane NE. Common mistakes in the clinical use of bone mineral density testing. Nat Clin Pract Rheumatol. 2008. doi:10.1038/ncprheum0928
6. Morgan SL, Prater GL. Quality in dual-energy X-ray absorptiometry scans. Bone. 2017;104:13-28. doi:10.1016/j.bone.2017.01.033
7. Watts NB. Fundamentals and pitfalls of bone densitometry using dual-energy X-ray absorptiometry (DXA). Osteoporos Int. 2004;15(11):847-854. doi:10.1007/s00198-004-1681-7
8. El Maghraoui A, Roux C. DXA scanning in clinical practice. QJM. 2008;101(8):605-617. doi:10.1093/qjmed/hcn022
9. Rand T, Seidl G, Kainberger F, et al. Impact of Spinal Degenerative Changes on the Evaluation of Bone Mineral Density with Dual Energy X-ray Absorptiometry (DXA). Calcif Tissue Int. 1997;60(5):430-433. doi:10.1007/s002239900258
10. Tenne M, McGuigan F, Besjakov J, Gerdhem P, Åkesson K. Degenerative changes at the lumbar spine - implications for bone mineral density measurement in elderly women. Osteoporos Int. 2013;24(4):1419-1428.
doi:10.1007/s00198-012-2048-0
11. Karahan AY, Kaya B, Kuran B, et al. Common Mistakes in the Dual-Energy X-ray Absorptiometry (DXA) in Turkey: A Retrospective Descriptive Multicenter Study. Acta Med (Hradec Kralove). 2016;59(4):117-123. doi:10.14712/18059694.2017.38
12. Messina C, Bandirali M, Sconfienza LM, et al. Prevalence and type of errors in dual-energy x-ray absorptiometry. Eur Radiol. 2015;25(5):1504-1511. doi:10.1007/s00330-014-3509-y
13. Krueger D, Shives E, Siglinsky E, et al. DXA Errors Are Common and Reduced by Use of a Reporting Template. J Clin Densitom. 2019;22(1):115-124. doi:10.1016/j.jocd.2018.07.014
14. Wachsmann J, Blain K, Thompson M, Cherian S, Oz OK, Browning T. Electronic Medical Record Integration for Streamlined DXA Reporting. J Digit Imaging. 2018;31(2):159-166. doi:10.1007/s10278-017-0023-1
15. Lewiecki EM, Binkley N, Morgan SL, et al. Best Practices for Dual-Energy X-ray Absorptiometry Measurement and Reporting: International Society for Clinical Densitometry Guidance. J Clin Densitom. 2016;19(2):127-140. doi:10.1016/j.jocd.2016.03.003
16. Carey JJ, Delaney MF. Utility of DXA for monitoring, technical aspects of DXA BMD measurement and precision testing. Bone. 2017;104:44-53. doi:10.1016/j.bone.2017.05.021
17. Czum JM. Radiologist Burnout: Mission Accomplished. J Am Coll Radiol. 2019;16(10):1506-1508. doi:10.1016/j.jacr.2019.06.021
18. Sanders VL, Flanagan J. Radiology physician extenders: A literature review of the history and current roles of physician extenders in medical imaging. J Allied Health. 2015;44(4):219-224.
19. Wright DL, Killion JB, Johnston J, et al. RAs increase productivity. Radiol Technol. 2008;79(4):365-370.
20. Borthakur A, Kneeland JB, Schnall MD. Improving Performance by Using a Radiology Extender. J Am Coll Radiol. 2018;15(9):1300-1303. doi:10.1016/j.jacr.2018.03.051
21.
The International Society for Clinical Densitometry. Official positions of the International Society for Clinical Densitometry. https://iscd.org/learn/official-positions/adult-positions/. Published 2019. Accessed December 1, 2020.
22. Shojania KG, McDonald KM, Wachter RM, Owens DK. Closing The Quality Gap: A Critical Analysis of Quality Improvement Strategies, Volume 1 - Series Overview and Methodology. Technical Review 9 (Contract No. 290-02-0017 to the Stanford University-UCSF Evidence-based Practices Center). AHRQ Publication No. 04-0051-1. Rockville, MD: Agency for Healthcare Research and Quality; August 2004.
23. Hughes R. Section VI: Tools for Quality Improvement and Patient Safety. In: Patient Safety and Quality: An Evidence-Based Handbook for Nurses. 2008;3:3-3, 3-4.
24. Ravid R. Practical Statistics For Educators. 5th ed. Lanham, Maryland: Rowman & Littlefield; 2015.
25. mPower Quantitative Findings. Nuance website. https://www.nuance.com/content/dam/nuance/en_us/collateral/healthcare/data-sheet/ds-mpower-quantitative-findings-analysis-en-us.pdf. Updated May 2018. Accessed March 28, 2020.
26. Kumar GNS. How to determine the Sample Size? [Video]. YouTube. https://www.youtube.com/watch?v=51NS0cGjBIk. Published August 19, 2017. Accessed March 8, 2020.
27. Kirch W, ed. Test of homogeneity, chi-square. In: Encyclopedia of Public Health. Dordrecht: Springer Netherlands; 2008:1386. doi:10.1007/978-1-4020-5614-7_3475

Appendix A.
Format | application/pdf |
ARK | ark:/87278/s6amhwy0 |
Setname | wsu_smt |
ID | 96822 |
Reference URL | https://digital.weber.edu/ark:/87278/s6amhwy0 |