Health Care Policy and Quality • Original Research

Technologist-Directed Repeat Musculoskeletal and Chest Radiographs: How Often Do They Impact Diagnosis?

Andrew B. Rosenkrantz, Jill E. Jacobs, Nidhi Jain, Geraldine Brusca-Augello, Michael Mechlin, Marc Parente, Michael P. Recht

Rosenkrantz AB, Jacobs JE, Jain N, et al. Technologist-directed repeat musculoskeletal and chest radiographs: how often do they impact diagnosis? AJR 2017; 209:1–5

Keywords: quality improvement, radiation dose, radiography, technologists

DOI: 10.2214/AJR.17.18030

Received February 1, 2017; accepted after revision April 7, 2017.

Based on a presentation at the Society of Skeletal Radiology 2016 annual meeting, New Orleans, LA.

All authors: Department of Radiology, Center for Biomedical Imaging, NYU School of Medicine, NYU Langone Medical Center, 660 First Ave, 3rd Fl, New York, NY 10016. Address correspondence to A. B. Rosenkrantz ([email protected]).

AJR 2017; 209:1–5 0361–803X/17/2096–1 © American Roentgen Ray Society

OBJECTIVE. Radiologic technologists may repeat images within a radiographic examination because of perceived suboptimal image quality, excluding these original images from submission to a PACS. This study assesses the appropriateness of technologists' decisions to repeat musculoskeletal and chest radiographs as well as the utility of repeat radiographs in addressing examinations' clinical indication.

MATERIALS AND METHODS. We included 95 musculoskeletal and 87 chest radiographic examinations in which the technologist repeated one or more images because of perceived image quality issues, rejecting original images from PACS submission. Rejected images were retrieved from the radiograph unit and uploaded for viewing on a dedicated server. Musculoskeletal and chest radiologists reviewed rejected and repeat images in their timed sequence, in addition to the studies' remaining images. Radiologists answered questions regarding the added value of repeat images.

RESULTS. The reviewing radiologist agreed with the reason for rejection for 64.2% of musculoskeletal and 60.9% of chest radiographs. For 77.9% and 93.1% of rejected radiographs, the clinical inquiry could have been satisfied without repeating the image. For 75.8% and 64.4%, the repeated images showed improved image quality. Only 28.4% and 3.4% of repeated images were considered to provide additional information that was helpful in addressing the clinical question.

CONCLUSION. Most repeated radiographs (chest more so than musculoskeletal radiographs) did not add significant clinical information or alter diagnosis, although they did increase radiation exposure. The decision to repeat images should be made after viewing the questionable image in context with all images in a study and might best be made by a radiologist rather than the performing technologist.

Current digital radiography systems allow the technologist to quickly and easily repeat individual images that the technologist considers to be suboptimal [1]. The rejected images are typically not stored on the PACS or ever seen by the radiologist, who thus has no awareness that these images were obtained. On one hand, technologists are encouraged to assess the quality of their images and to ensure that optimal studies are obtained for subsequent interpretation by a radiologist. On the other hand, the additional images are a source of further radiation and prolonged examination time [1], such that discretion is warranted regarding the frequency of obtaining repeat images. The current radiology workflow, whereby radiologists see suboptimal images that are not repeated but are unaware of those that are repeated, encourages technologists to err on the side of caution and simply repeat a radiograph when in doubt, potentially leading to high rates of repeated radiographs. Nonetheless, a significant fraction of repeated radiographs may in fact have been of diagnostic quality [2].

Musculoskeletal imaging and chest imaging represent two areas of radiology in which radiography plays a front-line role in directing patient management. The aim of this study is to assess the appropriateness of technologists' decisions to repeat musculoskeletal and chest radiographic examinations as well as the utility of the repeat radiographs in addressing the examination's clinical indication.

Materials and Methods

This retrospective HIPAA-compliant study was approved by our institutional review board, with a waiver given for the requirement for written informed consent.


As a result of an earlier quality improvement initiative (Jacobs JE, et al., presented at the Radiological Society of North America 2015 annual meeting) conducted at our institution, technologists record in a dedicated log for each unit a reason for all repeated radiographs. A departmental information technology manager reviewed the logs to identify 100 musculoskeletal radiographic examinations and 100 chest radiographic examinations of adult patients for which the technologist elected to repeat one or more images because of perceived image quality issues, rejecting the original image from formal PACS submission. For all examinations for which the rejected images were available on the console at the time of log review, the manager retrieved the rejected images from the radiograph unit and uploaded them to a dedicated server to allow subsequent viewing by radiologists.

The musculoskeletal radiographs were obtained in February 2015 from eight radiograph units located at an outpatient musculoskeletal imaging center. The chest radiographs were obtained during October through December 2015 from eight inpatient radiograph units at an affiliated hospital; the chest radiographs were obtained intermittently during this longer period, given the hospital's more remote location.

A musculoskeletal radiologist and a chest radiologist retrospectively reviewed the rejected images and the repeat images in their timed sequence for the musculoskeletal and chest examinations, respectively, in addition to the remaining images obtained as part of each study. The images were viewed using a standard PACS workstation. The radiologists were also provided with the reason recorded by the technologist for repeating the image. Examinations were excluded if the rejected image could not be retrieved from the server or if the reason for rejection entered by the technologist was overly general (e.g., error), resulting in a final cohort of 95 musculoskeletal radiographs (including the cervical spine [n = 4], thoracic spine [n = 3], lumbar spine [n = 9], sacrum or coccyx [n = 4], shoulder [n = 13], humerus [n = 1], hand [n = 1], pelvis [n = 18], hip [n = 17], femur [n = 3], knee [n = 19], ankle [n = 2], and foot [n = 1]) and 87 chest radiographs (including 62 portable examinations).

The radiologists assessed each case for the following features: agreement with the provided reason for rejecting the image, whether the study would have addressed the given clinical question without repeating the image, whether the repeat image showed improved image quality compared with the rejected image, and whether the repeat image provided additional information that helped answer the clinical question. In addition, for both musculoskeletal and chest examinations, the total radiation exposure was computed for all repeat images, for only those repeat images for which the radiologist agreed with the reason for repeating, and for only those repeat images for which the radiologist indicated that the clinical inquiry could not be addressed based on the original images. This was determined by multiplying the number of individual repeat images for each examination by the per-image radiation exposure for the given examination type, on the basis of published dose estimates [3, 4].

Given potential bias related to the readers being informed that all examinations contained rejected images, an additional analysis was performed at least 6 months after the previously described evaluation. The musculoskeletal and chest radiologists each reviewed a set of 50 radiographs: 25 individual rejected radiographs that were selected at random from the aforementioned sets of cases and 25 individual radiographs randomly selected from the departmental PACS that had not been repeated at the time of the initial examination. For the musculoskeletal radiographs, the distributions of specific examination types were matched between the two groups (e.g., the same number of repeated and nonrepeated knee radiographs). The rejected and nonrejected radiographs were intermixed in random order, and the radiologists were blinded as to the group to which each image belonged. The radiologists were asked whether, from an image quality standpoint, they would have advised that the image be repeated. The radiologists' decisions regarding whether to repeat each image were compared with the technologists' decisions. All results were summarized in descriptive fashion using software (Excel for Windows 2010, Microsoft).
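To make the exposure estimate concrete, the sketch below (in Python, not part of the original article) multiplies the number of repeat images in each examination by an assumed per-image effective dose and sums the result. The dose values and example counts are hypothetical placeholders for illustration only, not the published estimates used in the study [3, 4].

```python
# Minimal sketch of the repeat-image dose arithmetic described above:
# total added exposure = sum over examinations of
#   (number of repeat images) x (per-image effective dose for that exam type).
# The per-image doses and example counts below are illustrative assumptions,
# not the published estimates [3, 4] used in the study.

DOSE_PER_IMAGE_MSV = {        # assumed per-image effective doses, in mSv
    "chest (portable)": 0.02,
    "pelvis": 0.6,
    "lumbar spine": 1.5,
}

def total_repeat_dose_msv(repeats):
    """Sum (repeat image count x per-image dose) over all examinations."""
    return sum(count * DOSE_PER_IMAGE_MSV[exam] for exam, count in repeats)

# Hypothetical example: three examinations with one or two repeated images each
example = [("chest (portable)", 1), ("pelvis", 2), ("lumbar spine", 1)]
print(f"Estimated additional exposure: {total_repeat_dose_msv(example):.2f} mSv")
```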

Results

The most common reasons for repeat musculoskeletal radiographs, as reported by technologists, were as follows: anatomy cutoff (n = 34), patient positioning (n = 21), obstructed view (n = 11), rotation (n = 9), centering of the tube or grid (n = 6), and motion (n = 6). The most common reasons for repeat chest radiographs were as follows: anatomy cutoff (n = 59), obstructed view (n = 8), rotation (n = 6), and motion (n = 5). All other reasons for either musculoskeletal or chest radiographs were provided for fewer than five examinations.

The reviewing radiologist agreed with the provided reason for rejection for 64.2% of musculoskeletal and 60.9% of chest radiographs (Table 1). For 77.9% and 93.1% of rejected radiographs, the clinical inquiry could have been satisfied without repeating the image. For 75.8% and 64.4% of all rejected musculoskeletal and chest images, respectively, the repeat images showed improved image quality. Only 28.4% and 3.4% of repeat images were considered to provide additional information that was helpful in addressing the clinical question.

For chest radiographs, the estimated total radiation exposure associated with all repeat radiographs was 2.24 mSv; the exposure associated only with radiographs for which the radiologist agreed with the reason for repeating the image was 1.27 mSv; and the exposure associated only with radiographs for which the radiologist thought that the clinical inquiry could not be addressed based on the original images was 0.14 mSv. For musculoskeletal radiographs, the corresponding values were 25.1 mSv, 11.3 mSv, and 5.4 mSv, respectively.

When blinded to whether individual radiographs were or were not repeated by the technologist, the musculoskeletal radiologist indicated that 56% of repeated radiographs should have been repeated from an image quality standpoint, compared with none of the nonrepeated radiographs. The chest radiologist indicated that 52% of repeated radiographs should have been repeated from an image quality standpoint, compared with none of the nonrepeated radiographs. Figures 1 and 2 show representative musculoskeletal radiographs, and Figures 3 and 4 show representative chest radiographs.

TABLE 1: Responses to Questions Used in Assessment of Repeat Radiographs by Radiologists

Question | Musculoskeletal | Chest
Agree with reason for rejection? | 64.2 (61/95) | 60.9 (53/87)
Could clinical question be answered without repeating image? | 77.9 (74/95) | 93.1 (81/87)
If answer to prior question was yes, was this based on rejected image (as opposed to other images)? | 58.1 (43/74) | 100.0 (81/81)
Did repeat image show improved image quality? | 75.8 (72/95) | 64.4 (56/87)
Did repeat image provide additional information to aid diagnosis? | 28.4 (27/95) | 3.4 (3/87)

Note—Data are percentage of radiographs (no. of radiographs/total radiographs).


Fig. 1—54-year-old man with prior knee arthroplasty who underwent radiography to assess for hardware loosening. A, Initial radiograph excludes inferior aspect of tip of tibial hardware component. Technologist rejected image. B, Repeat radiograph shows hardware in its entirety and was deemed to improve diagnostic confidence.

Discussion

We assessed outcomes related to technologist-directed repeat musculoskeletal and chest radiographs. The radiologist frequently did not agree with the reason for repeating the image, and the clinical question frequently could have been answered even without repeating the image. The repeat image showed better image quality than the rejected image in 64–76% of cases and provided additional information that aided the diagnosis in only 3–28% of cases. Moreover, when shown repeated and nonrepeated images in a blinded fashion, the radiologists identified an image quality issue with only approximately half of the repeated images. Collectively, the findings raise concern regarding technologist-directed repeat radiographs as a source of increased radiation exposure without necessarily a concomitant improvement in diagnostic value.

Past studies have also evaluated a range of outcomes related to technologist-directed repeat radiographs. However, such earlier studies have largely focused on determining the repeat radiograph rate among all examinations at an institution [5–9], stratifying repeats by an array of variables (e.g., body region, imaging system, and reason for repeating the image) [1, 10, 11], and developing solutions for facilitating repeat radiograph analysis [12–15].


Fig. 2—71-year-old woman with hip pain who underwent pelvic radiography. A, Initial radiograph excludes proximal aspect of femoral diaphyses. Technologist rejected image. B, Repeat radiograph shows greater extent of proximal femurs. Repeat image was deemed not to improve diagnostic yield.


Fig. 3—82-year-old man in ICU who underwent postoperative chest radiography. A, Initial radiograph excludes upper abdomen. Technologist rejected image. B, Repeat radiograph shows upper abdomen. Repeat image was deemed not to improve diagnostic yield.

To our knowledge, a detailed case-by-case analysis of the actual rejected and repeated images has not previously been performed in the fashion we currently present. This unique approach is critical for highlighting the reality that such repeat images may in fact be unneeded in the first place.

The very low rate of additional images adding diagnostic value relates to the frequency with which the specific clinical question could have been answered based on the other views in combination with the suboptimal view. Indeed, common indications for musculoskeletal and chest radiography, such as assessment for a fracture or confirmation of line or catheter placement, may be confidently answered by the radiologist in many instances even in the presence of suboptimal patient positioning or patient motion during acquisition of a single image.


Fig. 4—55-year-old man with fever who underwent chest radiography. A, Initial radiograph shows artifact from patient’s necklace. Technologist rejected image. B, Repeat radiograph was obtained after removal of necklace. Repeat image was deemed not to improve diagnostic yield.


Our findings suggest that the decision to repeat an image is best made in the context of all the available images for the study, along with awareness of the clinical question. Thus, new workflows may be warranted to decrease the rate of repeat radiographs. For example, although requiring more time and oversight, the decision may best be made by a supervising radiologist or a dedicated senior technologist, rather than by the performing technologist, accessing the potential image to be repeated through a remote workstation; an alternative workflow could entail not performing any repeat radiographs until the examination has been viewed in its entirety by the interpreting radiologist. Note that the lack of added value of repeat imaging was more pronounced for chest radiographs than for musculoskeletal radiographs, possibly relating to the value of multiple views for assessing the more complex anatomy in musculoskeletal imaging. Thus, for chest imaging in particular, it may be warranted for technologists to exercise even greater discretion in repeating images or for direct radiologist oversight to be strongly encouraged. Such efforts would be expected to lower the rate of repeat radiographs, potentially considerably, compared with its present level (3.4% per examination, on the basis of an earlier investigation from our institution [Jacobs JE, et al., Radiological Society of North America 2015 annual meeting]).

The present study is limited by the fact that the assessments of the added diagnostic value of the repeated images were inherently subjective in nature. In addition, in this pilot work we have not actually implemented our proposed solutions for addressing the issue. We also have not stratified our results by individual performing technologists, which could be an additional source of variation in our findings.

In conclusion, most repeated radiographic images (chest more so than musculoskeletal) did not add significant clinical information or lead to an altered diagnosis, despite adding to the radiation dose to which the patient was exposed. Although a larger study is needed to confirm these findings, the present study suggests that the decision to repeat images should be made only after viewing the questionable image in the context of all the images in a study and the particular clinical indication. This decision may best be made by a radiologist rather than by the performing technologist.

References

1. Little KJ, Reiser I, Liu L, et al. Unified database for rejected image analysis across multiple vendors in radiography. J Am Coll Radiol 2017; 14:208–216
2. Dunn M, Rogers A. X-ray film reject analysis as a quality indicator. Radiography 1998; 4:29–31
3. Mettler FA Jr, Huda W, Yoshizumi TT, Mahesh M. Effective doses in radiology and diagnostic nuclear medicine: a catalog. Radiology 2008; 248:254–263
4. RAdiation Dose Assessment Resource (RADAR). RADAR medical procedure radiation dose calculator and consent language generator. RADAR website. www.doseinfo-radar.com/RADARDoseRiskCalc.html. Accessed March 6, 2017
5. Al-Malki MA, Abulfaraj WH, Bhuiyan SI, Kinsara AA. A study on radiographic repeat rate data of several hospitals in Jeddah. Radiat Prot Dosimetry 2003; 103:323–330
6. Andersen ER, Jorde J, Taoussi N, Yaqoob SH, Konst B, Seierstad T. Reject analysis in direct digital radiography. Acta Radiol 2012; 53:174–178
7. Fintelmann F, Pulli B, Abedi-Tari F, et al. Repeat rates in digital chest radiography and strategies for improvement. J Thorac Imaging 2012; 27:148–151
8. Foos DH, Sehnert WJ, Reiner B, Siegel EL, Segal A, Waldman DL. Digital radiography reject analysis: data collection methodology, results, and recommendations from an in-depth investigation at two hospitals. J Digit Imaging 2009; 22:89–98
9. Weatherburn GC, Bryan S, West M. A comparison of image reject rates when using film, hard copy computed radiography and soft copy images on picture archiving and communication systems (PACS) workstations. Br J Radiol 1999; 72:653–660
10. Peer S, Peer R, Walcher M, Pohl M, Jaschke W. Comparative reject analysis in conventional film-screen and digital storage phosphor radiography. Eur Radiol 1999; 9:1693–1696
11. Akhtar W, Aslam M, Ali A, Mirza K, Ahmad N. Film retakes in digital and conventional radiography. J Coll Physicians Surg Pak 2008; 18:151–153
12. Nol J, Isouard G, Mirecki J. Digital repeat analysis: setup and operation. J Digit Imaging 2006; 19:159–166
13. Jones AK, Polman R, Willis CE, Shepard SJ. One year's results from a server-based system for performing reject analysis and exposure analysis in computed radiography. J Digit Imaging 2011; 24:243–255
14. Prieto C, Vano E, Ten JI, et al. Image retake analysis in digital radiography using DICOM header information. J Digit Imaging 2009; 22:393–399
15. Tzeng WS, Kuo KM, Liu CF, Yao HC, Chen CY, Lin HW. Managing repeat digital radiography images: a systematic approach and improvement. J Med Syst 2012; 36:2697–2704
