ORIGINAL ARTICLE
Year: 2019 | Volume: 2 | Issue: 2 | Page: 30-33

Radiographs reject analysis in a large tertiary care hospital in Riyadh


1 Department of Medical Imaging, King Abdulaziz Medical City, Riyadh, Saudi Arabia
2 Department of Medical Imaging, King Saud bin Abdulaziz University, Riyadh, Saudi Arabia

Date of Web Publication: 2-Apr-2019

Correspondence Address:
Khalid A Alyousef
King Abdulaziz Medical City, Riyadh
Saudi Arabia

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/JQSH.JQSH_24_18


Abstract

Background: Analysis of rejected radiographs is an important quality indicator for any radiology department. At King Abdulaziz Medical City (KAMC), about 185,000 radiographs are performed annually.

Methods: Rejected radiographs over a period of 5 years were analyzed using a dedicated electronic rejection system. Rejection is performed by a certified radiologist and communicated electronically to the concerned technologist.

Results: A total of 455 rejected radiographs were reviewed and analyzed. Of these, 247 were for adults (60%) and 166 for pediatric patients (40%). In terms of sex, 231 (56%) of the rejected radiographs were for men and 182 (44%) for women. The most common reason for rejection was labeling (22%), followed by procedure protocol (20%). Other reasons included positioning (14%), processing (14%), artifacts (13%), wrong documentation (9%), and exposure error (6%). Rejection due to exposure error was very low (6%) owing to the use of digital systems, which offer wide exposure latitude; data reported from hospitals that use analog systems show up to 67% of rejections due to exposure error. In terms of body parts, the highest rejection was for extremities (43%), followed by chest (31%); the remaining rejected radiographs included abdomen (9%), spine (8%), pelvis (5%), and head and neck (4%).

Conclusion: The outcome of this study can be used to set up training programs to improve radiological services and reduce unnecessary radiation exposure to patients.

Keywords: Exposure, Peervue, quality assurance, radiograph, reject analysis, unnecessary radiation


How to cite this article:
Alyousef KA, Alkahtani S, Alessa R, Alruweili H. Radiographs reject analysis in a large tertiary care hospital in Riyadh. Glob J Qual Saf Healthc 2019;2:30-3

How to cite this URL:
Alyousef KA, Alkahtani S, Alessa R, Alruweili H. Radiographs reject analysis in a large tertiary care hospital in Riyadh. Glob J Qual Saf Healthc [serial online] 2019 [cited 2019 Jun 25];2:30-3. Available from: http://www.jqsh.org/text.asp?2019/2/2/30/255337




  Introduction


In diagnostic imaging, one of the main goals of a quality assurance (QA) program is to produce consistently high-quality radiographs at minimum exposure to the patient.[1] Reject analysis is a complementary part of the QA program in a radiology department.[2],[3],[4] A considerable number of radiographs may be rejected because of quality issues; in the USA, 8% of the radiographs performed annually are rejected.[1] Patients may undergo repeated x-ray examinations after their initial radiographs are rejected for reasons such as poor image quality, artifacts, anatomy cutoff, patient motion, and equipment malfunction. Rejected radiographs that are repeated lead to extra radiation dose to the patient, a reduced lifetime of the x-ray machines with added expense to the health care system, and longer waiting lists.[5],[6]

In the analog era, rejected radiographs were counted as the number of wasted films physically collected from a waste container.[7] Nowadays, rejected digital radiographs are counted as the number of radiographs deleted from the modality.[5],[7] There has been a general belief that rejection with analog systems was significant because of their relatively narrow exposure latitude, whereas digital systems were assumed to have reduced rejection owing to their more forgiving exposure latitude.[7] In addition to the wider exposure latitude, the advantages of digital over analog systems include processing and manipulation capabilities, which allow technologists to adjust radiograph quality.[8] In digital systems, radiographs are more likely to be rejected because of positioning errors or artifacts,[9] whereas in analog systems, exposure error has been reported as the most common reason for rejection, accounting for as much as 67%.[10],[11] In terms of body parts, the most frequently rejected radiographs are extremities and chest, followed by spine, pelvis, head and neck, and abdomen.[9],[12],[13] This is expected because extremities and chest are the body parts most commonly radiographed in an imaging department.

The purpose of this study was to evaluate the radiograph reject rate and to identify the causes of rejection. This study is the first of its kind and scale to be conducted at King Abdulaziz Medical City (KAMC), Riyadh, Saudi Arabia. The results of this study help to improve radiological services and reduce unnecessary radiation exposure to patients.


  Material and Methods


KAMC is a large tertiary care hospital in Riyadh, Saudi Arabia, with a bed capacity of 1600. A retrospective cross-sectional study of rejected radiographs was conducted in the radiology department at KAMC. The department has 22 digital radiography and six computed radiography units, and about 185,000 radiographs are performed annually.

Rejected radiographs over a period of approximately 5 years (January 2013 to August 2017) were retrieved and analyzed using a dedicated electronic rejection system, Peervue (Change Healthcare Company, Nashville, Tennessee, USA). This software helps automate and simplify workflow and communication within the diagnostic imaging department. Rejection is performed by a certified radiologist and communicated electronically through the Peervue system to the concerned technologist. The radiologist submits feedback through the system; cases are then presented on worklists for the supervisors, who review them with the performing technologists.

Radiographic examination, reason for rejection, body part, sex, and age were recorded. When rejecting a radiograph, the reason for rejection had to be selected from eight predefined reasons in the Peervue system: labeling, procedure protocol, positioning, processing, artifacts, wrong documentation, exposure error, and others. All plain radiographs performed during the study period were included; magnetic resonance imaging, computed tomography, nuclear medicine, ultrasound, fluoroscopy, and mammography were excluded. A total of 455 rejected radiographs were collected, reviewed, and analyzed; 42 of these were excluded because they were rejected for documentation issues rather than quality reasons. Institutional Review Board approval was obtained from King Abdullah International Medical Research Center. Descriptive statistics were calculated with Microsoft Excel 2010 to obtain the frequency and percentage of reject rates.
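The descriptive statistics described above are simple frequency and percentage tallies. Purely as an illustration (the actual analysis was performed in Microsoft Excel 2010, and the record fields shown here are invented for the example, not the Peervue export schema), the same summary could be computed in Python as follows:

```python
from collections import Counter

# Each record represents one rejected radiograph exported from the
# rejection system; these field names are illustrative only.
records = [
    {"reason": "labeling", "body_part": "extremities", "sex": "M", "age": 34},
    {"reason": "positioning", "body_part": "chest", "sex": "F", "age": 9},
    # ... remaining records
]

def frequency_table(records, field):
    """Return (count, percentage) per category for one field."""
    counts = Counter(r[field] for r in records)
    total = sum(counts.values())
    return {k: (n, round(100 * n / total, 1)) for k, n in counts.most_common()}

for field in ("reason", "body_part", "sex"):
    print(field, frequency_table(records, field))
```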


  Results


A total of 247 rejected radiographs were for adult patients and 166 for pediatric patients (60% and 40%, respectively); KAMC classifies patients younger than 14 years as pediatric. In terms of sex, 231 (56%) of the rejected radiographs were for male patients and 182 (44%) for female patients. There is no standard method in the literature for categorizing the reasons for rejection.[1] For example, patient motion is sometimes considered a positioning error and at other times an artifact, which may explain variations in the reported reasons for rejection. In our study, we assigned the rejected radiographs to eight categories: labeling, procedure protocol, positioning, processing, artifacts, wrong documentation, exposure error, and others.

The most common reason for rejection in our study was labeling, with a total of 101 radiographs (22%), followed by procedure protocol with 92 (20%); positioning errors accounted for 14% of the rejections. [Figure 1] shows the distribution of reasons for rejection. In the figure, the rejections classified under “other” comprise lack of proper protection (four radiographs) and missing clinical information (three radiographs). In terms of body parts, the majority of the rejected radiographs were of the extremities, 180 (43%), and chest, 127 (31%). Other rejected radiographs included abdomen, 36 (9%); spine, 33 (8%); pelvis, 22 (5%); and head and neck, 15 (4%) [Figure 2].
Figure 1: Rejected radiographs by reason for rejection.
Figure 2: Rejected radiographs by body part.



  Discussion


Reject analysis is an important quality measure of a modern radiology department, and there is currently a lack of reported reject-analysis data in the Kingdom of Saudi Arabia. The number and type of rejected radiographs will vary depending on whether the hospital is a teaching hospital, whether it is specialized or general, and whether it is equipped with digital or analog units. KAMC is a tertiary care hospital comprising a large general hospital, a surgical trauma center, an advanced cardiac center, and a dedicated specialized children’s hospital.

In our study, among the various reasons for rejection, labeling constituted the highest percentage of rejected radiographs (22%). It has been reported that the introduction of digital markers leads technologists to neglect the physical markers, which is considered unacceptable and a reason for rejection.[2]

A number of studies have reported positioning and exposure errors as the most important reasons for rejection. [Table 1] compares positioning and exposure errors in our study with similar studies in the literature. A wide variation in the percentage of positioning errors can be seen, which might be due to variability in defining positioning error; for instance, some researchers include labeling as part of positioning error whereas others do not.[14],[15],[16],[17],[18],[19] The American Association of Physicists in Medicine (AAPM) Task Group 151 reported up to a three-fold variation in the percentage of rejection among different hospitals, depending on the type of practice and setting.[1]

Rejections due to exposure errors (over- and underexposure) were comparable with results from other studies [Table 1]. The relatively low rejection rate due to exposure error results from the use of digital systems that offer an extended dynamic range: although exposure errors constituted 40–60% of all rejects in analog systems, their relative contribution in digital systems has been reduced to less than 15%.[12]

At KAMC, rejection due to artifacts was 13%. Comparable studies show rejection due to artifacts ranging from 7% to 11%, depending on the type of hospital.[9] Our slightly higher percentage was probably because we included patient motion under artifacts (motion artifact). Artifacts were seen more frequently in female than in male patients (57% and 43%, respectively), which is anticipated because artifacts are more likely to be seen on female patients owing to the presence of hair pins, hair clips, and accessories.

The introduction of picture archiving and communication systems has generated new types of rejects, such as the incorrect choice of processing algorithm.[7] Processing errors accounted for 14% of the total reject rate in our study; there is, however, a lack of published data on processing errors with which to compare our results. In terms of examination type, extremities and chest accounted for over 70% of the rejected images, consistent with published data [Table 2].[9],[12],[18]

Traditionally, the decision to reject an image for quality reasons is made by the radiographer, and the causes of rejects are in many cases defined and reported with reference to the radiographer’s subjective evaluation.[7],[19],[20],[21],[22],[23],[24],[25],[26],[27] Unnecessary repeats have been found to be caused by poor technical judgment and the unavailability of a radiologist for advice.[28] The Peervue system allows the radiologist to review the images, assess their quality, and reject those that do not add diagnostic information, so rejection is based on a more consistent quality standard, and radiologists take a more active role in the guidance and training of radiographers. The system is integrated with the radiology information system so that the radiologist’s decision and reason for rejection are communicated electronically to the appropriate radiographer immediately. This process significantly enhances the image-quality discussion between radiologists and radiographers and educates radiographers about the image quality criteria expected by the radiologists.
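The article does not document Peervue’s internals, and the sketch below is not its API; it is only a hypothetical Python illustration of the workflow just described, with a rejection restricted to the eight predefined reasons and routed immediately to the performing technologist’s worklist (all class, field, and function names are invented for the example):

```python
from dataclasses import dataclass, field
from datetime import datetime

# The eight predefined rejection reasons used in this study.
REASONS = {"labeling", "procedure protocol", "positioning", "processing",
           "artifacts", "wrong documentation", "exposure error", "other"}

@dataclass
class RejectionEvent:
    accession_number: str      # identifies the rejected radiograph
    radiologist_id: str
    technologist_id: str
    reason: str
    comment: str = ""
    timestamp: datetime = field(default_factory=datetime.now)

    def __post_init__(self):
        # Enforce selection from the predefined reasons only.
        if self.reason not in REASONS:
            raise ValueError(f"reason must be one of {REASONS}")

# Worklists keyed by technologist, reviewed later with supervisors.
worklists: dict[str, list[RejectionEvent]] = {}

def submit_rejection(event: RejectionEvent) -> None:
    """Route the rejection to the performing technologist's worklist."""
    worklists.setdefault(event.technologist_id, []).append(event)

submit_rejection(RejectionEvent("ACC123", "rad01", "tech07", "labeling",
                                "Missing physical side marker"))
```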
Table 1: Analysis of rejected radiographs by reason for rejection
Table 2: Analysis of rejected radiographs by anatomical body part


The AAPM recommends that reject analysis be performed at least quarterly, and preferably monthly.[1] Automating the rejection process allows us to monitor the rejection rate continuously and take necessary action in a timely manner. Results of the reject analysis can be used to plan training programs and continuing education, and as a baseline for needs assessment.
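As a sketch of what such continuous monitoring could look like (a hypothetical example; the 2% review threshold and all names are assumptions, not the hospital’s actual tooling or targets), monthly reject rates can be computed and flagged for review:

```python
from collections import defaultdict

def monthly_reject_rates(rejections, totals, threshold_pct=2.0):
    """rejections: list of (year, month) tuples, one per rejected radiograph.
    totals: dict mapping (year, month) -> total radiographs performed.
    Returns {(year, month): (rate_pct, needs_review)}."""
    counts = defaultdict(int)
    for ym in rejections:
        counts[ym] += 1
    report = {}
    for ym, total in sorted(totals.items()):
        rate = 100 * counts[ym] / total
        report[ym] = (rate, rate > threshold_pct)
    return report

# Example with invented figures (~15,400 radiographs/month, roughly
# 185,000 per year as reported for KAMC).
rates = monthly_reject_rates(
    rejections=[(2017, 1)] * 120 + [(2017, 2)] * 410,
    totals={(2017, 1): 15400, (2017, 2): 15400},
)
for ym, (rate, flagged) in rates.items():
    print(ym, f"{rate:.2f}%", "REVIEW" if flagged else "ok")
```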


  Conclusion


Radiograph reject analysis has been used to assess the quality of radiographic examinations in the KAMC radiology department. We identified labeling as the most common reason for rejection, followed by procedure protocol and positioning errors. The use of digital radiography systems helped keep exposure errors minimal. Chest and extremity radiographs constituted the body parts most frequently rejected. The proposed method of electronically communicating rejection reasons provides an efficient means of collecting and analyzing the reject rate. It is recommended that rejected radiographs be reviewed on a periodic basis as part of the hospital’s overall QA program to identify staff competency gaps and plan appropriate training.

Financial support and sponsorship

The authors disclosed no funding related to this article.

Conflicts of interest

The authors disclosed no conflicts of interest related to this article.



 
  References

1. Jones AK, Heintz P, Geiser W, et al. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151. Med Phys 2015;42:6658–6670.
2. Taylor N. The art of rejection: Comparative analysis between computed radiography (CR) and digital radiography (DR) workstations in the accident & emergency and general radiology departments at a district general hospital using customised and standardised reject criteria over a three year period. Radiography 2015;21:236–241.
3. Erturk SM, Ondategui-Parra S, Ros PR. Quality management in radiology: Historical aspects and basic definitions. J Am Coll Radiol 2005;2:985–991.
4. Owusu-Banahene J, Darko E, Hasford F, et al. Film reject analysis and image quality in diagnostic radiology department of a teaching hospital in Ghana. J Radiat Res Appl Sci 2014;7:589–594.
5. Hofmann B, Rosanowsky TB, Jensen C, et al. Image rejects in general direct digital radiography. Acta Radiol Open 2015;4:2058460115604339.
6. Khan S, Zahir MZ, Khan J, et al. Frequency of common causes of rejected/repeated chest x-rays in radiology department of a teaching hospital. Gomal J Med Sci 2016;14:164–166.
7. Waaler D, Hofmann B. Image rejects/retakes–radiographic challenges. Radiat Prot Dosimetry 2010;139:375–379.
8. Shepard SJ, Wang J, Flynn M, et al. An exposure indicator for digital radiography: AAPM Task Group 116 (executive summary). Med Phys 2009;36:2898–2914.
9. Foos DH, Sehnert WJ, Reiner B, et al. Digital radiography reject analysis: Data collection methodology, results, and recommendations from an in-depth investigation at two hospitals. J Digit Imaging 2009;22:89–98.
10. Yousef M, Edward C, Ahmed H, et al. Film reject analysis for conventional radiography in Khartoum hospitals. Asian J Med Radiol Res 2013;1:34–38.
11. Akhtar W, Aslam M, Ali A, et al. Film retakes in digital and conventional radiography. J Coll Physicians Surg Pak 2008;18:151–153.
12. Jabbari N, Zeinali A, Rahmatnezhad L. Patient dose from radiographic rejects/repeats in radiology centers of Urmia University of Medical Sciences, Iran. Health 2012;4:94–100.
13. Shepard S, Lin P, Boone J, et al. Quality control in diagnostic radiology: Report of AAPM Task Group 12, Diagnostic Imaging Committee. Med Phys 2002;29:11–12.
14. Weatherburn GC, Bryan S, West M. A comparison of image reject rates when using film, hard copy computed radiography and soft copy images on picture archiving and communication systems (PACS) workstations. Br J Radiol 1999;72:653–660.
15. Peer S, Peer R, Walcher M, et al. Comparative reject analysis in conventional film-screen and digital storage phosphor radiography. Eur Radiol 1999;9:1693–1696.
16. Honea R, Blado ME, Ma Y. Is reject analysis necessary after converting to computed radiography? J Digit Imaging 2002;15(Suppl 1):41–52.
17. Lau S, Mak A, Lam W, et al. Reject analysis: A comparison of conventional film-screen radiography and computed radiography with PACS. Radiography 2004;10:183–187.
18. Nol J, Isouard G, Mirecki J. Digital repeat analysis; setup and operation. J Digit Imaging 2006;19:159–166.
19. Minnigh TR, Gallet J. Maintaining quality control using a radiological digital X-ray dashboard. J Digit Imaging 2009;22:84–88.
20. Danial Z, Seife T, Daniel A. X-ray reject analysis in Tikur Anbessa and Bethzatha hospitals. Ethiop J Health 2008;22:63–67.
21. Dunn M, Rogers A. X-ray film reject analysis as a quality indicator. Radiography 1998;4:29–31.
22. Polman R, Jones A, Willis C, et al. Reject analysis tool. In: Proceedings of the Society for Imaging Informatics in Medicine (SIIM); 2008; Leesburg, VA: SIIM; 38–40.
23. Jones AK, Polman R, Willis CE, et al. One year’s results from a server-based system for performing reject analysis and exposure analysis in computed radiography. J Digit Imaging 2011;24:243–255.
24. Adler A, Carlton R, Wold B. An analysis of radiographic repeat and reject rates. Radiol Technol 1992;63:308–314.
25. Sadiq A, Miftaudeen N, Mohammed A, et al. Reject–repeat analysis of plain radiographs as a quality indicator at University of Maiduguri Teaching Hospital (UMTH). Eur J Pharm Med Res 2017;4:188–191.
26. Joseph Z, Mohammed S, Samuel S, et al. Film reject analysis in radiology department of a teaching hospital in North-Eastern Nigeria. Niger J Med Imaging Radiat Oncol 2015;4:21–27.
27. Whaley JS, Pressman BD, Wilson JR, et al. Investigation of the variability in the assessment of digital chest X-ray image quality. J Digit Imaging 2013;26:217–226.
28. James N, Godfrey I, Jerzy M. Uncovering the causes of unnecessary repeated medical imaging examinations, or part of, in two hospital departments. The Radiographer 2005;52:26–31.

