Systematic Review and Meta-Analysis of American College of Radiology TI-RADS Inter-Reader Reliability for Risk Stratification of Thyroid Nodules
- Authors
Wei Li, Yuan Sun, Haibing Xu, Wenwen Shang, and Anding Dong
- Subjects
thyroid nodule, ultrasonography, reproducibility of results, classification, meta-analysis, Neoplasms. Tumors. Oncology. Including cancer and carcinogens, RC254-282
- Abstract
Purpose: To investigate the inter-reader agreement of the American College of Radiology (ACR) Thyroid Imaging Reporting and Data System (TI-RADS) for risk stratification of thyroid nodules.
Methods: A literature search of Web of Science, PubMed, Cochrane Library, EMBASE, and Google Scholar was performed to identify eligible articles published from inception until October 31, 2021. We included studies reporting inter-reader agreement among radiologists who applied ACR TI-RADS for the classification of thyroid nodules. Quality assessment of the included studies was performed with the Quality Assessment of Diagnostic Accuracy Studies-2 tool and the Guidelines for Reporting Reliability and Agreement Studies. Summary estimates of inter-reader agreement were pooled with a random-effects model, and multiple subgroup analyses and meta-regression were performed to investigate various clinical settings.
Results: A total of 13 studies comprising 5,238 nodules were included in the current meta-analysis and systematic review. The pooled inter-reader agreement for the overall ACR TI-RADS classification was moderate (κ = 0.51, 95% CI 0.42–0.59). Substantial heterogeneity was present across the studies, and meta-regression analyses suggested that the malignancy rate was a significant factor. Regarding the ultrasound (US) features, inter-reader agreement was best for composition (κ = 0.58, 95% CI 0.53–0.63), followed by shape (κ = 0.57, 95% CI 0.41–0.72), echogenicity (κ = 0.50, 95% CI 0.40–0.60), echogenic foci (κ = 0.44, 95% CI 0.36–0.53), and margin (κ = 0.34, 95% CI 0.24–0.44).
Conclusions: The ACR TI-RADS demonstrated moderate inter-reader agreement between radiologists for the overall classification. However, the US feature of margin showed only fair inter-reader reliability among different observers.
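The random-effects pooling of study-level κ values described in the Methods can be sketched with the DerSimonian-Laird estimator. This is an illustrative implementation only, not the authors' code; the κ values and standard errors below are made up for demonstration and are not data from the included studies.

```python
import math

def pool_kappa_dl(kappas, ses):
    """DerSimonian-Laird random-effects pooling of kappa estimates.

    kappas: per-study kappa values; ses: their standard errors.
    Returns (pooled_kappa, ci_low, ci_high, tau2).
    """
    v = [se ** 2 for se in ses]               # within-study variances
    w = [1.0 / vi for vi in v]                # fixed-effect (inverse-variance) weights
    k_fixed = sum(wi * ki for wi, ki in zip(w, kappas)) / sum(w)
    # Cochran's Q statistic measures between-study heterogeneity
    q = sum(wi * (ki - k_fixed) ** 2 for wi, ki in zip(w, kappas))
    df = len(kappas) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)             # between-study variance estimate
    w_star = [1.0 / (vi + tau2) for vi in v]  # random-effects weights
    pooled = sum(wi * ki for wi, ki in zip(w_star, kappas)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled, tau2

# Hypothetical study-level kappas and standard errors (illustration only)
kappas = [0.45, 0.58, 0.51, 0.62, 0.39]
ses = [0.05, 0.04, 0.06, 0.05, 0.07]
pooled, lo, hi, tau2 = pool_kappa_dl(kappas, ses)
print(f"pooled kappa = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The random-effects weights shrink toward equality as the between-study variance τ² grows, which is why this model is preferred over a fixed-effect model when, as here, substantial heterogeneity is present.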
- Published
- 2022