Medicine
Radiology
Computed tomography
Computed tomographic angiography
Angiography
Tomography
Nuclear medicine
Authors
Luca Saba,Roberta Scicolone,Gian Luca Chabert,Daniël Bos,Anna Kopczak,Mahmud Mossa‐Basha,Jaewoo Song,Roberto Sanfilippo,Andreas Schindler,Andrew Nicolaides,M. Eline Kooi,Tobias Saam,Riccardo Cau,Giuseppe Lanzino
Abstract
Atherosclerotic disease of the carotid arteries is a major cause of ischemic stroke. Traditionally, the degree of stenosis has been regarded as the primary parameter for predicting stroke risk, but emerging evidence highlights the importance of carotid plaque composition and morphology. Recently, the Carotid Plaque Reporting and Data System (Plaque-RADS) has been introduced to standardize carotid plaque assessment beyond the degree of carotid stenosis. However, its reliability in routine radiological practice has yet to be established. This study assesses the inter-reader agreement, intra-reader agreement, and learning curve associated with Carotid Plaque-RADS. In this retrospective study, 500 subjects who underwent computed tomography angiography (CTA) for suspected carotid atherosclerosis were assessed. Three readers with varying levels of experience in vascular imaging independently evaluated all CTAs in five blocks using Carotid Plaque-RADS. To assess the impact of reader experience and potential improvement over time, inter-reader agreement between the three pairs of readers was calculated for each block using Cohen's kappa (κ), enabling a sequential comparison of agreement across the blocks. Intra-reader agreement was calculated on a random block of 100 patients (192 carotid arteries). After exclusion of low-quality exams, 490 patients were selected for analysis, and 46 carotids were excluded because of previous revascularization procedures. The remaining 934 carotid arteries were assessed. Agreement was substantial to almost perfect between the Expert and Intermediate readers, ranging from κ=0.78 to κ=0.88; moderate to substantial between the Intermediate and Beginner readers, ranging from κ=0.50 to κ=0.74; and, between the Expert and Beginner readers, improved from substantial (κ=0.68) to almost perfect (κ=0.86) across blocks, indicating a learning-curve effect. The inter-reader percent agreement was best for Plaque-RADS categories 1, 2, and 3 and poorest for category 4.
Intra-reader agreement was substantial for the Beginner (κ=0.77) and almost perfect for both the Intermediate and Expert readers (κ=0.88). In the CTA application of Carotid Plaque-RADS, inter-reader agreement is substantial to near perfect among experienced and intermediate readers, with a notable learning curve for beginners. Intra-reader agreement is almost perfect in experienced and intermediate readers, indicating the consistency of their grading and supporting data reproducibility with Plaque-RADS. Abbreviations: RADS: Reporting and Data System; CTA: Computed Tomography Angiography; HU: Hounsfield Unit; IPH: Intra-plaque Hemorrhage; FC: Fibrous Cap; MWT: Maximum Wall Thickness; κ: Cohen's kappa; LRNC: Lipid-Rich Necrotic Core.
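The study's agreement statistic, Cohen's kappa, corrects the observed percent agreement between two readers for the agreement expected by chance: κ = (p_o − p_e)/(1 − p_e). A minimal sketch of that computation is shown below; the function name and the example ratings are hypothetical and are not taken from the study's data. Real Plaque-RADS labels would be the category assigned by each reader to each carotid artery.

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same items.

    ratings_a, ratings_b: equal-length sequences of category labels
    (e.g. Plaque-RADS categories 1-4, one per carotid artery).
    """
    if len(ratings_a) != len(ratings_b) or not ratings_a:
        raise ValueError("need two non-empty, equal-length rating lists")
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)

    # Observed agreement: fraction of items with identical labels.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

    # Chance agreement: product of each rater's marginal frequencies,
    # summed over categories.
    p_e = sum(
        (list(ratings_a).count(c) / n) * (list(ratings_b).count(c) / n)
        for c in categories
    )
    if p_e == 1.0:  # both raters used a single identical category
        return 1.0
    return (p_o - p_e) / (1 - p_e)


# Hypothetical toy example: 3 of 4 items rated identically.
kappa = cohens_kappa([1, 1, 2, 2], [1, 1, 2, 3])  # -> 0.6
```

On the conventional Landis-Koch scale used in the abstract, κ of 0.41-0.60 is "moderate", 0.61-0.80 "substantial", and 0.81-1.00 "almost perfect".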