Keywords: Perineural invasion, Colorectal cancer, Medicine, Algorithm, Carcinoma, Lymph node, Computer science, Lymphovascular invasion, Pathology, Tumour budding, Oncology, Metastasis, Cancer, Internal medicine, Lymph node metastasis
Authors: Reetesh K. Pai, Douglas J. Hartman, David F. Schaeffer, Christophe Rosty, Sameer Shivji, Richard Kirsch, Rish K. Pai
Abstract
To develop and validate a deep learning algorithm to quantify a broad spectrum of histological features in colorectal carcinoma.

A deep learning algorithm was trained on haematoxylin and eosin-stained slides from tissue microarrays of colorectal carcinomas (N = 230) to segment colorectal carcinoma digitised images into 13 regions and one object. The segmentation algorithm demonstrated moderate to almost perfect agreement with interpretations by gastrointestinal pathologists, and was applied to an independent test cohort of digitised whole slides of colorectal carcinoma (N = 136). The algorithm correctly classified mucinous and high-grade tumours, and identified significant differences between mismatch repair-proficient and mismatch repair-deficient (MMRD) tumours with regard to mucin, inflammatory stroma, and tumour-infiltrating lymphocytes (TILs). A cutoff of >44.4 TILs per mm² carcinoma gave a sensitivity of 88% and a specificity of 73% in classifying MMRD carcinomas. Algorithm measures of tumour budding (TB) and poorly differentiated clusters (PDCs) outperformed TB grade derived from routine sign-out, and compared favourably with manual counts of TB/PDCs with regard to lymphatic, venous and perineural invasion. Comparable associations were seen between algorithm measures of TB/PDCs and manual counts of TB/PDCs for lymph node metastasis (all P < 0.001); however, stronger correlations were seen between the proportion of positive lymph nodes and algorithm measures of TB/PDCs. Stronger associations were also seen between distant metastasis and algorithm measures of TB/PDCs (P = 0.004) than between distant metastasis and TB (P = 0.04) and TB/PDC counts (P = 0.06).

Our results highlight the potential of deep learning to identify and quantify a broad spectrum of histological features in colorectal carcinoma.
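The abstract reports that a TIL-density cutoff of >44.4 per mm² carcinoma yielded 88% sensitivity and 73% specificity for MMRD classification. As a minimal sketch of how such a threshold classifier is evaluated, the snippet below applies the stated cutoff to a small, entirely hypothetical set of cases and computes sensitivity and specificity; it is illustrative only and is not the authors' code or data.

```python
# Illustrative sketch (not the authors' implementation): classify a tumour
# as mismatch repair-deficient (MMRD) when its tumour-infiltrating
# lymphocyte (TIL) density exceeds the cutoff, then compute sensitivity
# and specificity against known labels. The cutoff of 44.4 TILs/mm^2 is
# taken from the abstract; the sample cases below are hypothetical.

def classify_mmrd(til_density_per_mm2: float, cutoff: float = 44.4) -> bool:
    """Predict MMRD when TIL density exceeds the cutoff."""
    return til_density_per_mm2 > cutoff

def sensitivity_specificity(samples, cutoff: float = 44.4):
    """samples: iterable of (til_density, is_mmrd) pairs.

    Returns (sensitivity, specificity) of the threshold classifier.
    """
    tp = fn = tn = fp = 0
    for density, is_mmrd in samples:
        predicted_mmrd = classify_mmrd(density, cutoff)
        if is_mmrd:
            tp += predicted_mmrd       # true positive
            fn += not predicted_mmrd   # false negative
        else:
            fp += predicted_mmrd       # false positive
            tn += not predicted_mmrd   # true negative
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical cases: (TIL density per mm^2, known MMRD status)
cases = [(60.0, True), (50.1, True), (30.0, True),
         (10.0, False), (70.0, False), (20.0, False)]
sens, spec = sensitivity_specificity(cases)
```

In practice the cutoff itself would be chosen from a training cohort (e.g. by maximising Youden's index on a ROC curve), with sensitivity and specificity then reported on an independent test cohort, as the abstract describes.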