Computer science
Computer-aided design
Modality (human–computer interaction)
Artificial intelligence
Deep learning
Algorithm
Breast ultrasound
Net (polyhedron)
Binary classification
Breast cancer
Machine learning
Pattern recognition (psychology)
Support vector machine
Medicine
Cancer
Mammography
Mathematics
Engineering
Engineering drawing
Internal medicine
Geometry
Authors
Yaofei Duan, Patrick Cheong-Iao Pang, Ping He, Rongsheng Wang, Yue Sun, Chuntao Liu, Xiaorong Zhang, Xi-Rong Yuan, Pengjie Song, Chan-Tong Lam, Ligang Cui, Tao Tan
Identifier
DOI: 10.1109/jbhi.2024.3445952
Abstract
Breast cancer significantly impacts women's health, and ultrasound is crucial for lesion assessment. To improve diagnostic accuracy, computer-aided detection (CAD) systems have attracted considerable interest. This study introduces a prospective deep learning architecture, the Multi-modal Multi-task Network (3MT-Net), which combines clinical data, B-mode ultrasound, and color Doppler ultrasound. We design AM-CapsNet, a network specifically tailored to extract key tumor features from ultrasound images. To incorporate the clinical data, 3MT-Net employs cascaded cross-attention to fuse information from the three sources. To preserve pertinent information when fusing high-dimensional and low-dimensional data, we adopt the idea of ensemble learning and design an optimization algorithm that assigns weights to the different modalities. Finally, 3MT-Net performs both binary classification of benign and malignant lesions and pathological subtype classification. In addition, we retrospectively collected data from nine medical centers and, to evaluate the broad applicability of 3MT-Net, built two separate test sets and conducted extensive experiments. A comparative analysis was also conducted between 3MT-Net and the industrial-grade CAD product S-Detect; the AUC of 3MT-Net surpasses that of S-Detect by 1.4% to 3.8%.
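The abstract describes the fusion design only at a high level. As a rough illustration, the sketch below shows one way a cascaded cross-attention fusion with two task heads could be wired up in PyTorch. It is not the authors' published implementation: every module name, layer size, and the learnable modality-weighting scheme are illustrative assumptions standing in for the paper's AM-CapsNet feature extractor and ensemble-style weight optimization.

```python
# Minimal sketch (not the authors' code) of a multi-modal, multi-task fusion
# head in the spirit of 3MT-Net: three modality embeddings (clinical data,
# B-mode ultrasound, color Doppler) are fused with cascaded cross-attention,
# then fed to two task heads (benign/malignant and pathological subtype).
# All layer sizes and module names are illustrative assumptions.
import torch
import torch.nn as nn


class CascadedCrossAttentionFusion(nn.Module):
    def __init__(self, dim: int = 256, num_heads: int = 4, num_subtypes: int = 4):
        super().__init__()
        # Stage 1: B-mode features attend to color Doppler features.
        self.attn_bmode_doppler = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Stage 2: the fused image representation attends to clinical features.
        self.attn_image_clinical = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Learnable per-modality weights, a simple stand-in for the paper's
        # ensemble-style weighting of high- and low-dimensional modalities.
        self.modality_weights = nn.Parameter(torch.ones(3))
        self.head_benign_malignant = nn.Linear(dim, 2)    # binary task
        self.head_subtype = nn.Linear(dim, num_subtypes)  # subtype task

    def forward(self, bmode, doppler, clinical):
        # Each input: (batch, tokens, dim) embeddings from modality-specific encoders.
        fused_img, _ = self.attn_bmode_doppler(bmode, doppler, doppler)
        fused_all, _ = self.attn_image_clinical(fused_img, clinical, clinical)
        w = torch.softmax(self.modality_weights, dim=0)
        pooled = (w[0] * bmode.mean(1) + w[1] * doppler.mean(1)
                  + w[2] * fused_all.mean(1))
        return self.head_benign_malignant(pooled), self.head_subtype(pooled)


if __name__ == "__main__":
    model = CascadedCrossAttentionFusion()
    b = torch.randn(2, 49, 256)   # B-mode tokens
    d = torch.randn(2, 49, 256)   # color Doppler tokens
    c = torch.randn(2, 1, 256)    # projected clinical record
    logits_bm, logits_subtype = model(b, d, c)
    print(logits_bm.shape, logits_subtype.shape)  # (2, 2) and (2, 4)
```

A separate loss per head (e.g. cross-entropy on each output) would then train the binary and subtype tasks jointly, which is the usual way a multi-task objective of this kind is optimized.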