Interpretability
Artificial intelligence
Deep learning
Machine learning
Robustness
Computer science
Consistency
Modality
Schema therapy
Clinical practice
Medicine
Missing data
Sensor fusion
Artificial neural network
Medical imaging
Graph
Data mining
Prostate cancer
Deep neural network
Precision medicine
Component
Supervised learning
Clinical decision-making
Representation
Authors
Yingming Xiao, Shengke Yang, Mingjing He, Li Chen, Yi Wu, Lei Zhong
Identifier
DOI: 10.1038/s41746-025-02295-6
Abstract
Multimodal clinical data, including imaging, pathology, omics, and laboratory tests, are often fragmented in routine practice, leading to inconsistent decision-making in the management of urological cancers. We propose UroFusion-X, a unified multimodal framework for integrated diagnosis, molecular subtyping, and prognosis prediction of bladder, kidney, and prostate cancers, with inherent robustness to missing modalities. The system incorporates 3D imaging encoders, pathology multiple-instance learning, omics graph networks, and a TabTransformer for laboratory and clinical variables. A cross-modal co-attention mechanism combined with a gated product-of-experts fusion strategy enables effective representation alignment across heterogeneous inputs, while anatomy-pathology consistency constraints and patient-level contrastive learning further enhance interpretability and generalization. Prognostic modeling is achieved via DeepSurv and DeepHit survival heads. Evaluated on a multi-center real-world cohort with external validation and leave-one-center-out testing, UroFusion-X consistently outperformed strong unimodal and simple fusion baselines, maintained over 90% of its predictive performance under substantial modality dropout, and demonstrated higher net clinical benefit in decision curve analysis. These results indicate that the proposed framework can improve decision consistency and reduce unnecessary testing when deployed in real clinical workflows.
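The gated product-of-experts fusion described in the abstract, and its tolerance to missing modalities, can be illustrated with a small sketch. The snippet below is not the authors' implementation of UroFusion-X; it is a minimal PyTorch example assuming Gaussian experts combined by precision weighting, and all names (GatedPoEFusion, the per-modality gates, and the embedding dimensions) are hypothetical choices for illustration.

```python
import torch
import torch.nn as nn

class GatedPoEFusion(nn.Module):
    """Gated product-of-experts fusion over a variable set of modality embeddings.

    Each available modality contributes a Gaussian 'expert' (mean, log-variance).
    Experts are combined by precision weighting, so an absent modality simply
    drops out of the product instead of requiring imputation.
    """

    def __init__(self, dims, latent_dim=128):
        super().__init__()
        # One small head per modality mapping its embedding to (mu, logvar).
        self.heads = nn.ModuleDict({
            name: nn.Linear(d, 2 * latent_dim) for name, d in dims.items()
        })
        # Learned gate per modality that scales its precision (confidence).
        self.gates = nn.ModuleDict({
            name: nn.Sequential(nn.Linear(d, 1), nn.Sigmoid()) for name, d in dims.items()
        })
        self.latent_dim = latent_dim

    def forward(self, feats):
        """feats: dict mapping modality name -> (batch, dim) tensor;
        modalities missing for a patient are simply omitted from the dict."""
        precisions, weighted_means = [], []
        for name, x in feats.items():
            mu, logvar = self.heads[name](x).chunk(2, dim=-1)
            prec = torch.exp(-logvar) * self.gates[name](x)   # gated precision
            precisions.append(prec)
            weighted_means.append(prec * mu)
        total_prec = torch.stack(precisions).sum(0) + 1e-6
        fused_mu = torch.stack(weighted_means).sum(0) / total_prec
        return fused_mu                                        # fused patient representation


# Usage: fuse imaging and laboratory embeddings; pathology and omics are missing here.
dims = {"imaging": 256, "pathology": 256, "omics": 128, "labs": 64}
fusion = GatedPoEFusion(dims)
batch = {"imaging": torch.randn(4, 256), "labs": torch.randn(4, 64)}
z = fusion(batch)   # (4, 128) fused representation despite two missing modalities
```

Because each available modality contributes its own factor to the product of experts, dropping a modality at inference removes one factor rather than requiring imputed inputs, which is the kind of property behind the reported robustness to modality dropout.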