Probabilistic logic
Artificial intelligence
Medical imaging
Computer science
Image (mathematics)
Machine learning
Computer vision
Pattern recognition (psychology)
Authors
Yiwen Luo,Wuyang Li,Cheng Chen,Xiang Li,Tianming Liu,Tianye Niu,Yixuan Yuan
Identifier
DOI:10.1109/tmi.2025.3566105
Abstract
Traditional deep learning-based diagnostic models typically exhibit limitations when applied to dynamic clinical environments that must handle the emergence of new diseases. Continual learning (CL) offers a promising solution, aiming to learn new knowledge while preserving previously learned knowledge. Although recent rehearsal-free CL methods employing prompt tuning (PT) have shown promise, they rely on deterministic prompts that struggle to handle diverse fine-grained knowledge. Moreover, existing PT methods use randomly initialized prompts trained under standard classification constraints, impeding expert knowledge integration and optimal performance. In this paper, we propose an LLM-guided Decoupled Probabilistic Prompt (LDPP) framework for continual learning in medical image diagnosis. Specifically, we develop an Expert Knowledge Generation (EKG) module that leverages an LLM to acquire decoupled expert knowledge and comprehensive category descriptions. We then introduce a Decoupled Probabilistic Prompt pool (DePP), a shared pool of probabilistic prompts derived from the expert knowledge set; these prompts dynamically provide diverse and flexible descriptions for input images. Finally, we design a Steering Prompt Pool (SPP) that learns non-shared prompts to enhance intra-class compactness and improve model performance. With extensive experimental validation, LDPP consistently achieves state-of-the-art performance under the challenging class-incremental setting in CL. Code is available at: https://github.com/CUHK-AIM-Group/LDPP.
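To make the prompt-tuning idea concrete, the sketch below illustrates one plausible reading of the two prompt components described in the abstract: a shared pool of probabilistic prompts (each prompt modeled as a Gaussian over token embeddings and sampled per image) and a non-shared steering prompt. This is a minimal illustration assuming a PyTorch ViT-style pipeline, not the authors' released implementation; all class names, sizes, and the random key initialization are assumptions (the paper initializes prompts from LLM-generated expert knowledge rather than randomly).

```python
# Minimal sketch (assumption, not the authors' code) of a shared probabilistic
# prompt pool plus a non-shared steering prompt for prompt-tuning-based CL.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProbabilisticPromptPool(nn.Module):
    """Shared pool: each entry is a Gaussian (mu, logvar) over prompt tokens."""
    def __init__(self, pool_size=20, prompt_len=4, embed_dim=768, top_k=5):
        super().__init__()
        self.top_k = top_k
        # Keys for matching an image query feature against pool entries.
        self.keys = nn.Parameter(torch.randn(pool_size, embed_dim))
        # Gaussian parameters; in the paper these would be derived from
        # LLM-generated expert-knowledge embeddings rather than random init.
        self.mu = nn.Parameter(torch.randn(pool_size, prompt_len, embed_dim))
        self.logvar = nn.Parameter(torch.zeros(pool_size, prompt_len, embed_dim))

    def forward(self, query):                        # query: (B, D) image feature
        sim = F.cosine_similarity(                   # (B, pool_size)
            query.unsqueeze(1), self.keys.unsqueeze(0), dim=-1)
        topk = sim.topk(self.top_k, dim=-1).indices  # (B, top_k)
        mu, logvar = self.mu[topk], self.logvar[topk]
        # Reparameterization trick: a stochastic draw per selected prompt,
        # giving diverse descriptions instead of one deterministic prompt.
        prompts = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return prompts.flatten(1, 2)                 # (B, top_k * prompt_len, D)


class SteeringPrompt(nn.Module):
    """Non-shared prompt tokens (e.g. per task) to tighten intra-class features."""
    def __init__(self, num_tasks=10, prompt_len=4, embed_dim=768):
        super().__init__()
        self.prompts = nn.Parameter(torch.randn(num_tasks, prompt_len, embed_dim))

    def forward(self, task_id, batch_size):
        return self.prompts[task_id].unsqueeze(0).expand(batch_size, -1, -1)


if __name__ == "__main__":
    pool, steer = ProbabilisticPromptPool(), SteeringPrompt()
    q = torch.randn(2, 768)                          # dummy image query features
    tokens = torch.cat([pool(q), steer(task_id=0, batch_size=2)], dim=1)
    print(tokens.shape)                              # prompt tokens to prepend to a ViT
```

In this reading, the sampled prompts play the role of the "diverse and flexible descriptions" mentioned in the abstract, while the steering prompts are optimized separately per task to promote intra-class compactness.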