Computer science
Artificial intelligence
Stain
Context (archaeology)
Transfer learning
Pattern recognition (psychology)
Machine learning
Computer vision
Pathology
Medicine
Staining
Paleontology
Biology
Identification
DOI:10.1007/978-3-031-43987-2_53
Abstract
The commonly observed histology stain variation may only moderately hinder diagnosis by human experts, but it can considerably degrade the reliability of deep learning models across diagnostic tasks. Many stain style transfer methods have been proposed to eliminate the variation in stain styles across medical institutions or even across batches. However, existing solutions are confined to Generative Adversarial Networks (GANs), AutoEncoders (AEs), or their variants, and often suffer from mode collapse or posterior mismatching issues. In this paper, we make the first attempt to apply a Diffusion Probabilistic Model, called StainDiff, to the indispensable task of stain style transfer for histology images. Specifically, our diffusion framework enables learning from unpaired images through a novel cycle-consistent constraint, whereas existing diffusion models are restricted to image generation or fully supervised pixel-to-pixel translation. Moreover, given the stochastic nature of StainDiff, whereby multiple transferred results can be generated from one input histology image, we further boost and stabilize the performance with a novel self-ensemble scheme. Our model avoids the challenging issues of mainstream networks, such as mode collapse in GANs or the alignment between posterior distributions in AEs. In conclusion, StainDiff improves stain style transfer quality, its training is straightforward, and the model is simple enough for real-world clinical deployment.
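The two mechanisms the abstract names, the cycle-consistent constraint for unpaired training and the self-ensemble over stochastic samples, can be sketched as below. This is a minimal illustration, not the paper's implementation: the `transfer` function is a hypothetical stand-in for one reverse-diffusion sampling pass, and the exact loss form used by StainDiff may differ.

```python
import numpy as np

def transfer(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Hypothetical stand-in for one stochastic StainDiff sampling pass.

    A real implementation would run reverse diffusion conditioned on
    `image`; a small random perturbation stands in here so the
    surrounding logic can be demonstrated end to end.
    """
    return image + rng.normal(scale=0.05, size=image.shape)

def cycle_consistency_loss(x: np.ndarray, forward, backward,
                           rng: np.random.Generator) -> float:
    """L1 penalty on reconstructing x after a round trip A -> B -> A.

    This is the generic unpaired-training idea; it lets the model train
    without pixel-aligned image pairs from the two stain domains.
    """
    x_reconstructed = backward(forward(x, rng), rng)
    return float(np.mean(np.abs(x_reconstructed - x)))

def self_ensemble(image: np.ndarray, n_samples: int = 8,
                  seed: int = 0) -> np.ndarray:
    """Average several stochastic transfer outputs to stabilize the result."""
    rng = np.random.default_rng(seed)
    stack = np.stack([transfer(image, rng) for _ in range(n_samples)])
    return stack.mean(axis=0)
```

Averaging independent samples reduces the per-pixel variance of the output, which is the stabilizing effect the self-ensemble scheme relies on.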