Electrical Impedance Tomography Deep Imaging With Dual-Branch U-Net Based on Deformable Convolution and Hyper-Convolution

Authors
Zichen Wang,Xiuyan Li,Yukuan Sun,Qi Wang
Source
Journal: IEEE Transactions on Instrumentation and Measurement [Institute of Electrical and Electronics Engineers]
Volume 73, pp. 1-16 · Cited by: 2
Identifier
DOI: 10.1109/tim.2024.3369160
Abstract

Electrical impedance tomography (EIT) is a non-ionizing imaging modality that applies a safe current to the body surface and measures the response voltages on boundary sensors, then solves an inverse problem to determine the conductivity distribution in the region of interest. Due to the 'soft-field' nature of the electrical field, the boundaries of inclusions are usually blurred and the conductivity parameters inaccurate in the reconstructions. To address these problems, this paper proposes a learning-based dual-branch U-Net deep imaging architecture, named DHU-Net, for accurate and sharp reconstruction of EIT images. Specifically, in one branch, deformable convolution layers improve the representation of shape and spatial information with small convolutional kernels, while SE-Attention recalibrates the channel-wise features for global distributions; in the other branch, an implicit hyper-convolutional network with coordinate attention maps the spatial coordinates of the convolutional kernel to the corresponding weights, so that the large convolutional kernel used for conductivity recovery requires relatively few parameters while retaining good convolutional robustness. DHU-Net is trained and fine-tuned on a large number of simulation samples and validated in tank experiments. The metrics show an RMSE of 2.0637, an SSIM of 0.9522, and an RSNR of 46.4597 (improvements of 68.76%, 56.53%, and 61.00% relative to the TR method), providing better reconstruction visualization and better consistency between the quantitative metrics and the reconstruction results than state-of-the-art models.
The experimental results show that the proposed DHU-Net has better robustness and reconstruction consistency, suggesting that deformable convolutions and implicit neural networks have stronger expressive capability for shape modeling, as well as lower computational cost and faster inference than dense models, which can drive real-time applications of EIT for structural and functional imaging.
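The key parameter-saving idea behind the hyper-convolution branch — generating the taps of a large convolutional kernel from their spatial coordinates via a small network, instead of storing every tap as a free parameter — can be sketched as below. This is a minimal NumPy illustration under assumed shapes (a 2-layer MLP, 4 input/output channels); the layer sizes, activation, and initialization are placeholders, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def hyper_kernel(size, hidden=16, out_channels=4, in_channels=4):
    """Generate a size x size conv kernel by mapping each tap's normalized
    (x, y) coordinate through a small MLP. The learnable parameters are the
    MLP weights, so the cost is independent of the kernel size."""
    # Normalized coordinate grid in [-1, 1] for each kernel tap.
    ys, xs = np.meshgrid(np.linspace(-1.0, 1.0, size),
                         np.linspace(-1.0, 1.0, size), indexing="ij")
    coords = np.stack([xs.ravel(), ys.ravel()], axis=1)  # (size*size, 2)

    # Hypernetwork weights (illustrative random init; these, not the
    # kernel taps themselves, would be trained).
    w1 = rng.standard_normal((2, hidden)) * 0.5
    b1 = np.zeros(hidden)
    w2 = rng.standard_normal((hidden, out_channels * in_channels)) * 0.5

    h = np.tanh(coords @ w1 + b1)   # (size*size, hidden)
    weights = h @ w2                # (size*size, out_channels*in_channels)
    return weights.reshape(size, size, out_channels, in_channels)

# The same small MLP yields kernels of any size, so a 15x15 kernel costs
# no more parameters than a 7x7 one.
print(hyper_kernel(7).shape, hyper_kernel(15).shape)
```

Because the kernel is an implicit function of continuous coordinates, the same trained hypernetwork can in principle be queried at any resolution, which is what makes large receptive fields affordable in this design.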