PhaseNet 2.0: Phase Unwrapping of Noisy Data Based on Deep Learning Approach

Keywords: Computer Science · Synthetic Aperture Radar · Algorithms · Artificial Intelligence · Deep Learning · Phase (Matter) · Pixel · Convolutional Neural Network · Phase Unwrapping · Interferometry · Physics · Astronomy · Chemistry · Organic Chemistry

Authors
G. E. Spoorthi, Rama Krishna Gorthi, Subrahmanyam Gorthi

Source
Journal: IEEE Transactions on Image Processing [Institute of Electrical and Electronics Engineers]
Volume 29, pp. 4862-4872 · Cited by: 176

Identifier
DOI: 10.1109/TIP.2020.2977213

Abstract

Phase unwrapping is a classical ill-posed problem in many practical applications of significance, such as 3D profiling through fringe projection, synthetic aperture radar, and magnetic resonance imaging. Conventional phase unwrapping techniques estimate the phase either by integrating along a confined path (referred to as path-following methods) or by minimizing an energy function between the wrapped phase and the approximated true phase (referred to as minimum-norm approaches). However, these conventional methods face critical challenges such as error accumulation and high computational time, and they often fail under low-SNR conditions. To address these problems, this paper proposes a novel deep learning framework for unwrapping the phase, referred to as "PhaseNet 2.0". The phase unwrapping problem is formulated as a dense classification problem, and a fully convolutional DenseNet-based neural network is trained to predict the wrap count at each pixel from the wrapped phase maps. To train this network, we simulate arbitrary shapes and propose a new loss function that incorporates the residues by minimizing the difference of gradients and also uses an L1 loss to overcome the class imbalance problem. The proposed method, unlike our previous approach PhaseNet, does not require post-processing, is highly robust to noise, accurately unwraps the phase even at a severe noise level of -5 dB, and can unwrap phase maps even at relatively high dynamic ranges. Simulation results from the proposed framework are compared with different classes of existing phase unwrapping methods for varying SNR values and discontinuities, and these evaluations demonstrate the advantages of the proposed framework. We also demonstrate the generality of the proposed method on 3D reconstruction of synthetic CAD models that have diverse structures and fine geometric variations. Finally, the proposed method is applied to real data for 3D profiling of objects using the fringe projection technique and digital holographic interferometry. The proposed framework achieves significant improvements over existing methods while being highly efficient, with interactive frame rates on modern GPUs.
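
To make the dense-classification formulation concrete: the true phase phi relates to the wrapped phase psi through an integer wrap count k at every pixel, phi = psi + 2*pi*k, so unwrapping reduces to predicting k per pixel. The NumPy sketch below illustrates this setup on a synthetic Gaussian phase surface; the surface is an illustrative stand-in for the arbitrary shapes the authors simulate, not their actual data or network.

import numpy as np

def wrap(phi):
    # Wrap phase into (-pi, pi].
    return np.angle(np.exp(1j * phi))

# Synthetic "true" phase spanning several wrap periods (illustrative only).
y, x = np.mgrid[-1:1:256j, -1:1:256j]
phi_true = 18.0 * np.exp(-(x**2 + y**2) / 0.3)

psi = wrap(phi_true)                               # wrapped observation
k_true = np.round((phi_true - psi) / (2 * np.pi))  # integer wrap counts

# A trained network would predict k from psi; with correct counts the
# reconstruction is exact, which is what makes per-pixel classification
# a valid formulation of unwrapping:
phi_rec = psi + 2 * np.pi * k_true
assert np.allclose(phi_rec, phi_true)
print("wrap-count classes present:", np.unique(k_true).astype(int))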
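
The abstract describes the training loss only informally: a gradient-difference term that handles residues plus an L1 term against class imbalance. The following is one plausible NumPy reading of that description, continuing from the arrays above; the weight lam, the forward-difference gradient operator, and the way the two terms are combined are assumptions, not the paper's published formulation.

def grad_diff(a, b):
    # Mean absolute difference of horizontal and vertical forward gradients.
    dax, day = np.diff(a, axis=1), np.diff(a, axis=0)
    dbx, dby = np.diff(b, axis=1), np.diff(b, axis=0)
    return np.mean(np.abs(dax - dbx)) + np.mean(np.abs(day - dby))

def phase_loss(k_pred, k_true, psi, lam=0.1):
    # L1 on wrap counts (class-imbalance term) plus a gradient-difference
    # term on the reconstructed phase (residue term). lam is a hypothetical
    # weighting, not a value taken from the paper.
    phi_pred = psi + 2 * np.pi * k_pred
    phi_true = psi + 2 * np.pi * k_true
    l1 = np.mean(np.abs(k_pred - k_true))
    return l1 + lam * grad_diff(phi_pred, phi_true)

# Sanity check: a perfect prediction incurs zero loss.
assert phase_loss(k_true, k_true, psi) == 0.0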