Keywords: Markov chain Monte Carlo, Prior probability, Bayesian probability, Mathematics, Normalization (sociology), Inverse problem, Gaussian distribution, Posterior probability, Algorithm, Applied mathematics, Bayesian inference, Mathematical optimization, Computer science, Statistics, Quantum mechanics, Physics, Sociology, Mathematical analysis, Anthropology
Authors: Tiangang Cui, Xin T. Tong, Olivier Zahm
Source
Journal: Inverse Problems
[IOP Publishing]
Date: 2022-09-28
Volume/Issue: 38 (12): 124002
Citations: 11
Identifier
DOI: 10.1088/1361-6420/ac9582
Abstract
Markov chain Monte Carlo (MCMC) methods form one of the algorithmic foundations of Bayesian inverse problems. The recent development of likelihood-informed subspace (LIS) methods offers a viable route to designing efficient MCMC methods that explore high-dimensional posterior distributions by exploiting the intrinsic low-dimensional structure of the underlying inverse problem. However, existing LIS methods and the associated performance analysis often assume that the prior distribution is Gaussian. This assumption is limiting for inverse problems that aim to promote sparsity in the parameter estimate, where heavy-tailed priors, e.g. the Laplace distribution or the elastic net commonly used in the Bayesian LASSO, are often needed. To overcome this limitation, we consider a prior normalization technique that transforms any non-Gaussian (e.g. heavy-tailed) prior into a standard Gaussian distribution, which makes it possible to implement LIS methods to accelerate MCMC sampling via such transformations. We also rigorously investigate the integration of such transformations with several MCMC methods for high-dimensional problems. Finally, we demonstrate various aspects of our theoretical claims on two nonlinear inverse problems.
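The prior normalization idea described in the abstract can be illustrated with a minimal sketch (not the authors' exact construction): for a product-form heavy-tailed prior such as an i.i.d. Laplace prior, each coordinate can be mapped to a standard Gaussian variable by composing the Laplace CDF with the inverse standard normal CDF. The helper names `to_gaussian`/`from_gaussian`, the unit Laplace scale, and the use of SciPy are assumptions made for illustration only.

```python
# Illustrative sketch (not the paper's implementation): coordinate-wise
# prior normalization that maps a Laplace prior to a standard Gaussian,
# so theta = from_gaussian(x) with x ~ N(0, I) has the original Laplace prior.
import numpy as np
from scipy import stats

def to_gaussian(theta, scale=1.0):
    """Map Laplace(0, scale) parameters to standard Gaussian coordinates:
    x = Phi^{-1}(F_Laplace(theta))."""
    u = stats.laplace.cdf(theta, loc=0.0, scale=scale)
    return stats.norm.ppf(u)

def from_gaussian(x, scale=1.0):
    """Inverse map: theta = F_Laplace^{-1}(Phi(x)), used to evaluate the
    likelihood in the original parameter space during MCMC."""
    u = stats.norm.cdf(x)
    return stats.laplace.ppf(u, loc=0.0, scale=scale)

# Sanity check: pushing Laplace draws through the forward map yields
# approximately standard normal samples.
rng = np.random.default_rng(0)
theta = rng.laplace(loc=0.0, scale=1.0, size=100_000)
x = to_gaussian(theta)
print(np.mean(x), np.std(x))  # close to 0 and 1
```

In the normalized coordinates the prior is standard Gaussian, so Gaussian-prior machinery such as LIS-based proposals can be applied; during sampling, the likelihood is evaluated by pulling the Gaussian coordinates back to the original parameter space through the inverse map.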