Bayesian probability
Computer science
Inversion (geology)
Artificial intelligence
Pattern recognition (psychology)
Algorithm
Machine learning
Geology
Structural basin
Paleontology
Authors
Yuhui Song, Zijun Gong, Yuanzhu Chen, Cheng Li
Identifier
DOI: 10.1109/tsp.2024.3484908
Abstract
Sparse Bayesian Learning (SBL) has emerged as a powerful tool for sparse signal recovery due to its superior performance. However, the practical implementation of SBL faces a significant computational burden associated with matrix inversion. Despite numerous efforts to alleviate this issue, existing methods are often limited to specifically structured sparse signals. This paper aims to provide a universal inversion-free approach to accelerate existing SBL algorithms. We unify the optimization of SBL variants with different priors within the expectation-maximization (EM) framework, where a lower bound of the likelihood function is maximized. Because SBL is built on a linear Gaussian model, updating this lower bound requires maximizing a quadratic function, which involves matrix inversion. We therefore employ the minorization-maximization (MM) framework to derive two novel lower bounds that diagonalize the quadratic coefficient matrix, thereby eliminating the need for any matrix inversions. We further investigate their properties, including convergence guarantees under the MM framework and the slow convergence rate caused by reduced curvature. The proposed approach is applicable to various types of structured sparse signals, such as row-sparse, block-sparse, and burst-sparse signals. Our simulations on synthetic and real data demonstrate remarkably shorter running times than state-of-the-art methods while achieving comparable recovery performance.
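To make the bottleneck concrete, the sketch below contrasts a classic SBL-EM iteration, whose E-step inverts an M x M matrix, with a generic inversion-free MM-style iteration that majorizes the data-fit quadratic via Phi^T Phi <= L*I with L = lambda_max(Phi^T Phi), so the surrogate's quadratic coefficient matrix is diagonal and every update is elementwise. This is an illustration of the general MM diagonalization idea under standard SBL assumptions, not the paper's exact algorithm; all function names and the diagonal variance proxy are assumptions made for the example.

```python
# Minimal sketch (assumed names; not the paper's exact algorithm) contrasting
# the classic SBL-EM iteration, which inverts an M x M matrix, with an
# inversion-free MM-style iteration built on the majorization
#   ||y - Phi x||^2 <= ||y - Phi mu||^2 - 2 (x - mu)^T Phi^T (y - Phi mu)
#                      + L ||x - mu||^2,   with L = lambda_max(Phi^T Phi).
import numpy as np

def sbl_em_step(Phi, y, gamma, sigma2):
    """Classic E-step: posterior moments via an O(M^3) matrix inversion."""
    A = Phi.T @ Phi / sigma2 + np.diag(1.0 / gamma)  # quadratic coefficient matrix
    Sigma = np.linalg.inv(A)                         # the costly inversion
    mu = Sigma @ Phi.T @ y / sigma2
    gamma_new = mu**2 + np.diag(Sigma)               # EM hyperparameter update
    return mu, gamma_new

def sbl_mm_step(Phi, y, gamma, sigma2, mu, L):
    """Inversion-free step: the surrogate's coefficient matrix
    (L / sigma2) I + diag(1 / gamma) is diagonal, so its maximizer and a
    diagonal variance proxy are computed elementwise in O(N M)."""
    z = mu + Phi.T @ (y - Phi @ mu) / L              # gradient step on the data fit
    d = L / sigma2 + 1.0 / gamma                     # diagonal precision of the surrogate
    mu_new = (L / sigma2) * z / d                    # closed-form surrogate maximizer
    gamma_new = mu_new**2 + 1.0 / d                  # EM-style update with variance proxy
    return mu_new, gamma_new

# Usage on synthetic data (hypothetical sizes):
rng = np.random.default_rng(0)
N, M = 50, 100
Phi = rng.standard_normal((N, M))
x_true = np.zeros(M)
x_true[rng.choice(M, size=5, replace=False)] = rng.standard_normal(5)
y = Phi @ x_true + 0.01 * rng.standard_normal(N)
L = np.linalg.norm(Phi, 2) ** 2                      # lambda_max(Phi^T Phi)
mu, gamma, sigma2 = np.zeros(M), np.ones(M), 1e-4
for _ in range(300):
    mu, gamma = sbl_mm_step(Phi, y, gamma, sigma2, mu, L)
```

Consistent with the abstract's remark on reduced curvature, the diagonal majorizer is looser than the exact quadratic: each iteration drops from O(M^3) to O(N M), but more iterations are typically needed to converge.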