Subspace topology
Tensor (intrinsic definition)
Singular value decomposition
Rank (graph theory)
Computer science
Convergence (economics)
Algorithm
Principal component analysis
Rank
Mathematics
Mathematical optimization
Artificial intelligence
Pure mathematics
Combinatorics
Database
Economics
Economic growth
Authors
Weichao Kong, Feng Zhang, Wenjin Qin, Jianjun Wang
Identifier
DOI: 10.1016/j.patcog.2023.109545
Abstract
Low-rank tensor recovery that exploits subspace prior information is an emerging topic that has attracted considerable attention. However, existing studies cannot flexibly and fully utilize the accessible subspace prior information, leading to suboptimal recovery performance. To address this issue, this article presents, based on the tensor singular value decomposition (t-SVD), a novel strategy that integrates more than two layers of subspace knowledge about the columns and rows of the target tensor into one unified recovery framework. Specifically, we first design a multilayer subspace prior learning scheme and then apply it to two common low-rank tensor recovery problems, i.e., tensor completion and tensor robust principal component analysis. Crucially, we prove that our approach achieves exact recovery of tensors under a significantly weaker incoherence assumption than the analogous conditions proposed previously. Furthermore, two efficient algorithms with convergence guarantees, based on the alternating direction method of multipliers (ADMM), are proposed to solve the corresponding models. Experimental results on synthetic and real tensor data show that the proposed algorithms outperform other state-of-the-art algorithms in terms of both qualitative and quantitative metrics.
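The recovery models described in the abstract are built on the t-SVD, which factorizes a third-order tensor slice-wise in the Fourier domain along the third mode. The following NumPy snippet is a minimal illustrative sketch of that construction (not the authors' released code); the function names `t_svd`, `t_product`, and `t_transpose` are placeholders chosen here, and the sketch assumes a real-valued input tensor.

```python
# Minimal sketch of the t-SVD and t-product for real third-order tensors.
import numpy as np


def t_product(A, B):
    """t-product of A (n1 x r x n3) and B (r x n2 x n3):
    slice-wise matrix products in the Fourier domain along the third mode."""
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    Cf = np.einsum('ijk,jlk->ilk', Af, Bf)
    return np.real(np.fft.ifft(Cf, axis=2))


def t_transpose(A):
    """Tensor transpose: transpose each frontal slice and reverse the order
    of slices 2..n3."""
    At = np.transpose(A, (1, 0, 2))
    return np.concatenate([At[:, :, :1], At[:, :, :0:-1]], axis=2)


def t_svd(X):
    """t-SVD of a real n1 x n2 x n3 tensor: X = U * S * V^T under the
    t-product, with orthogonal U, V and f-diagonal S."""
    n1, n2, n3 = X.shape
    Xf = np.fft.fft(X, axis=2)
    Uf = np.zeros((n1, n1, n3), dtype=complex)
    Sf = np.zeros((n1, n2, n3), dtype=complex)
    Vf = np.zeros((n2, n2, n3), dtype=complex)
    half = n3 // 2 + 1            # remaining slices follow by conjugate symmetry
    for k in range(half):
        u, s, vh = np.linalg.svd(Xf[:, :, k])
        Uf[:, :, k] = u
        Sf[np.arange(s.size), np.arange(s.size), k] = s
        Vf[:, :, k] = vh.conj().T
    for k in range(half, n3):     # mirror slices so the inverse FFT is real
        Uf[:, :, k] = Uf[:, :, n3 - k].conj()
        Sf[:, :, k] = Sf[:, :, n3 - k].conj()
        Vf[:, :, k] = Vf[:, :, n3 - k].conj()
    U = np.real(np.fft.ifft(Uf, axis=2))
    S = np.real(np.fft.ifft(Sf, axis=2))
    V = np.real(np.fft.ifft(Vf, axis=2))
    return U, S, V


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((30, 20, 5))
    U, S, V = t_svd(X)
    X_rec = t_product(t_product(U, S), t_transpose(V))
    err = np.linalg.norm(X_rec - X) / np.linalg.norm(X)
    print(f"relative reconstruction error: {err:.2e}")   # close to machine precision
```

In t-SVD-based recovery methods such as tensor completion and tensor robust PCA, the proximal (singular value thresholding) steps inside ADMM operate on exactly these Fourier-domain frontal slices; the paper's multilayer subspace priors additionally constrain the column and row subspaces spanned by U and V.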