Robust principal component analysis
Singular value decomposition
Matrix norm
Robustness (evolution)
Matrix completion
Singular value
Mathematical optimization
Rank (graph theory)
Computation
Shrinkage
Computer science
Matrix decomposition
Low-rank approximation
Norm (philosophy)
Optimization problem
Convex optimization
Mathematics
Algorithm
Artificial intelligence
Principal component analysis
Regular polygon
Combinatorics
Chemistry
Physics
Eigenvector
Hankel matrix
Gene
Political science
Geometry
Biochemistry
Mathematical analysis
Law
Quantum mechanics
Gaussian distribution
Authors
Yuanyuan Liu,Licheng Jiao,Fanhua Shang
Identifier
DOI:10.1016/j.patcog.2012.07.003
Abstract
In recent years, matrix rank minimization problems have received a significant amount of attention in the machine learning, data mining and computer vision communities. These problems are commonly solved through a convex relaxation that minimizes the nuclear norm instead of the rank of the matrix; the relaxed problem must be solved iteratively and requires a singular value decomposition (SVD) at each iteration. Algorithms for nuclear norm minimization therefore suffer from the high computational cost of repeated SVDs. In this paper, we propose a Fast Tri-Factorization (FTF) method to approximate the nuclear norm minimization problem and mitigate the cost of performing SVDs. The proposed FTF method can be used to reliably solve a wide range of low-rank matrix recovery and completion problems, such as robust principal component analysis (RPCA), low-rank representation (LRR) and low-rank matrix completion (MC). We also present three specific models for the RPCA, LRR and MC problems, respectively, and develop two alternating direction method (ADM) based iterative algorithms for solving them. Experimental results on a variety of synthetic and real-world data sets validate the efficiency, robustness and effectiveness of our FTF method compared with state-of-the-art nuclear norm minimization algorithms.
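For context, the sketch below illustrates the baseline the abstract refers to, not the authors' FTF method: the convex RPCA model min ||L||_* + λ||S||_1 subject to M = L + S, solved by ADM, where each iteration performs a full SVD via singular value thresholding. This per-iteration SVD is the cost that FTF is designed to avoid. The function names, the λ = 1/√max(m, n) default and the μ choice are conventional illustrative assumptions, not values taken from this paper.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the full-SVD step whose cost FTF aims to avoid."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return U @ np.diag(s) @ Vt

def shrink(X, tau):
    """Elementwise soft-thresholding (proximal operator of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca_adm(M, lam=None, mu=None, n_iter=200, tol=1e-7):
    """Baseline nuclear-norm RPCA via ADM: min ||L||_* + lam*||S||_1 s.t. M = L + S.
    lam and mu defaults are common choices in the RPCA literature, assumed here."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 1.25 / np.linalg.norm(M, 2)
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)  # Lagrange multiplier
    for _ in range(n_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)       # full SVD every iteration
        S = shrink(M - L + Y / mu, lam / mu)    # sparse component update
        R = M - L - S                           # constraint residual
        Y = Y + mu * R
        if np.linalg.norm(R, 'fro') <= tol * np.linalg.norm(M, 'fro'):
            break
    return L, S
```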