Mathematics
Principal component analysis
Extrapolation
Subspace topology
Norm (philosophy)
Applied mathematics
Iterated function
Superposition principle
Robust principal component analysis
Linear subspace
Convergence (economics)
Algorithm
Shrinkage
Mathematical optimization
Mathematical analysis
Statistics
Economics
Economic growth
Law
Political science
Geometry
Authors
Peng Wang, Huikang Liu, Anthony Man-Cho So
Abstract
A popular robust alternative to the classic principal component analysis (PCA) is the \(\ell_1\)-norm PCA (L1-PCA), which aims to find a subspace that captures the most variation in a dataset as measured by the \(\ell_1\)-norm. L1-PCA has shown great promise in alleviating the effect of outliers in data analytic applications. However, it gives rise to a challenging nonsmooth, nonconvex optimization problem, for which existing algorithms are either not scalable or lack strong theoretical guarantees on their convergence behavior. In this paper, we propose a proximal alternating minimization method with extrapolation (PAMe) for solving a two-block reformulation of the L1-PCA problem. We then show that for both the L1-PCA problem and its two-block reformulation, the Kurdyka–Łojasiewicz exponent at any of the limiting critical points is \(1/2\). This allows us to establish the linear convergence of the sequence of iterates generated by PAMe and to determine the criticality of the limit of the sequence with respect to both the L1-PCA problem and its two-block reformulation. To complement our theoretical development, we show via numerical experiments on both synthetic and real-world datasets that PAMe is competitive with a host of existing methods. Our results not only significantly advance the convergence theory of iterative methods for L1-PCA but also demonstrate the potential of our proposed method in applications.

Keywords: L1-PCA, Kurdyka–Łojasiewicz exponent, proximal alternating minimization, extrapolation, linear convergence

MSC codes: 49J52, 58C05, 58C20, 90C30
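For concreteness, the display below gives one common dispersion-maximizing formulation of L1-PCA together with a standard sign-variable rewriting that yields a two-block structure. This is offered as background to the abstract above; the exact two-block reformulation analyzed in the paper may differ in its details.

\[
\max_{U \in \mathbb{R}^{d \times k},\ U^\top U = I_k} \bigl\| X^\top U \bigr\|_1
\qquad\Longleftrightarrow\qquad
\max_{U^\top U = I_k,\ B \in \{-1,+1\}^{n \times k}} \bigl\langle X^\top U,\, B \bigr\rangle ,
\]
where \(X \in \mathbb{R}^{d \times n}\) is the (centered) data matrix, \(\|\cdot\|_1\) is the entrywise \(\ell_1\)-norm, and the equivalence follows from applying \(|t| = \max_{b \in \{-1,+1\}} bt\) entrywise. Fixing either block leaves a subproblem with a closed-form maximizer (an entrywise sign step for \(B\), an orthogonal Procrustes step for \(U\)), which is what makes alternating schemes attractive here.

The Python sketch below, assuming the sign-variable reformulation above, illustrates a plain two-block alternating maximization. It deliberately omits the proximal terms and the extrapolation steps that define PAMe, so it should be read as a conceptual illustration of the two-block structure rather than as the authors' algorithm.

```python
import numpy as np

def l1_pca_alternating(X, k, iters=100, seed=0):
    """Simplified alternating scheme for the sign-variable reformulation of
    dispersion-maximizing L1-PCA. NOT the paper's PAMe: no proximal terms,
    no extrapolation; it only illustrates the two-block structure
    (U on the Stiefel manifold, B with +/-1 entries)."""
    d, n = X.shape
    rng = np.random.default_rng(seed)
    # Random orthonormal initialization of U (d x k).
    U, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for _ in range(iters):
        # B-step: maximize <X^T U, B> over B in {-1,+1}^(n x k) entrywise.
        B = np.sign(X.T @ U)
        B[B == 0] = 1.0
        # U-step: maximize tr(U^T X B) over U^T U = I (orthogonal Procrustes).
        P, _, Qt = np.linalg.svd(X @ B, full_matrices=False)
        U = P @ Qt
    return U

if __name__ == "__main__":
    # Synthetic data with a few injected outliers.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((5, 200))
    X[:, :5] += 20 * rng.standard_normal((5, 5))
    U = l1_pca_alternating(X, k=2)
    print(np.abs(X.T @ U).sum())  # l1 dispersion captured by the subspace
```

Each pass is a monotone ascent step for the reformulated objective: the sign step is optimal for \(B\) with \(U\) fixed, and the SVD-based step is the Procrustes maximizer for \(U\) with \(B\) fixed.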