Coordinate descent
Rate of convergence
Applied mathematics
Rank (graph theory)
Bounded function
Mathematics
Moment (physics)
Mathematical optimization
Noise (video)
Gaussian distribution
Gaussian noise
Consistency (knowledge base)
Algorithm
Computer science
Mathematical analysis
Combinatorics
Discrete mathematics
Artificial intelligence
Physics
Image (mathematics)
Channel (broadcasting)
Classical mechanics
Quantum mechanics
Computer network
Authors
Kean Ming Tan, Qiang Sun, Daniela Witten
Identifier
DOI:10.1080/01621459.2022.2050243
Abstract
We propose a sparse reduced rank Huber regression for analyzing large and complex high-dimensional data with heavy-tailed random noise. The proposed method is based on a convex relaxation of a rank- and sparsity-constrained nonconvex optimization problem, which is then solved using a block coordinate descent and an alternating direction method of multipliers algorithm. We establish nonasymptotic estimation error bounds under both Frobenius and nuclear norms in the high-dimensional setting. This is a major contribution over existing results in reduced rank regression, which mainly focus on rank selection and prediction consistency. Our theoretical results quantify the tradeoff between heavy-tailedness of the random noise and statistical bias. For random noise with bounded (1+δ)th moment with δ∈(0,1), the rate of convergence is a function of δ, and is slower than the sub-Gaussian-type deviation bounds; for random noise with bounded second moment, we obtain a rate of convergence as if sub-Gaussian noise were assumed. We illustrate the performance of the proposed method via extensive numerical studies and a data application. Supplementary materials for this article are available online.
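The abstract describes a convex relaxation of a rank-constrained Huber regression, solved with iterative first-order methods. The sketch below is not the authors' implementation (their method combines block coordinate descent with ADMM and includes a sparsity penalty); it is a minimal illustration of the core idea using a simpler proximal gradient scheme: minimize the Huber loss of the residuals plus a nuclear-norm penalty, where the proximal step is singular value soft-thresholding. All function names, the step size rule, and the default threshold `tau=1.345` are illustrative assumptions.

```python
import numpy as np

def huber_loss_grad(R, tau):
    """Huber loss and its elementwise gradient for a residual matrix R.

    Quadratic for |r| <= tau, linear beyond, so large (heavy-tailed)
    residuals contribute a bounded gradient tau * sign(r).
    """
    absR = np.abs(R)
    quad = absR <= tau
    loss = np.where(quad, 0.5 * R**2, tau * absR - 0.5 * tau**2).sum()
    grad = np.where(quad, R, tau * np.sign(R))
    return loss, grad

def svt(C, lam):
    """Singular value soft-thresholding: the prox of lam * ||C||_*."""
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    s = np.maximum(s - lam, 0.0)  # shrink singular values toward zero
    return (U * s) @ Vt

def huber_reduced_rank(X, Y, tau=1.345, lam=1.0, step=None, n_iter=300):
    """Proximal gradient for min_C  sum Huber_tau(Y - X C) + lam * ||C||_*.

    A toy stand-in for the rank-relaxed objective; the paper's algorithm
    (block coordinate descent + ADMM, with sparsity) is more elaborate.
    """
    p, q = X.shape[1], Y.shape[1]
    if step is None:
        # The Huber loss is 1-smooth, so ||X||_2^2 bounds the Lipschitz
        # constant of the smooth part; its inverse is a safe step size.
        step = 1.0 / (np.linalg.norm(X, 2) ** 2)
    C = np.zeros((p, q))
    for _ in range(n_iter):
        R = Y - X @ C
        _, G = huber_loss_grad(R, tau)
        # Gradient of the smooth part is -X.T @ G; descend, then prox.
        C = svt(C + step * (X.T @ G), step * lam)
    return C
```

Because the Huber gradient is bounded, a handful of extreme residuals (e.g., from t-distributed noise with only a bounded second moment, the regime the abstract analyzes) cannot dominate the update, which is the intuition behind the robustness guarantees.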