Sample complexity
Quadratic equation
Signal (programming language)
Computer science
Sample (material)
Mathematical optimization
Quadratic programming
Algorithm
Mathematics
Artificial intelligence
Physics
Geometry
Thermodynamics
Programming language
Identifier
DOI:10.1109/tit.2019.2891653
Abstract
This paper concerns the problem of recovering an unknown but structured signal $x \in \mathbb{R}^n$ from $m$ quadratic measurements of the form $y_r = \left|\langle a_r, x\rangle\right|^2$ for $r = 1, 2, \ldots, m$. We focus on the under-determined setting where the number of measurements is significantly smaller than the dimension of the signal ($m \ll n$). We formulate the recovery problem as a nonconvex optimization problem where prior structural information about the signal is enforced through constraints on the optimization variables. We prove that projected gradient descent, when initialized in a neighborhood of the desired signal, converges to the unknown signal at a linear rate. These results hold for any closed constraint set (convex or nonconvex), providing convergence guarantees to the global optimum even when the objective function and constraint set are nonconvex. Furthermore, these results hold with a number of measurements that is only a constant factor away from the minimal number required to uniquely identify the unknown signal. Our results provide the first provably tractable algorithm for this data-poor regime, breaking local sample complexity barriers that have emerged in the recent literature. In this paper, we utilize and further develop powerful tools for the uniform convergence of empirical processes that may have broader implications for the rigorous understanding of constrained nonconvex optimization heuristics.
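The abstract's recovery scheme can be illustrated with a minimal sketch: gradient descent on the natural least-squares objective $f(x) = \frac{1}{4m}\sum_r \big((\langle a_r, x\rangle^2 - y_r\big)^2$, followed by projection onto a nonconvex constraint set (here, $k$-sparse vectors via hard thresholding), starting from a point near the true signal. All concrete choices below (problem sizes, the step size, the sparsity constraint, the initialization radius) are illustrative assumptions, not the paper's specific parameters or initialization procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes: dimension n, measurements m < n, sparsity k.
n, m, k = 100, 60, 5
x_true = np.zeros(n)
x_true[:k] = np.array([1.0, -2.0, 1.5, -1.0, 2.0])   # well-separated entries
A = rng.standard_normal((m, n))                      # measurement vectors a_r (rows)
y = (A @ x_true) ** 2                                # y_r = |<a_r, x>|^2

def project_sparse(z, k):
    """Projection onto the (nonconvex) set of k-sparse vectors: keep the
    k largest-magnitude entries, zero out the rest."""
    out = np.zeros_like(z)
    idx = np.argpartition(np.abs(z), -k)[-k:]
    out[idx] = z[idx]
    return out

# Projected gradient descent, initialized in a neighborhood of x_true
# (the theory assumes such an initialization; how to obtain it is a
# separate question not addressed in this sketch).
x = x_true + 0.1 * rng.standard_normal(n)
step = 0.5 / np.linalg.norm(A, 2) ** 2               # heuristic step size
for _ in range(500):
    Ax = A @ x
    grad = A.T @ ((Ax ** 2 - y) * Ax) / m            # gradient of f at x
    x = project_sparse(x - step * grad, k)

# Quadratic measurements lose the global sign, so measure error up to sign.
err = min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true))
```

With only $m = 60$ measurements in dimension $n = 100$, unconstrained recovery is hopeless; it is the projection step enforcing the structural prior that makes the under-determined regime tractable, which is the point of the abstract's constraint-set formulation.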