sliced inverse regression
minimax
reduction (mathematics)
dimension (graph theory)
nonlinear systems
sufficient dimension reduction
applied mathematics
divergence (linguistics)
martingale (probability theory)
mathematics
artificial neural networks
mathematical optimization
dimensionality reduction
artificial intelligence
computer science
physics
pure mathematics
geometry
quantum mechanics
linguistics
philosophy
Authors
Yinfeng Chen, Yuling Jiao, Rui Qiu, Yu Zhou
Abstract
Linear sufficient dimension reduction, as exemplified by sliced inverse regression, has seen substantial development over the past thirty years. With the advent of more complex scenarios, however, nonlinear dimension reduction has recently attracted considerable interest. This paper introduces a novel method for nonlinear sufficient dimension reduction that combines a generalized martingale difference divergence measure with deep neural networks. The optimal solution of the proposed objective function is shown to be unbiased at the general level of σ-fields, and two optimization schemes based on deep neural networks offer greater efficiency and flexibility than the classical eigendecomposition of linear operators. Moreover, we systematically investigate the slow and fast rates of the estimation error using advanced U-process theory; remarkably, the fast rate nearly coincides with the minimax rate of nonparametric regression. The validity of our deep nonlinear sufficient dimension reduction methods is demonstrated through simulations and real data analysis.
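To make the core construction concrete, here is a minimal, illustrative sketch (not the authors' implementation) of how a martingale difference divergence criterion can drive a deep nonlinear reduction. It assumes the sample squared MDD of Shao and Zhang (2014), a scalar response, and PyTorch; the names `mdd_sq`, `Reducer`, and `fit`, along with the feature standardization used to pin down the scale of the objective, are illustrative choices, not the paper's generalized criterion or its two optimization schemes.

```python
import torch
import torch.nn as nn


def mdd_sq(y: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
    """Sample squared martingale difference divergence MDD_n^2(Y | Z).

    MDD_n^2 = -(1/n^2) * sum_{i,j} (Y_i - Ybar)(Y_j - Ybar) * ||Z_i - Z_j||,
    which is zero in population iff E[Y | Z] = E[Y] almost surely
    (Shao and Zhang, 2014). Assumes y has shape (n,) and z has shape (n, d).
    """
    yc = y - y.mean()                      # center the scalar response
    a = yc.unsqueeze(0) * yc.unsqueeze(1)  # (Y_i - Ybar)(Y_j - Ybar), shape (n, n)
    dist = torch.cdist(z, z)               # pairwise Euclidean distances, shape (n, n)
    return -(a * dist).mean()


class Reducer(nn.Module):
    """Deep network f_theta: R^p -> R^d giving the nonlinear reduction f_theta(X)."""

    def __init__(self, p: int, d: int, width: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(p, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, d),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def fit(x: torch.Tensor, y: torch.Tensor, d: int = 2,
        epochs: int = 200, lr: float = 1e-3) -> Reducer:
    """Gradient-based scheme: maximize MDD_n^2(Y | f_theta(X)) over theta."""
    model = Reducer(x.shape[1], d)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        z = model(x)
        # Standardize features so the objective cannot grow by merely rescaling z.
        z = (z - z.mean(dim=0)) / (z.std(dim=0) + 1e-8)
        loss = -mdd_sq(y, z)  # minimizing the negative maximizes mean dependence
        loss.backward()
        opt.step()
    return model


if __name__ == "__main__":
    # Toy usage: recover a 1-D nonlinear reduction from synthetic data.
    x = torch.randn(500, 10)
    y = torch.sin(x[:, 0]) + 0.1 * torch.randn(500)
    model = fit(x, y, d=1)
```

The standardization step fixes the scale of the learned features, since the MDD objective is otherwise inflated by simply magnifying the network output; the paper's actual objective, its unbiasedness at the level of σ-fields, and the two optimization schemes are developed in the text.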