Keywords
Feature selection, Hessian matrix, computer science, mathematical optimization, artificial neural networks, artificial intelligence, sparsity-constrained optimization, nonlinear systems, selection consistency, algorithm convergence, lasso, algorithms, machine learning, applied mathematics
Authors
Yao Chen, Qingyi Gao, Faming Liang, Xiao Wang
Identifier
DOI: 10.1080/10618600.2020.1814305
Abstract
This article presents a general framework for high-dimensional nonlinear variable selection using deep neural networks under the framework of supervised learning. The network architecture includes both a selection layer and approximation layers. The problem can be cast as a sparsity-constrained optimization with a sparse parameter in the selection layer and other parameters in the approximation layers. This problem is challenging due to the sparse constraint and the nonconvex optimization. We propose a novel algorithm, called deep feature selection, to estimate both the sparse parameter and the other parameters. Theoretically, we establish the algorithm convergence and the selection consistency when the objective function has a generalized stable restricted Hessian. This result provides theoretical justifications of our method and generalizes known results for high-dimensional linear variable selection. Simulations and real data analysis are conducted to demonstrate the superior performance of our method. Supplementary materials for this article are available online.
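The architecture described in the abstract — a sparse selection layer in front of approximation layers, trained under a sparsity constraint — can be sketched with a minimal toy example. This is not the authors' implementation: the deep approximation layers are replaced by a single linear layer, and the sparse constraint is enforced by an iterative-hard-thresholding-style step that keeps only the k largest selection weights after each gradient update. All variable names and the data-generating setup are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the response depends on only the first 3 of 20 features.
n, p, k = 200, 20, 3
X = rng.normal(size=(n, p))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 1.5 * X[:, 2] + 0.01 * rng.normal(size=n)

# Selection layer s (one weight per input feature) feeding a linear
# "approximation layer" w; the model is y_hat = X @ (s * w).
s = np.ones(p)
w = np.zeros(p)
lr = 0.05

for _ in range(500):
    grad = X.T @ (X @ (s * w) - y) / n   # gradient of 0.5 * MSE w.r.t. s * w
    w -= lr * grad * s                   # step on approximation weights
    s -= lr * grad * w                   # step on selection weights
    # Sparsity constraint: hard-threshold s, keeping its k largest entries.
    mask = np.zeros(p)
    mask[np.argsort(np.abs(s))[-k:]] = 1.0
    s *= mask

selected = sorted(np.flatnonzero(s).tolist())
print(selected)  # the true support is [0, 1, 2]
```

Zeroed selection weights can re-enter at later iterations (their approximation weights are retained), so the support is re-selected at every step rather than fixed after the first thresholding; this mirrors the alternating estimation of the sparse selection parameter and the approximation parameters described in the abstract, in a deliberately simplified linear setting.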