Keywords
Initialization, Nonlinear system, Acceleration, Gradient descent, Artificial neural network, Mathematics, Tensor (intrinsic definition), Estimator, Function (biology), Applied mathematics, Algorithm, Mathematical optimization, Function approximation, Computer science, Reduction, Backpropagation, Domain (mathematical analysis), Synthetic data, Scalability, Factorization, Artificial intelligence, Face (sociological concept), Inverse problem, Stochastic gradient descent, Uniqueness, Estimation theory, Probability density function, Random field, Normalization (linguistics)
Authors
Wenqiang Li,Ming‐Wei Lin,Xiuqin Xu,Lin Ling,Zeshui Xu,Xin Luo
Identifier
DOI:10.1109/tsmc.2025.3622727
Abstract
Traditional nonnegative latent factorization of tensors (NLFT) models can effectively represent high-dimensional and incomplete (HDI) tensors, but they currently face two main problems: 1) existing models are linear and cannot capture the nonlinear features of HDI tensors; and 2) they rely on random initialization of nonnegative parameters and constraint-combined training schemes. To address these issues, this article proposes a neural NLFT model with acceleration and unconstrained training. The main ideas are as follows: 1) utilizing a neural network (NN) structure and a nonlinear activation function to accurately capture the nonlinear features within an HDI tensor; 2) constructing a nonnegative mapping domain that transfers nonnegativity constraints from the latent factors (LFs) to the output decision parameters via a single-element-dependent mapping function, enabling an unconstrained optimization framework; and 3) adopting the highly compatible momentum-incorporated stochastic gradient descent (SGD) algorithm as the model's backpropagation (BP) learning scheme, which not only ensures training effectiveness and scalability but also accelerates convergence. Empirical studies on ten HDI tensors demonstrate that the proposed model achieves superior estimation accuracy and lower per-iteration time cost compared to state-of-the-art models.
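The two technical ideas in the abstract, transferring the nonnegativity constraint to output parameters through a single-element-dependent mapping function, and training with momentum-incorporated SGD, can be illustrated with a minimal sketch. The sketch below is an assumption-laden toy, not the authors' implementation: it uses a CP-style three-way factorization, softplus as the mapping function (the paper's exact mapping may differ), and trains unconstrained parameters on observed entries only, so the latent factors reaching the output are always nonnegative without any projection step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a small 3-way tensor observed at a sparse set of entries,
# mimicking a high-dimensional-and-incomplete (HDI) tensor. Sizes and
# rank are illustrative choices, not from the paper.
I, J, K, R = 8, 8, 8, 3
true = [rng.random((d, R)) for d in (I, J, K)]
obs = list({(int(rng.integers(I)), int(rng.integers(J)), int(rng.integers(K)))
            for _ in range(400)})
y = {(i, j, k): float(np.sum(true[0][i] * true[1][j] * true[2][k]))
     for (i, j, k) in obs}

# Unconstrained parameters P; nonnegativity is enforced only through the
# single-element mapping g(x) = softplus(x) > 0 applied element-wise.
def g(x):  return np.log1p(np.exp(x))        # softplus, always positive
def dg(x): return 1.0 / (1.0 + np.exp(-x))   # softplus' = sigmoid

P = [rng.standard_normal((d, R)) * 0.1 for d in (I, J, K)]  # free params
V = [np.zeros_like(p) for p in P]                           # momentum buffers
lr, beta = 0.05, 0.9

for epoch in range(200):
    for (i, j, k) in obs:
        a, b, c = g(P[0][i]), g(P[1][j]), g(P[2][k])  # nonnegative LFs
        err = float(np.sum(a * b * c)) - y[(i, j, k)]
        # Chain rule through g back to the unconstrained parameters.
        grads = (err * b * c * dg(P[0][i]),
                 err * a * c * dg(P[1][j]),
                 err * a * b * dg(P[2][k]))
        for m, row, gr in zip(range(3), (i, j, k), grads):
            V[m][row] = beta * V[m][row] - lr * gr  # momentum-incorporated SGD
            P[m][row] += V[m][row]

rmse = np.sqrt(np.mean([(float(np.sum(g(P[0][i]) * g(P[1][j]) * g(P[2][k])))
                         - y[(i, j, k)]) ** 2 for (i, j, k) in obs]))
print(f"training RMSE: {rmse:.4f}")
```

Because the optimizer never touches a constrained variable directly, any standard unconstrained SGD variant can be plugged in unchanged, which is the practical appeal of the mapping-domain idea described in the abstract.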