Concepts
Activation function
Artificial neural network
Robustness (evolution)
Convergence (economics)
Mathematics
Matrix (chemical analysis)
Applied mathematics
Computer science
Mathematical optimization
Control theory (sociology)
Artificial intelligence
Materials science
Control (management)
Economics
Composite material
Economic growth
Biochemistry
Chemistry
Gene
Authors
Bin Chai, Ke Zhang, Minghu Tan, Jingyu Wang
Identifier
DOI: 10.1080/00207160.2023.2170178
Abstract
The zeroing neural network (ZNN) offers a new method for solving the time-varying linear matrix equation. As a key component, the activation function directly affects the performance of the ZNN on this problem. Focusing on unifying prescribed-time convergence and strong robustness without changing the basic structure of the ZNN, this paper proposes a novel activation function for the first time. Compared with the activation functions commonly used in previous work, the novel activation function gives the ZNN superior performance in solving the time-varying linear matrix equation. First, it ensures global asymptotic convergence: the ZNN converges from random initial states to the theoretical solution. Second, it ensures prescribed-time convergence: the upper bound of the convergence time depends only on the parameters of the novel activation function and of the ZNN, which makes the convergence process easy to predict. Third, it ensures strong robustness: the solution still converges in various noisy environments. Theoretical analysis and comparative simulation experiments verify that the ZNN with the novel activation function exhibits these properties for both low- and high-dimensional time-varying linear matrix equations.
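As a rough illustration of the framework the abstract describes (not the paper's own method), the standard ZNN design for A(t)X(t) = B(t) defines the error E(t) = A(t)X(t) − B(t) and imposes the evolution Ė = −γΦ(E), which yields Ẋ = A⁻¹(Ḃ − ȦX − γΦ(E)). The sketch below integrates this with forward Euler and a plain linear activation Φ(E) = E as a stand-in for the paper's novel activation function; the example matrices, the function names, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative time-varying coefficient matrices (always invertible A).
def A(t):
    return np.array([[2 + np.sin(t),  np.cos(t)],
                     [-np.cos(t),     2 + np.sin(t)]])

def A_dot(t):  # dA/dt, computed analytically
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

def B(t):
    return np.array([[np.sin(t), 1.0],
                     [0.0,       np.cos(t)]])

def B_dot(t):  # dB/dt
    return np.array([[np.cos(t), 0.0],
                     [0.0,      -np.sin(t)]])

def phi(E):
    # Linear activation as a placeholder; the paper's novel activation
    # function (not reproduced here) would replace this.
    return E

def znn_solve(t_end=1.0, dt=1e-4, gamma=100.0, seed=0):
    """Integrate the ZNN dynamics and return (X, final residual norm)."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((2, 2))      # random initial state
    t = 0.0
    for _ in range(int(t_end / dt)):
        E = A(t) @ X - B(t)              # error E(t) = A X - B
        # ZNN design formula: A Xdot = Bdot - Adot X - gamma * phi(E)
        X_dot = np.linalg.solve(A(t), B_dot(t) - A_dot(t) @ X - gamma * phi(E))
        X = X + dt * X_dot               # forward Euler step
        t += dt
    return X, np.linalg.norm(A(t) @ X - B(t))
```

With the linear activation the error decays only exponentially (E(t) = E(0)e^{−γt}); the prescribed-time and noise-tolerance behavior claimed in the abstract is exactly what a specially designed activation function adds on top of this baseline.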