Extreme Learning Machine
Variable (mathematics)
Computer Science
Artificial Intelligence
Mathematics
Mathematical Analysis
Artificial Neural Network
Authors
Huihuang Lu, Weidong Zou, Yuxiang Li
Source
Journal: Journal of Shenzhen University Science and Engineering
[Science Press]
Date: 2024-05-01
Volume/Issue: 41(3): 264-273
Identifier
DOI: 10.3724/sp.j.1249.2024.03264
Abstract
To address the slow convergence and the stagnation of error decay in later iterations of the alternating direction method of multipliers (ADMM) for the regularized extreme learning machine (RELM), we propose a dynamic step-size ADMM-based RELM algorithm, denoted VAR-ADMM-RELM. The method iterates with dynamically decaying step sizes built on the ADMM algorithm and simultaneously constrains model complexity with both L1 and L2 regularization, so that the computed ELM output weights exhibit greater sparsity and robustness. We conduct comparative fitting, classification, and regression experiments against ELM, RELM, and ADMM-based L1-regularized ELM (ADMM-RELM) on the UCI and MedMNIST datasets. The results show that VAR-ADMM-RELM improves average classification accuracy by 1.94% and average regression prediction accuracy by 2.49% over ELM. It achieves a 3- to 5-fold speedup over the standard ADMM algorithm and exhibits better robustness and generalization in the presence of outliers. Furthermore, it approaches the modeling efficiency of standard ELM in high-dimensional, many-sample scenarios. The proposed algorithm effectively improves the convergence rate of ADMM and outperforms mainstream ELM algorithms.
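The abstract describes solving for the ELM output weights under combined L1 + L2 (elastic-net) regularization via ADMM with an adaptive penalty. The paper's exact VAR step-size schedule is not given in the abstract, so the sketch below substitutes the standard residual-balancing rule for adapting the ADMM penalty parameter; the function names (`elm_features`, `admm_elastic_net`) and all hyperparameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def elm_features(X, n_hidden, rng):
    """Random sigmoid hidden layer of an ELM (input weights are fixed, never trained)."""
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def admm_elastic_net(H, T, l1=0.001, l2=0.001, rho=1.0, n_iter=200):
    """Solve min_beta 0.5||H beta - T||^2 + l1*||beta||_1 + 0.5*l2*||beta||^2
    by ADMM. The penalty rho is adapted by residual balancing, which stands in
    for the paper's dynamically decaying step-size schedule (an assumption)."""
    m = H.shape[1]
    beta, z, u = np.zeros(m), np.zeros(m), np.zeros(m)
    HtH, HtT, I = H.T @ H, H.T @ T, np.eye(m)
    for _ in range(n_iter):
        # beta-update: a ridge-like linear solve
        beta = np.linalg.solve(HtH + (l2 + rho) * I, HtT + rho * (z - u))
        # z-update: soft-thresholding, the proximal operator of the L1 term
        v = beta + u
        z_new = np.sign(v) * np.maximum(np.abs(v) - l1 / rho, 0.0)
        # scaled dual update and residuals
        u += beta - z_new
        r = np.linalg.norm(beta - z_new)        # primal residual
        s = rho * np.linalg.norm(z_new - z)     # dual residual
        z = z_new
        # residual balancing: adapt rho and rescale the scaled dual variable
        if r > 10 * s:
            rho *= 2.0
            u /= 2.0
        elif s > 10 * r:
            rho /= 2.0
            u *= 2.0
    return z  # z carries the exact zeros produced by soft-thresholding
```

The split variable `z` (rather than `beta`) is returned because the soft-thresholding step sets small coefficients exactly to zero, which is where the sparsity of the output weights comes from.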