Covariate
Hyperparameter
Computer science
Reservoir computing
Kernel (algebra)
Polynomial
Universality (dynamical systems)
Algorithm
Mathematical optimization
Artificial intelligence
Machine learning
Applied mathematics
Theoretical computer science
Mathematics
Discrete mathematics
Recurrent neural network
Artificial neural network
Mathematical analysis
Physics
Quantum mechanics
Authors
Lyudmila Grigoryeva, Hannah Lim Jing Ting, Juan-Pablo Ortega
Source
Journal: Physical Review E [American Physical Society]
Date: 2025-03-14
Volume/Issue: 111 (3)
Identifier
DOI: 10.1103/physreve.111.035305
Abstract
Next-generation reservoir computing (NG-RC) has attracted much attention due to its excellent performance in spatiotemporal forecasting of complex systems and its ease of implementation. This paper shows that NG-RC can be encoded as a kernel ridge regression that makes training efficient and feasible even when the space of chosen polynomial features is very large. Additionally, an extension to an infinite number of covariates is possible, which makes the methodology agnostic with respect to the lags into the past that are considered as explanatory factors, as well as with respect to the number of polynomial covariates, an important hyperparameter in traditional NG-RC. We show that this approach has solid theoretical backing and good behavior based on kernel universality properties previously established in the literature. Various numerical illustrations show that these generalizations of NG-RC outperform the traditional approach in several forecasting applications.
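The core idea in the abstract — that NG-RC's explicit polynomial feature library can be replaced by kernel ridge regression, since a polynomial kernel implicitly spans the same monomials — can be illustrated with a minimal sketch. This is not the authors' implementation; the lag count, kernel degree, ridge value, and the logistic-map toy task are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def delay_embed(series, lags):
    # Stack each value with its (lags - 1) predecessors: one row per time step.
    T = len(series) - lags + 1
    return np.stack([series[i:i + lags] for i in range(T)])

def poly_kernel(A, B, degree=2, c=1.0):
    # (x . y + c)^degree implicitly spans all monomials up to `degree`,
    # mirroring NG-RC's polynomial feature library without building it.
    return (A @ B.T + c) ** degree

def fit_kernel_ridge(X, y, degree=2, ridge=1e-6):
    # Dual-form ridge regression: solve (K + ridge * I) alpha = y.
    # Cost scales with the number of samples, not the feature count.
    K = poly_kernel(X, X, degree)
    return np.linalg.solve(K + ridge * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, degree=2):
    # Predictions need only kernel evaluations against the training inputs.
    return poly_kernel(X_new, X_train, degree) @ alpha

# One-step forecasting of a logistic map, a toy stand-in for the
# spatiotemporal benchmarks discussed in the paper.
x = np.empty(300)
x[0] = 0.4
for t in range(299):
    x[t + 1] = 3.8 * x[t] * (1 - x[t])

lags = 3
Z = delay_embed(x[:-1], lags)   # inputs: lagged windows
y = x[lags:]                    # targets: the next value
alpha = fit_kernel_ridge(Z[:200], y[:200])
y_hat = predict(Z[:200], alpha, Z[200:])
print(np.max(np.abs(y_hat - y[200:])))  # small out-of-sample error
```

Because the logistic map is quadratic in the state, it lies in the span of the degree-2 kernel's implicit features, so the out-of-sample error is near machine precision here; the paper's contribution is showing this dual formulation stays tractable (and extends to infinitely many covariates) where the explicit feature matrix would not.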