Computer science
Controller (irrigation)
Cloud computing
Enhanced Data Rates for GSM Evolution (EDGE)
Edge computing
Edge of chaos
Algorithm
Artificial intelligence
Operating system
Agronomy
Biology
Authors
Robert M. Kent, Wendson A. S. Barbosa, Daniel J. Gauthier
Identifiers
DOI:10.1038/s41467-024-48133-3
Abstract
Machine learning provides a data-driven approach for creating a digital twin of a system – a digital model used to predict the system behavior. Having an accurate digital twin can drive many applications, such as controlling autonomous systems. Often, the size, weight, and power consumption of the digital twin or related controller must be minimized, ideally realized on embedded computing hardware that can operate without a cloud-computing connection. Here, we show that a nonlinear controller based on next-generation reservoir computing can tackle a difficult control problem: controlling a chaotic system to an arbitrary time-dependent state. The model is accurate, yet it is small enough to be evaluated on a field-programmable gate array typically found in embedded devices. Furthermore, the model only requires 25.0 ± 7.0 nJ per evaluation, well below other algorithms, even without systematic power optimization. Our work represents the first step in deploying efficient machine learning algorithms to the computing "edge."
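The abstract's key ingredient, next-generation reservoir computing, replaces a recurrent reservoir with polynomial features of time-delayed inputs and a linear readout trained by ridge regression. The sketch below is a minimal, hedged illustration of that idea on a chaotic logistic map, not the paper's actual controller or FPGA model; the delay depth `k`, the quadratic feature set, and the ridge parameter are illustrative choices.

```python
import numpy as np

def ngrc_features(x, k=2):
    """Build NG-RC features [1, delay taps, pairwise products of taps]
    for each time step of a scalar series x (illustrative choice)."""
    rows = []
    for t in range(k - 1, len(x)):
        taps = x[t - k + 1 : t + 1]
        quad = [taps[i] * taps[j] for i in range(k) for j in range(i, k)]
        rows.append(np.concatenate(([1.0], taps, quad)))
    return np.array(rows)

# Training data: the chaotic logistic map x_{n+1} = r x_n (1 - x_n).
r = 3.9
x = np.empty(500)
x[0] = 0.4
for n in range(499):
    x[n + 1] = r * x[n] * (1 - x[n])

k = 2
Phi = ngrc_features(x[:-1], k)   # features at times k-1 .. 498
target = x[k:]                   # next-step values to predict
ridge = 1e-8                     # small Tikhonov regularization
W = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(Phi.shape[1]),
                    Phi.T @ target)

# One-step prediction error is tiny here because the quadratic
# features contain x_n and x_n^2, which span the true map exactly.
pred = Phi @ W
print(np.max(np.abs(pred - target)))
```

Because the readout is a single small linear layer over a handful of polynomial features, an evaluation is just a short dot product, which is what makes this family of models plausible for low-power embedded hardware as the abstract describes.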