Keywords: Setpoint, Transferability, Computer Science, Control System, Transformer, Time Series, Process (Computing), Control Engineering, Machine Learning, Data Mining, Artificial Intelligence, Engineering, Voltage, Reuter, Electrical Engineering, Operating System
Authors
Niranjan Sitapure,Joseph Sang‐Il Kwon
Identifier
DOI:10.1016/j.compchemeng.2023.108339
Abstract
For prediction and real-time control tasks, machine-learning (ML)-based digital twins are frequently employed. However, while these models are typically accurate, they are custom-designed for individual systems, making system-to-system (S2S) transferability difficult. This occurs even when substantial similarities exist in the process dynamics across different chemical systems. To address this challenge, we developed a novel time-series-transformer (TST) framework that exploits the powerful transfer-learning capabilities inherent in transformer architectures. This was demonstrated using readily available process data obtained from different crystallizers operating under various operational scenarios. Using this extensive dataset, we trained a TST model (CrystalGPT) that exhibits remarkable S2S transferability not only across all pre-established systems but also to a previously unencountered system. CrystalGPT achieved a cumulative error across all systems eight times lower than that of existing ML models. Additionally, we coupled CrystalGPT with a predictive controller, reducing the variance in setpoint tracking to just 1%.
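The abstract does not detail CrystalGPT's internals, but the core operation of any time-series transformer is self-attention over a window of past process states, which lets one model weigh measurements from different regimes and is what makes cross-system transfer plausible. Below is a minimal, illustrative pure-Python sketch of scaled dot-product self-attention over a measurement window; it uses identity Q/K/V projections as a simplifying assumption, and the function and variable names are ours, not from the paper.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(window, d):
    """Scaled dot-product self-attention over a time window.

    window: list of T state vectors of length d (e.g. past crystallizer
    measurements such as concentration and temperature, normalized).
    Returns T context vectors, each a convex combination of the inputs.
    Identity Q/K/V projections are used to keep the sketch minimal.
    """
    T = len(window)
    scale = math.sqrt(d)
    out = []
    for i in range(T):
        # Similarity of time step i to every time step j, scaled by sqrt(d).
        scores = [sum(window[i][k] * window[j][k] for k in range(d)) / scale
                  for j in range(T)]
        weights = softmax(scores)  # attention weights sum to 1
        out.append([sum(weights[j] * window[j][k] for j in range(T))
                    for k in range(d)])
    return out
```

In a full TST, learned projection matrices, multiple heads, positional encodings, and stacked feed-forward layers surround this operation; transfer learning then amounts to pretraining those weights on data from many systems and fine-tuning (or using zero-shot) on a new one.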