Reinforcement learning
Computer science
Artificial intelligence
Authors
Yong Wang,Jingda Wu,Hongwen He,Zhongbao Wei,Fengchun Sun
Identifier
DOI:10.1038/s41467-025-58192-9
Abstract
Energy management technologies have significant potential to optimize electric vehicle performance and support global energy sustainability. However, despite extensive research, their real-world application remains limited due to reliance on simulations, which often fail to bridge the gap between theory and practice. This study introduces a real-world data-driven energy management framework based on offline reinforcement learning. By leveraging electric vehicle operation data, the proposed approach eliminates the need for manually designed rules or reliance on high-fidelity simulations. It integrates seamlessly into existing frameworks, enhancing performance after deployment. The method is tested on fuel cell electric vehicles, optimizing energy consumption and reducing system degradation. Real-world data from an electric vehicle monitoring system in China validate its effectiveness. The results demonstrate that the proposed method consistently achieves superior performance under diverse conditions. Notably, with increasing data availability, performance improves significantly, from 88% to 98.6% of the theoretical optimum after two updates. Training on over 60 million kilometers of data enables the learning agent to generalize across previously unseen and corner-case scenarios. These findings highlight the potential of data-driven methods to enhance energy efficiency and vehicle longevity through large-scale vehicle data utilization.

Editor's summary: Energy management technologies for electric vehicles often rely on manual design and simulations, limiting real-world application. Here, authors introduce a data-driven offline reinforcement learning framework that optimizes energy consumption and system degradation using historical data, achieving improved performance and adaptability.
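The core idea of offline reinforcement learning, as described in the abstract, is to learn a control policy entirely from a fixed log of past transitions, with no further interaction with the vehicle or a simulator. The paper's actual algorithm, state/action spaces, and reward are not given here; the sketch below is only a minimal illustration of batch Q-learning on synthetic logged data, with illustrative names and a toy reward that stands in for, e.g., lower energy consumption.

```python
import numpy as np

# Minimal offline (batch) Q-learning sketch: learn from a fixed log of
# transitions (s, a, r, s') with no environment interaction.
# Discrete states/actions are toy placeholders, not the paper's setup.

rng = np.random.default_rng(0)
n_states, n_actions = 4, 2
gamma, lr = 0.95, 0.1

# Synthetic "logged operation data": random transitions whose reward
# favors action 0 (stand-in for a lower-energy choice) in every state.
logged = [(int(rng.integers(n_states)), int(rng.integers(n_actions)),
           int(rng.integers(n_states))) for _ in range(500)]
logged = [(s, a, 1.0 if a == 0 else 0.0, s2) for s, a, s2 in logged]

Q = np.zeros((n_states, n_actions))
for _ in range(50):                       # repeated sweeps over the batch
    for s, a, r, s2 in logged:
        target = r + gamma * Q[s2].max()  # bootstrap from logged next state
        Q[s, a] += lr * (target - Q[s, a])

policy = Q.argmax(axis=1)                 # greedy policy from the batch
print(policy)
```

Because the toy reward always prefers action 0, the learned greedy policy selects action 0 in every state. Real offline RL methods add regularization toward the behavior policy to avoid overestimating actions that are rare in the log, a problem this unregularized sketch ignores.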