Authors
Yi Meng Chan,Jan Dufek
Identifiers
DOI: 10.1016/j.anucene.2024.110746
Abstract
To compute few-group nodal data, lattice codes first need to generate multi-group cross-sections for each constituent material within the lattice model. This generation process relies on continuous-energy cross-section libraries and is computationally expensive. Moreover, any change in the nuclide compositions or other state parameters necessitates repeating the process. To reduce the computational demands, we propose the application of a pre-trained representational model. This model, which integrates Deep Neural Network (DNN) and Principal Component Analysis (PCA) modules, is particularly beneficial in scenarios that require repeated multi-group data processing by the lattice code. In our previous research, we established that such a model can accurately generate multi-group data for fuel pellet materials. In the present study, we have broadened the scope of the model to encompass a more extensive range of materials typically found in pressurized water reactors, including zirc-alloy cladding and borated water moderators. We also show that the model can be trained over a wide spectrum of fuel enrichments. When integrated into lattice calculations, the errors introduced by the deep-learning-based representational model result in less than 1% deviation in the keff and pin-power distribution. We have further refined the model to also estimate the neutron fluxes in the fuel pellet and borated water. This refined model was subsequently employed to perform a flux-weighted collapse and generate few-group cross-section libraries for lattice calculations. The few-group libraries generated in this manner exhibit high accuracy, with a low average keff error and minimal errors in the pin-power distribution.
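As a rough illustration of the approach described in the abstract, the sketch below shows how a small DNN could map state parameters to PCA coefficients that are expanded back into a multi-group cross-section set, followed by a standard flux-weighted collapse into few groups (Sigma_G = sum_g sigma_g * phi_g / sum_g phi_g over each coarse group G). The paper's abstract gives no implementation details, so every name, layer size, group count, and state parameter below is an assumption made only for illustration, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code): a PyTorch MLP predicts PCA
# coefficients of a multi-group cross-section set from a few state parameters,
# and the result is collapsed into few groups with flux weighting.
import numpy as np
import torch
import torch.nn as nn

N_GROUPS = 70        # assumed multi-group structure
N_COMPONENTS = 20    # assumed number of retained principal components
N_STATE_PARAMS = 4   # assumed: enrichment, boron ppm, fuel temp, moderator temp


class XSRepresentation(nn.Module):
    """DNN that predicts PCA coefficients of a multi-group cross-section set."""

    def __init__(self, pca_components: np.ndarray, pca_mean: np.ndarray):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_STATE_PARAMS, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_COMPONENTS),
        )
        # PCA basis would be obtained offline from a library of reference cross-sections.
        self.register_buffer("components",
                             torch.tensor(pca_components, dtype=torch.float32))
        self.register_buffer("mean", torch.tensor(pca_mean, dtype=torch.float32))

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        coeffs = self.net(state)                      # (batch, N_COMPONENTS)
        return coeffs @ self.components + self.mean   # (batch, N_GROUPS)


def collapse_few_group(sigma_g: np.ndarray, flux_g: np.ndarray, group_edges):
    """Flux-weighted collapse: Sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g)."""
    few = []
    for lo, hi in zip(group_edges[:-1], group_edges[1:]):
        few.append(np.sum(sigma_g[lo:hi] * flux_g[lo:hi]) / np.sum(flux_g[lo:hi]))
    return np.array(few)


# Example usage with random placeholder data (no physical meaning).
pca_components = np.random.rand(N_COMPONENTS, N_GROUPS)
pca_mean = np.random.rand(N_GROUPS)
model = XSRepresentation(pca_components, pca_mean)

state = torch.tensor([[4.5, 600.0, 900.0, 580.0]])   # assumed state parameters
sigma_multi = model(state).detach().numpy()[0]       # predicted multi-group set
flux = np.random.rand(N_GROUPS)                      # flux would come from the refined model
sigma_few = collapse_few_group(sigma_multi, flux, group_edges=[0, 45, N_GROUPS])
print(sigma_few)                                     # two-group collapsed cross-sections
```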