Wurtzite crystal structure
Magnitude (astronomy)
Key (lock)
Materials science
Optoelectronics
Computer science
Physics
Computer security
Astronomy
Metallurgy
Zinc
Author
Yao Kang,Jian Chen,Yi Tong,Xinpeng Wang,Kun Duan,Jiaqi Wang,Xudong Wang,Dayu Zhou,Man Yao
Source
Journal: Chinese Physics
[Acta Physica Sinica, Chinese Physical Society and Institute of Physics, Chinese Academy of Sciences]
Date: 2024-12-04
Volume/Issue: 74 (2): 027701
Identifier
DOI: 10.7498/aps.74.20241520
Abstract
<sec>Emerging wurtzite ferroelectric materials have attracted significant interest because of their large spontaneous polarization (<i>P</i><sub>s</sub>). However, the key factors that govern <i>P</i><sub>s</sub> remain poorly understood. Here, a machine-learning regression model is developed to predict <i>P</i><sub>s</sub> using a dataset of 40 binary and 89 simple ternary wurtzite materials. Features are extracted from elemental properties, crystal parameters, and electronic properties. Feature selection is carried out with the Boruta algorithm and distance correlation analysis, yielding a comprehensive machine-learning model. Furthermore, SHapley Additive exPlanations (SHAP) analysis identifies the average cation ionic potential (IPi_Aave) and the lattice parameter (<i>a</i>) as the principal determinants of <i>P</i><sub>s</sub>, with IPi_Aave having the strongest effect: a lower IPi_Aave corresponds to a lower <i>P</i><sub>s</sub>. In addition, <i>a</i> is approximately negatively correlated with <i>P</i><sub>s</sub>.</sec><sec>This multifactorial analysis fills the existing gap in understanding the determinants of <i>P</i><sub>s</sub>, and lays a foundation for evaluating emerging wurtzite materials and accelerating the discovery of high-performance ferroelectric materials.</sec><sec>The dataset used in this work can be accessed in the Scientific Data Bank: <ext-link ext-link-type="uri" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="https://www.doi.org/10.57760/sciencedb.j00213.00073">https://www.doi.org/10.57760/sciencedb.j00213.00073</ext-link>.</sec>
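The abstract's feature-selection step pairs the Boruta algorithm with distance correlation analysis. As a minimal sketch of the distance-correlation screening idea only (not the authors' actual pipeline; the feature names, threshold, and data below are synthetic placeholders), the statistic can be implemented in pure NumPy:

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation of two 1-D samples (Székely et al.).

    Returns a value in [0, 1]. Unlike Pearson correlation, it also
    detects nonlinear dependence, which is why it is useful for
    screening features against a target such as P_s.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Pairwise distance matrices
    a = np.abs(x[:, None] - x[None, :])
    b = np.abs(y[:, None] - y[None, :])
    # Double-center each distance matrix
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    dcov2 = (A * B).mean()      # squared sample distance covariance
    dvar_x = (A * A).mean()
    dvar_y = (B * B).mean()
    denom = np.sqrt(dvar_x * dvar_y)
    return float(np.sqrt(dcov2 / denom)) if denom > 0 else 0.0

# Hypothetical screening on synthetic data: keep features whose
# distance correlation with the target exceeds an arbitrary threshold.
rng = np.random.default_rng(0)
n = 200
features = {
    "IPi_Aave_like": rng.normal(size=n),  # stand-in, strongly tied to target
    "noise_feature": rng.normal(size=n),  # unrelated feature
}
target = 2.0 * features["IPi_Aave_like"] + 0.1 * rng.normal(size=n)
scores = {name: distance_correlation(f, target) for name, f in features.items()}
selected = [name for name, s in scores.items() if s > 0.5]
```

In a real pipeline this screening would complement Boruta's tree-based relevance test, since distance correlation flags dependencies that a linear filter would miss; the 0.5 cutoff here is illustrative, not taken from the paper.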