Keywords
Computer science
Orientation (vector space)
Fiber
Convolutional neural network
Artificial intelligence
Segmentation
Volume (thermodynamics)
Moment of inertia
Torque (physics)
Computation
Ground truth
Materials science
Algorithm
Pattern recognition (psychology)
Composite material
Geometry
Mathematics
Physics
Quantum mechanics
Classical mechanics
Authors
Patrick Bleiziffer,Jürgen Hofmann,Robert Zboray,Thorsten Wiege,R. Herger
Identifier
DOI:10.1016/j.engappai.2021.104351
Abstract
The mechanical properties of glass fiber reinforced polymers (GFRP) are significantly governed by the orientation of the fibers in the composite. Micro X-ray computed tomography (CT) imaging offers a way of determining the fiber orientation in a non-destructive fashion. Various approaches have been presented to compute the direction of the fibers based on fiber tracking or weighted volume algorithms. In this work we present two novel approaches, one employing convolutional neural networks (CNNs) and the other directional analysis by inertia tensor evaluation (ITE). We establish a workflow based on molecular dynamics simulations to efficiently create synthetic training data for the CNN. The two methods are applied to two experimental CT scans of the GFRP polyamide 66 from two different components, featuring different CT resolutions, fiber lengths and volume fractions. The CNN model trained on synthetic data predicts fiber orientations consistently and with accuracy comparable to best-in-class commercially available products. We observe speed-ups of at least a factor of 4 on CPUs and about a factor of 50 on GPUs. A striking feature of this approach is that the ground truth of the training data is perfectly known, so no time-consuming manual labeling of fibers is needed. The proposed ITE method is very robust and particularly suited to lower-resolution CT scans, as it does not require the evaluation of gradients. Both methods extend the toolbox of weighted volume approaches and are well-suited to predicting orientations of the densely packed fibers often encountered in industrial practice. In addition, predictions by the trained CNN model can be run on standard office hardware, which makes them particularly interesting for industrial environments.
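The abstract does not give the details of the ITE method, but the core idea of estimating an elongated object's direction from an inertia tensor can be sketched as follows: a straight fiber has its smallest moment of inertia about its own axis, so the eigenvector of the intensity-weighted inertia tensor belonging to the smallest eigenvalue approximates the fiber direction. This is a minimal illustrative sketch, not the authors' implementation; the function name and the single-fiber test volume are hypothetical.

```python
import numpy as np

def fiber_direction_ite(volume):
    """Illustrative sketch: estimate the dominant fiber direction in a
    voxel neighborhood from the intensity-weighted inertia tensor.
    The fiber axis is the eigenvector of the smallest eigenvalue."""
    coords = np.argwhere(volume > 0).astype(float)
    weights = volume[volume > 0].astype(float)
    # center of mass of the intensity distribution
    com = np.average(coords, axis=0, weights=weights)
    r = coords - com
    # inertia tensor: I = sum_n w_n * (|r_n|^2 * Id - r_n r_n^T)
    r2 = np.sum(r ** 2, axis=1)
    outer = np.einsum('ni,nj->nij', r, r)
    inertia = np.einsum('n,nij->ij',
                        weights,
                        r2[:, None, None] * np.eye(3) - outer)
    # eigh returns eigenvalues in ascending order;
    # column 0 is the axis of minimal moment of inertia
    evals, evecs = np.linalg.eigh(inertia)
    return evecs[:, 0]

# synthetic straight fiber aligned with the z axis
vol = np.zeros((9, 9, 32))
vol[4, 4, :] = 1.0
direction = fiber_direction_ite(vol)  # unit vector, up to sign
```

Because the tensor is built from voxel intensities directly, no image gradients are needed, which is consistent with the abstract's remark that ITE avoids gradient evaluation and is robust at lower CT resolutions.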