Construct (Python library)
Computer science
Context (archaeology)
Pattern recognition (psychology)
Block (permutation group theory)
Task (project management)
Field (mathematics)
Data mining
Proportion (ratio)
Mechanism (biology)
Artificial intelligence
Feature (linguistics)
Mathematics
Management
Geometry
Linguistics
Quantum mechanics
Pure mathematics
Programming language
Economics
Biology
Paleontology
Philosophy
Physics
Epistemology
Authors
Ming Tang, Yuanyuan Li, Wei Yao, Lingyu Hou, Qichun Sun, Jiahang Chen
Identifier
DOI:10.1088/1361-6501/ac0ca8
Abstract
In industry, defect detection involves two tasks, defect classification and localization, which makes it difficult to ensure accuracy on both and keeps the problem challenging in practical applications. After analyzing the strengths and weaknesses of current defect detection methods, this paper proposes a defect detection method based on an attention mechanism and multi-scale max pooling (MSMP). To improve detection accuracy, we use ResNet50 as the pre-trained backbone to construct a two-stage detection model, which serves as the baseline network, and introduce the attention mechanism and the MSMP module on top of it. The attention mechanism enhances the feature maps extracted at each stage of ResNet50, so that the network concentrates on regions that are effective for the final detection result and ignores background regions that are irrelevant or even harmful to detection. The proposed MSMP incrementally enlarges the receptive field, distinguishes the most significant context features, and effectively improves detection precision. The proposed method is trained and tested on the NEU-DET dataset. Compared with the baseline network without any improvement, it achieves a 3.65% mAP improvement. Compared with a feature fusion mechanism, it improves mAP by 4.03%. Compared with attention mechanisms such as spatial attention and the SE block, it improves mAP by 1.51% and 1.03%, respectively. Compared with the one-stage detectors SSD and YOLO-V4, it improves mAP by 5.01% and 4.92%, respectively. In addition, the classification accuracy of our model reaches 94.73%.
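The abstract does not give implementation details for the MSMP module, but the idea of pooling one feature map at several kernel sizes to enlarge the receptive field can be sketched as follows. This is a minimal NumPy illustration in the spirit of SPP-style multi-scale max pooling; the function names (`max_pool2d`, `msmp`), the stride-1 "same"-padding choice, and the kernel sizes are assumptions, not the paper's actual configuration.

```python
import numpy as np

def max_pool2d(x, k):
    """Max-pool a 2D map with an odd kernel k, stride 1 and 'same'
    padding, so the output keeps the input's spatial size."""
    h, w = x.shape
    pad = k // 2
    # Pad with -inf so border windows take the max over real values only.
    xp = np.pad(x, pad, mode="constant", constant_values=-np.inf)
    out = np.empty_like(x)
    for i in range(h):
        for j in range(w):
            out[i, j] = xp[i:i + k, j:j + k].max()
    return out

def msmp(feat, kernels=(5, 9, 13)):
    """Multi-scale max pooling: pool the same feature map at several
    kernel sizes and stack the results together with the original,
    combining context from increasingly large receptive fields."""
    maps = [feat] + [max_pool2d(feat, k) for k in kernels]
    return np.stack(maps, axis=0)  # shape: (1 + len(kernels), H, W)
```

Because each pooled map keeps the input resolution, the stacked output can be fed to a subsequent convolution that mixes the scales, which is how SPP-like blocks are typically attached to a backbone.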