Computer science
Convolutional neural network
Artificial intelligence
Histogram
Agriculture
Machine learning
Deep learning
Histogram of oriented gradients
Integrated pest management
Agricultural engineering
Food security
Classifier (UML)
Ecology
Engineering
Biology
Image (mathematics)
Authors
Salman Ahmed,Safdar Nawaz Khan Marwat,Ghassen Ben Brahim,Waseem Ullah Khan,Shahid Nawaz Khan,Ala Al‐Fuqaha,Sławomir Kozieł
Identifier
DOI:10.1038/s41598-024-83012-3
Abstract
Despite the seemingly inexorable risk of food insecurity hanging over the world, especially in developing countries such as Pakistan where traditional agricultural methods still dominate, technology creates opportunities that can help avert food crises in the coming years. At present, the agricultural sector worldwide is rapidly moving towards technology-driven Precision Agriculture (PA) approaches to enhance crop protection and boost productivity. The literature highlights the limitations of traditional approaches, such as the risk of human error in recognizing and counting pests and the need for trained labor. Against this backdrop, this paper proposes a smart IoT-based pest detection platform for integrated pest management and for monitoring crop field conditions, which is of crucial help to farmers in real field environments. The proposed system comprises a physical prototype of a smart insect trap equipped with embedded computing to detect and classify pests. To this end, a dataset was created featuring images of oriental fruit flies captured under varying illumination conditions in guava orchards. The dataset contains more than 1000 images categorized into two groups: (1) fruit fly and (2) not fruit fly. A convolutional neural network (CNN) classifier was trained on the following features: (1) Haralick features, (2) histogram of oriented gradients, (3) Hu moments, and (4) color histogram. The system achieved a recall of 86.2% on real test images, with a mean average precision (mAP) of 97.3%. Additionally, the proposed model was compared with numerous machine learning (ML) and deep learning (DL) models to verify its efficacy. The comparative results indicated that the proposed model achieved the best performance, with an accuracy of 97.5%, precision of 92.82%, recall of 98.92%, F1-score of 95.00%, specificity of 95.90%, and a false negative rate (FNR) of 5.88%.
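The abstract names four hand-crafted descriptors (Haralick features, histogram of oriented gradients, Hu moments, and a color histogram) that feed the classifier. The sketch below shows one way such a combined feature vector could be assembled; the library choices (OpenCV, mahotas, scikit-image), the image size, and all bin/cell parameters are assumptions for illustration only, and the paper's CNN classifier itself is not reproduced here.

```python
# Hypothetical sketch of the feature extraction stage described in the abstract.
# Libraries and parameter values are assumptions, not the authors' implementation.
import cv2
import numpy as np
import mahotas
from skimage.feature import hog

def extract_features(bgr_image, size=(200, 200)):
    """Return a single 1-D descriptor for one trap image."""
    img = cv2.resize(bgr_image, size)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

    # (1) Haralick texture features: mean over the four GLCM directions -> 13 values
    haralick = mahotas.features.haralick(gray).mean(axis=0)

    # (2) Histogram of oriented gradients on the grayscale image
    hog_vec = hog(gray, orientations=9, pixels_per_cell=(16, 16),
                  cells_per_block=(2, 2), feature_vector=True)

    # (3) Hu moments: 7 shape descriptors, log-scaled for numerical stability
    hu = cv2.HuMoments(cv2.moments(gray)).flatten()
    hu = -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)

    # (4) Color histogram in HSV space (8 bins per channel), normalized
    hist = cv2.calcHist([hsv], [0, 1, 2], None, [8, 8, 8],
                        [0, 180, 0, 256, 0, 256])
    hist = cv2.normalize(hist, hist).flatten()

    # Concatenate into one descriptor; in the paper this representation feeds
    # the binary fruit-fly / not-fruit-fly classifier.
    return np.hstack([haralick, hog_vec, hu, hist])
```

For reference, the reported specificity and FNR follow the standard confusion-matrix definitions, specificity = TN / (TN + FP) and FNR = FN / (FN + TP), so a low FNR complements the high recall reported above.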