Concept tags: partial least squares regression; near-infrared spectroscopy; mean squared error; biological systems; residuals; mathematics; correlation coefficient; second derivative; pattern recognition (psychology); artificial intelligence; computer science; statistics; algorithms; physics; optics; biology; mathematical analysis
Authors
Chenlong Fan, Ying Liu, Tao Cui, Mengmeng Qiao, Yu Yang, Weijun Xie, Yuping Huang
Source
Journal: Foods
[Multidisciplinary Digital Publishing Institute]
Date: 2024-12-23
Volume/Issue: 13 (24): 4173
Citations: 3
Identifier
DOI: 10.3390/foods13244173
Abstract
Rapid and accurate detection of protein content is essential for ensuring maize quality. Near-infrared (NIR) spectroscopy faces limitations from surface effects and sample-homogeneity issues when measuring the protein content of whole maize grains; working with maize grain powder instead markedly improves data quality and model prediction accuracy. This study explores a rapid method for detecting protein content in maize grain powder using NIR reflectance spectra in the 940–1660 nm range. The raw spectra were preprocessed with Savitzky-Golay (S-G) smoothing, multiplicative scatter correction (MSC), standard normal variate (SNV), and the first derivative (1D). NIR spectra of powders from different maize varieties were collected, and protein content was quantified with Partial Least Squares Regression (PLSR), Support Vector Machine (SVM), and Extreme Learning Machine (ELM) models. Feature wavelengths were selected with the Successive Projections Algorithm (SPA) and Uninformative Variable Elimination (UVE) to further improve model accuracy. Experimental results indicated that the PLSR model preprocessed with 1D + MSC performed best, achieving a root mean square error of prediction (RMSEP) of 0.3 g/kg, a correlation coefficient (Rp) of 0.93, and a residual predictive deviation (RPD) of 3. The associated methods and theory provide a scientific basis for the quality control and processing of maize.