Authors
Ziheng Feng, Jiliang Zhao, Liunan Suo, Heguang Sun, Huiling Long, Hao Yang, Xiaoyu Song, Haikuan Feng, Bo Xu, Guijun Yang, Chunjiang Zhao
Abstract
Near real-time maize phenology monitoring is crucial for field management, cropping-system adjustment, and yield estimation. Most phenological monitoring methods are post-seasonal and rely heavily on high-frequency time-series data; they are not applicable on unmanned aerial vehicle (UAV) platforms because time-series UAV imagery is costly to acquire and UAV-based phenological monitoring methods are scarce. To address these challenges, we employed the Synthetic Minority Oversampling Technique (SMOTE) for sample augmentation, aiming to resolve the small-sample modelling problem. Moreover, we used enhanced "separation" and "compactness" feature-selection methods to identify input features from multiple data sources. In this process, we incorporated dynamic multi-source data-fusion strategies involving vegetation indices (VI), color indices (CI), and texture features (TF). A two-stage neural network combining a Convolutional Neural Network (CNN) and a Long Short-Term Memory network (LSTM) is proposed to identify maize phenological stages (sowing, seedling, jointing, trumpet, tasseling, maturity, and harvesting) on UAV platforms. The results indicate that the dataset generated by SMOTE closely resembles the measured dataset. Among the dynamic data-fusion strategies, the VI-TF combination proves most effective, with CI-TF and VI-CI close behind. Notably, as more data sources are integrated, the model's demand for input features declines significantly. In particular, the CNN-LSTM model based on the fusion of three data sources proved highly reliable across three validation datasets. For Dataset 1 (Beijing Xiaotangshan, 2023: data from 12 UAV flight missions), the model achieved an overall accuracy (OA) of 86.53%, with precision (Pre), recall (Rec), F1 score (F1), false acceptance rate (FAR), and false rejection rate (FRR) of 0.89, 0.89, 0.87, 0.11, and 0.11, respectively.
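The SMOTE augmentation step can be illustrated with a minimal numpy sketch: each synthetic sample is formed by interpolating a randomly chosen minority sample toward one of its k nearest minority neighbours. This is a generic illustration of the standard SMOTE procedure, not the authors' implementation; the function name `smote` and all parameters here are illustrative.

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic samples from the minority set X_min by
    interpolating each seed toward one of its k nearest neighbours."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    k = min(k, n - 1)
    # pairwise Euclidean distances among minority samples
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # exclude self-matches
    nn = np.argsort(d, axis=1)[:, :k]            # k nearest neighbour indices
    seeds = rng.integers(0, n, n_new)            # a random seed per new sample
    neigh = nn[seeds, rng.integers(0, k, n_new)] # a random neighbour per seed
    gap = rng.random((n_new, 1))                 # interpolation factor in [0, 1)
    return X_min[seeds] + gap * (X_min[neigh] - X_min[seeds])

# toy minority class of three 2-D feature vectors
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
X_syn = smote(X, n_new=4, k=2, rng=0)
```

Because every synthetic point lies on a segment between two real minority samples, the augmented set stays inside the convex hull of the measured data, which is why SMOTE-generated samples tend to resemble the measured distribution.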
The model also generalized well to Dataset 2 (Beijing Xiaotangshan, 2023: data from 6 UAV flight missions) and Dataset 3 (Beijing Xiaotangshan, 2022: data from 4 UAV flight missions), with OAs of 89.4% and 85%, respectively. Meanwhile, the model has a low demand for input features, requiring only 54.55% (99) of all features. The findings of this study offer novel insights into near real-time crop phenology monitoring and provide technical support for agricultural field management and cropping-system adaptation.
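The reported metrics can be reproduced from a confusion matrix. A minimal sketch follows, using macro-averaged per-class scores; since the abstract does not define FAR and FRR, this sketch assumes FAR = 1 − precision and FRR = 1 − recall, consistent with the complementary values reported (Pre 0.89 / FAR 0.11, Rec 0.89 / FRR 0.11). A three-class toy example stands in for the seven phenological stages.

```python
import numpy as np

def phenology_metrics(y_true, y_pred, n_classes):
    """Macro-averaged OA, precision, recall, F1, FAR, FRR from labels.
    FAR/FRR definitions are assumptions (complements of Pre/Rec)."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                                # rows: true, cols: predicted
    tp = np.diag(cm).astype(float)
    prec = tp / np.maximum(cm.sum(axis=0), 1)        # per-class precision
    rec = tp / np.maximum(cm.sum(axis=1), 1)         # per-class recall
    f1 = 2 * prec * rec / np.maximum(prec + rec, 1e-12)
    return {
        "OA": tp.sum() / cm.sum(),
        "Pre": prec.mean(), "Rec": rec.mean(), "F1": f1.mean(),
        "FAR": 1.0 - prec.mean(), "FRR": 1.0 - rec.mean(),
    }

m = phenology_metrics([0, 0, 1, 1, 2, 2], [0, 1, 1, 1, 2, 0], n_classes=3)
```

Macro averaging weights each phenological stage equally, which matters here because SMOTE was needed precisely to balance under-represented stages.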