Yu Yao, Luyu Zhao, Xiang Gao, Hengbin Wang, Junyi Liu, Xiaodong Zhang, Yuanyuan Zhao, Shaoming Li, Zhe Liu
Abstract
Leaf area index (LAI) is a key indicator of crop photosynthesis and growth status. For monitoring winter wheat LAI at the scale of large unmanned farms, satellite imagery is temporally stable but lacks the spatial resolution required for precise monitoring, while UAV imagery is spatially detailed but susceptible to weather interference, which degrades spectral consistency. To address this, this study took winter wheat at a large unmanned farm in Zouping City, Shandong Province as the research object and proposed a "coarse-to-fine" two-step fusion method for UAV and Sentinel-2 imagery. Based on the fusion results, a feature set for winter wheat LAI inversion was constructed, and SHAP analysis was used to evaluate feature contributions and screen the optimal feature combination. Machine learning models, including XGBoost and Random Forest, were employed to invert LAI at key growth stages of winter wheat under different data fusion modes, with model hyperparameters optimized by grid search, in order to analyze the impact of the fusion method on LAI inversion across growth stages. Experiments showed that the two-step fusion method significantly improved spectral consistency and accuracy: the correlation coefficient between the fusion results and Sentinel-2 NDVI reached 0.82. Nine key features were selected for model construction, among which the near-infrared band and plant height contributed most positively to LAI inversion. Under the two-step fusion mode, the Random Forest model performed best, achieving an overall R² of 0.895, an MAE of 0.216 m²/m², and an RMSE of 0.295 m²/m². Inversion accuracy varied across growth stages, with R² reaching 0.86 at the jointing stage, accurately capturing dynamic LAI changes. This study provides an efficient and feasible solution for precision monitoring of large-area winter wheat, supporting precision agriculture and food security.
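The grid-search-tuned Random Forest inversion described in the abstract can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the nine features and the synthetic LAI values are placeholders standing in for the fused UAV/Sentinel-2 feature set, and the hyperparameter grid is an assumption.

```python
# Illustrative sketch of Random Forest LAI inversion with grid search.
# Data are synthetic stand-ins; feature meanings (NIR, plant height, ...)
# and grid values are assumptions, not the authors' configuration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error

rng = np.random.default_rng(42)
n = 300
# Nine hypothetical features from the fused imagery; LAI (m^2/m^2) is
# simulated as a noisy function dominated by the first two features,
# mimicking the reported importance of the NIR band and plant height.
X = rng.uniform(0.0, 1.0, size=(n, 9))
lai = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0.0, 0.2, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, lai, test_size=0.3, random_state=0)

# Grid search over Random Forest hyperparameters (grid is illustrative).
grid = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=3,
    scoring="r2",
)
grid.fit(X_tr, y_tr)

pred = grid.predict(X_te)
print(f"R2={r2_score(y_te, pred):.3f}  "
      f"MAE={mean_absolute_error(y_te, pred):.3f}  "
      f"RMSE={np.sqrt(mean_squared_error(y_te, pred)):.3f}")
```

The same train/test split and metrics (R², MAE, RMSE) mirror how the abstract reports accuracy, so swapping in the real fused features would reproduce the evaluation structure.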