Estimating leaf area index using unmanned aerial vehicle data: shallow vs. deep machine learning algorithms
Shuaibing Liu, Xiuliang Jin, Chenwei Nie, Siyu Wang, Xun Yu, Minghan Cheng, Mingchao Shao, Zixu Wang, Nuremanguli Tuohuti, Yi Bai, Yadong Liu
Plant Physiology, IF: 7.394
https://doi.org/10.1093/plphys/kiab322
Abstract
Measuring leaf area index (LAI) is essential for evaluating crop growth and estimating yield, thereby facilitating high-throughput phenotyping of maize (Zea mays). LAI estimation models use multi-source data from unmanned aerial vehicles (UAVs), but the use of multimodal data to estimate maize LAI, and the effects of tassels and soil background, remain understudied. Our research aims to (1) determine how multimodal data contribute to LAI estimation and propose a framework for estimating LAI from remote-sensing data, (2) evaluate the robustness and adaptability of an LAI estimation model that uses multimodal data fusion and deep neural networks (DNNs) for single and whole growth stages, and (3) explore how soil background and maize tasseling affect LAI estimation. To construct multimodal datasets, our UAV collected red–green–blue, multispectral, and thermal infrared images. We then developed partial least squares regression (PLSR), support vector regression (SVR), and random forest regression models to estimate LAI, as well as a deep learning model with three hidden layers. The fused multimodal data accurately estimated maize LAI. The DNN model provided the best estimate (coefficient of determination [R²] = 0.89, relative root mean square error [rRMSE] = 12.92%) for a single growth period, and the PLSR model provided the best estimate (R² = 0.70, rRMSE = 12.78%) for the whole growth period. Tassels reduced the accuracy of LAI estimation, whereas the soil background provided additional image feature information that improved accuracy. These results indicate that multimodal data fusion using low-cost UAVs and DNNs can accurately and reliably estimate LAI for crops, which is valuable for high-throughput phenotyping and high-spatial-precision farmland management.
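The following is a minimal sketch, not the authors' code, of the shallow-vs-deep comparison the abstract describes: PLSR, SVR, and random forest regressors against a neural network with three hidden layers, scored by R² and rRMSE. The feature matrix, LAI values, layer widths, and hyperparameters are placeholders standing in for the paper's fused RGB/multispectral/thermal features and field-measured LAI.

```python
# Hypothetical comparison of shallow regressors vs. a three-hidden-layer DNN
# for LAI estimation. X stands in for fused UAV features (RGB, multispectral,
# thermal); y stands in for ground-measured LAI. All settings are illustrative.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))                                   # placeholder multimodal features
y = 3.0 + 0.5 * X[:, :5].sum(axis=1) + rng.normal(scale=0.3, size=300)  # placeholder LAI values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "PLSR": PLSRegression(n_components=5),
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
    "RF": RandomForestRegressor(n_estimators=500, random_state=0),
    # Three hidden layers, mirroring the DNN described in the abstract;
    # the layer widths chosen here are assumptions, not the paper's.
    "DNN": make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 32, 16), max_iter=2000, random_state=0),
    ),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = np.ravel(model.predict(X_test))                        # PLSR returns a 2-D array
    r2 = r2_score(y_test, pred)
    # rRMSE: RMSE normalized by the mean of the observed values, in percent
    rrmse = np.sqrt(mean_squared_error(y_test, pred)) / y_test.mean() * 100
    print(f"{name}: R2 = {r2:.2f}, rRMSE = {rrmse:.2f}%")
```

In practice the paper evaluates such models per growth stage and across the whole season; the sketch above only illustrates the metric definitions (R², rRMSE) and the shallow-vs-deep model lineup on synthetic data.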