Location: Dale Bumpers National Rice Research Center
Title: Characterizing variation in rice agronomic performance with multi-temporal UAV imagery and manifold learning
Author:
Bellis, Emily - Arkansas State University
Farag, Fared - Arkansas State University
Huggins, Trevis
Edwards, Jeremy
Reba, Michele
Hashem, Ahmed - Arkansas State University
McClung, Anna
Submitted to: Rice Technical Working Group Meeting Proceedings
Publication Type: Proceedings
Publication Acceptance Date: 1/11/2023
Publication Date: 1/16/2024
Citation: Bellis, E.S., Farag, F., Huggins, T.D., Edwards, J., Reba, M.L., Hashem, A.A., McClung, A.M. 2024. Characterizing variation in rice agronomic performance with multi-temporal UAV imagery and manifold learning. Rice Technical Working Group. Hot Springs, Arkansas, February 20-23, 2023. Meeting Proceedings.
Interpretive Summary:
Technical Abstract: High-throughput field phenotyping via unoccupied aerial vehicles (UAVs) is a promising technology for accelerating plant improvement, but UAV-based phenotyping has not yet been widely deployed in rice breeding programs. Compared to manual trait characterization, UAV-based phenotypes can be captured more rapidly, in larger numbers, and at higher resolution, and could replace some subjective measurements. We evaluated the utility of UAVs for accurate rice phenotype characterization across four experiments replicated in two years at the USDA/ARS National Rice Research Center in Stuttgart, AR. Research plots in the four experiments (n = 252) spanned a range of experimental conditions, including four nitrogen rates, two seeding rates, and 21 diverse genotypes. Multispectral images (blue [475 nm], green [560 nm], red [668 nm], red edge [717 nm], near-infrared [840 nm], and thermal [11,000 nm] bands) were collected approximately every 10 days, at 11-12 time points across the growing season. Models based solely on derived image features (p = 50 vegetation indices and image texture features) were strongly predictive of variation in rice phenotypes, including days to 25 percent, 50 percent, and 100 percent heading (R2 = 0.92, 0.91, and 0.90, respectively), leaf temperature (R2 = 0.86), yield (R2 = 0.84), percent lodging (R2 = 0.81), and percent stand (R2 = 0.62), though predictive accuracy was lower for chlorophyll meter readings (R2 = 0.24).
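As an illustration of the kind of derived image feature described above, the sketch below computes the Normalized Difference Vegetation Index (NDVI) from per-pixel reflectance in the red (668 nm) and near-infrared (840 nm) bands. NDVI is a standard vegetation index; the abstract does not enumerate which specific indices were among the 50 features used, so this is a representative example only.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    `eps` guards against division by zero for dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical per-pixel reflectance values for the 668 nm and 840 nm bands
red = np.array([0.08, 0.10, 0.12])
nir = np.array([0.45, 0.40, 0.30])
print(ndvi(nir, red))  # values near 1 indicate dense green vegetation
```

In practice a plot-level feature would be a summary (e.g. the mean) of such per-pixel values over each research plot at each flight date.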
Automated feature selection using Least Absolute Shrinkage and Selection Operator (LASSO) regression indicated that the best-performing models typically used information from multiple image features and from at least half of all available time points. However, vegetation index trajectories differed considerably between 2021 and 2022, challenging the transferability of models trained in 2021. To improve transferability across years, we developed an unsupervised machine learning approach based on manifold learning to automatically account for phenological variation in multispectral, multitemporal UAV images. Our results suggest that UAVs capture useful information for rice phenotypic characterization, which can be further leveraged using modern deep learning techniques. Genetic associations with temporal phenotypes from image embeddings represent an additional benefit of using UAV-based approaches to characterize high-dimensional phenotypes in breeding programs.
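The LASSO-based feature selection described above can be sketched with scikit-learn's `LassoCV`, which tunes the shrinkage penalty by cross-validation and drives uninformative coefficients to exactly zero. The data here are synthetic stand-ins (the plot count n = 252 and feature count p = 50 come from the abstract; the response and the informative features are invented for illustration), not the study's actual measurements.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_plots, n_features = 252, 50  # plot and feature counts from the abstract
X = rng.normal(size=(n_plots, n_features))
# Synthetic response depending on a few features, standing in for a trait such as yield
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.5 * X[:, 7] + rng.normal(scale=0.5, size=n_plots)

# Standardize so the L1 penalty treats all features on the same scale
X_std = StandardScaler().fit_transform(X)
model = LassoCV(cv=5, random_state=0).fit(X_std, y)

# Features with nonzero coefficients are the ones the model retained
selected = np.flatnonzero(model.coef_ != 0)
print(f"{len(selected)} of {n_features} features retained:", selected)
```

The same pattern extends to the multi-timepoint setting by concatenating each flight date's features into one design matrix, so the nonzero coefficients indicate both which features and which time points carry signal.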
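The manifold-learning embedding mentioned above can be illustrated generically: each plot's multitemporal feature matrix is flattened into one long vector, and a nonlinear dimensionality-reduction method maps the collection of plots into a low-dimensional embedding. The abstract does not name the specific algorithm, so Isomap from scikit-learn is used here purely as a placeholder, with synthetic data shaped to the abstract's plot, time point, and feature counts.

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(1)
n_plots, n_timepoints, n_features = 252, 11, 50  # counts from the abstract

# Flatten each plot's (time points x features) matrix into one vector per plot
X = rng.normal(size=(n_plots, n_timepoints * n_features))

# Map plots into a 2-D embedding; neighbors/components are illustrative choices
embedding = Isomap(n_components=2, n_neighbors=10).fit_transform(X)
print(embedding.shape)  # (252, 2)
```

Embedding coordinates learned this way can then serve as derived temporal phenotypes, e.g. as inputs to genetic association analyses.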