Location: Dale Bumpers National Rice Research Center
Title: Manifold and spatiotemporal learning on multispectral unoccupied aerial system imagery for phenotype prediction
Authors:
Farag, Fared - Arkansas State University
Huggins, Trevis
Edwards, Jeremy
McClung, Anna - Retired ARS Employee
Hashem, Ahmed - University of Arkansas
Causey, Jason - Arkansas State University
Bellis, Emily - Arkansas State University
Submitted to: The Plant Phenome Journal
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 10/8/2024
Publication Date: 11/3/2024
Citation: Farag, F., Huggins, T.D., Edwards, J., McClung, A., Hashem, A., Causey, J., Bellis, E.S. 2024. Manifold and spatiotemporal learning on multispectral unoccupied aerial system imagery for phenotype prediction. The Plant Phenome Journal. https://doi.org/10.1002/ppj2.70006.
DOI: https://doi.org/10.1002/ppj2.70006

Interpretive Summary: Unoccupied aircraft systems (UASs) are increasingly used in agriculture to predict plant traits from remotely sensed imagery. These UAS datasets are collected at multiple times and across multiple field locations, which presents challenges for developing prediction models that generalize well across years and environments. The poor transferability of trained models from one agricultural context to another hampers the widespread adoption of UAS technology. This study aimed to evaluate and develop advanced machine learning approaches for robust and accurate plant trait prediction solely from UAS imagery. In a two-year field study, five experiments were conducted, involving two nitrogen rates, two seeding rates, hybrid and inbred varieties, and a genetic diversity study. Licensed and certified drone pilots collected UAS imagery at multiple timepoints during the 2021 and 2022 growing seasons. Upon harvest, grain yield was determined for each plot. The study revealed that dimension reduction approaches for machine learning could enhance yield prediction and improve generalization to new contexts and growing seasons. Only models capable of learning higher-order interactions predicted yield robustly across different growing seasons, while simpler models performed well when trained and tested on the same season and cultivars. These findings extend the toolkit for UAS image analysis, supporting improved trait prediction in plant breeding and precision agriculture. The findings of this study are valuable to plant breeders and producers, offering promising avenues for refining predictive capabilities using remote-sensing data. The results advance the field of remote sensing by providing a new benchmark dataset for rice, which the research community can use to evaluate new machine learning methods. The study highlights the potential of UAS applications in plant breeding and benefits rice producers by generating better yield maps for commercial fields. Furthermore, the new methods developed through this research will accelerate the genetic improvement of rice, enhancing overall crop yield and increasing profits for producers.

Technical Abstract: Timeseries data captured by unoccupied aircraft systems (UASs) are increasingly used for agricultural applications requiring accurate prediction of plant phenotypes from remotely sensed imagery. However, prediction models often fail to generalize from one year to the next or to new environments. Here, we investigate the ability of various machine learning approaches to improve yield prediction accuracy in new environments, using multispectral timeseries imagery acquired on a set of rice experiments with different management treatments and varieties. We trained deep learning models that perform automated feature extraction and compared them against a suite of other approaches. Across all tested methods, the best performance on a held-out growing season was achieved by a spatiotemporal model (a 3D convolutional neural network [3D-CNN]) trained on raw images.
Methods based on dimension reduction of temporal imagery, using either principal components analysis or manifold learning, also generalized well but required manual extraction of image features (i.e., vegetation indices and image texture properties) to achieve strong yield prediction performance. Manifold learning on raw imagery was better suited to the prediction of phenological traits, owing to the preservation of local structure in image embeddings at some timepoints. Together with a new benchmark dataset for rice, our results help extend the toolkit for UAS image analysis, contributing to improved phenotype prediction in plant breeding and precision agriculture applications.
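To make the feature-extraction step concrete, the sketch below computes NDVI and two gray-level co-occurrence matrix (GLCM) texture statistics for a single plot image, assuming scikit-image. The band inputs, rescaling, and choice of GLCM statistics are illustrative assumptions, not the authors' exact feature set.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def plot_features(red, nir):
    """NDVI plus GLCM texture for one plot image (illustrative feature set)."""
    ndvi = (nir - red) / (nir + red + 1e-8)               # standard NDVI formula
    scaled = (ndvi - ndvi.min()) / (np.ptp(ndvi) + 1e-8)  # rescale to [0, 1]
    gray = (255 * scaled).astype(np.uint8)                # GLCM needs integer gray levels
    glcm = graycomatrix(gray, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    return {
        "ndvi_mean": float(ndvi.mean()),
        "contrast": float(graycoprops(glcm, "contrast")[0, 0]),
        "homogeneity": float(graycoprops(glcm, "homogeneity")[0, 0]),
    }

# Hypothetical reflectance patches for one field plot
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.2, (32, 32))
nir = rng.uniform(0.3, 0.6, (32, 32))
print(plot_features(red, nir))
```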
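The dimension-reduction step can be sketched as follows, assuming scikit-learn: per-plot temporal feature vectors are embedded with PCA or a manifold method before fitting a simple regressor. The Isomap embedding, ridge regressor, and matrix shapes are hypothetical stand-ins; the paper's specific manifold algorithm and downstream model may differ.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap
from sklearn.linear_model import Ridge

# Hypothetical design matrix: one row per plot, columns are per-timepoint
# features (e.g., 8 flights x 12 indices/texture stats), plus plot yields.
rng = np.random.default_rng(1)
X, y = rng.normal(size=(200, 8 * 12)), rng.normal(size=200)
X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]

for reducer in (PCA(n_components=10), Isomap(n_components=10)):
    Z_tr = reducer.fit_transform(X_tr)   # embed the temporal features
    Z_te = reducer.transform(X_te)       # map held-out plots into the embedding
    r2 = Ridge().fit(Z_tr, y_tr).score(Z_te, y_te)
    print(f"{type(reducer).__name__}: R^2 = {r2:.2f}")
```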
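Finally, a minimal sketch of a spatiotemporal 3D-CNN of the kind named in the abstract, assuming PyTorch, with spectral bands as channels and flight dates as the depth axis. The layer sizes and input dimensions are invented for illustration and do not reproduce the published architecture.

```python
import torch
import torch.nn as nn

class YieldCNN3D(nn.Module):
    """Illustrative 3D-CNN for yield regression from multispectral timeseries."""
    def __init__(self, n_bands=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(n_bands, 16, kernel_size=3, padding=1),  # convolve over (time, H, W)
            nn.ReLU(),
            nn.MaxPool3d((1, 2, 2)),                           # pool spatial dims only
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),                           # global pool over time and space
        )
        self.head = nn.Linear(32, 1)                           # scalar yield prediction

    def forward(self, x):  # x: (batch, bands, timepoints, height, width)
        return self.head(self.features(x).flatten(1))

model = YieldCNN3D()
x = torch.randn(4, 5, 8, 64, 64)  # 4 plots, 5 bands, 8 flights, 64x64 pixels
print(model(x).shape)             # torch.Size([4, 1])
```

Convolving jointly over time and space lets such a network learn higher-order spatiotemporal interactions, which the study found necessary for robust cross-season yield prediction.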