|HASHEM, AHMED - Arkansas State University|
|BELLIS, EMILY - Arkansas State University|
|FARAQ, FARED - Arkansas State University|
Submitted to: Meeting Abstract
Publication Type: Abstract Only
Publication Acceptance Date: 8/1/2022
Publication Date: 11/10/2022
Citation: McClung, A.M., Hashem, A., Bellis, E., Huggins, T.D., Faraq, F., Reba, M.L., Edwards, J. 2022. Use of spectral imaging to predict agronomic performance in research and commercial rice fields. Meeting Abstract. ASA-CSSA-SSSA International Annual Meeting, Baltimore, Maryland.
Technical Abstract: Sensor technologies are evolving for use in monitoring and predicting crop productivity. These sensors can generate various vegetation indices that are indicators of plant health and nutritional status. However, the accuracy of these indices can be affected by rice variety (genetics), soil variability, management practices, and environmental conditions. Imaging of research plots can reveal how the relationship between vegetation indices and plant productivity changes due to these factors, but the extent to which information at plot scale translates to a whole field is poorly understood. Multispectral imagery was collected on a weekly basis from five replicated research studies conducted in Stuttgart, AR during 2021 to determine relationships with agronomic trait data. Plots were evaluated for percent stand, days to harvest, percent lodging, yield, milling yield, and grain dimensions; plant height, days to heading, chlorophyll content, and leaf temperature were measured several times through the season on a subset of the plots. For comparison with the research plot study in Stuttgart, similar agronomic and multispectral imaging data were collected on a nearby commercial field. An unmanned aircraft system (UAS) was equipped with a multispectral and thermal sensor to collect five multispectral bands: blue (475 nm), green (560 nm), red (668 nm), red edge (717 nm), and near-IR (842 nm), plus a radiometrically calibrated thermal infrared band (8–14 µm). A total of 23 vegetation indices and thermal values were determined per flight, and we also included rice genotype, growth stage, nitrogen treatment, and eight image texture features representing the spatial distribution of color intensities as inputs to the model. We compared several machine learning approaches, including gradient boosted decision trees and deep neural networks.
Our results highlight the utility of UAS-captured data from diverse experimental plots to inform accurate rice yield prediction at commercial scales.
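As an illustration of how a vegetation index is derived from spectral bands such as those listed above, the sketch below computes NDVI, one widely used index (not necessarily among the 23 used in this study), from the near-IR (842 nm) and red (668 nm) bands; the reflectance values shown are hypothetical.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    A small epsilon guards against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical per-pixel reflectances from the 842 nm (NIR) and 668 nm (red) bands
nir_band = np.array([0.60, 0.55, 0.20])
red_band = np.array([0.10, 0.15, 0.18])
print(ndvi(nir_band, red_band))
```

Healthy, dense canopy typically yields NDVI near 1, while bare soil or stressed vegetation yields values near 0; other indices (e.g., red-edge-based ones) follow the same band-arithmetic pattern.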