
Research Project: Gene Discovery and Crop Design for Current and New Rice Management Practices and Market Opportunities

Location: Dale Bumpers National Rice Research Center

Title: Rice yield prediction using unmanned aircraft systems

HASHEM, AHMED - Arkansas State University
MORRIS, MATTHEW - Morris, Inc
McClung, Anna
Huggins, Trevis
FARAQ, FARED - Arkansas State University
Reba, Michele
BELLIS, EMILY - Arkansas State University
Edwards, Jeremy

Submitted to: Meeting Abstract
Publication Type: Abstract Only
Publication Acceptance Date: 11/5/2021
Publication Date: 1/31/2022
Citation: Hashem, A.A., Morris, M., McClung, A.M., Huggins, T.D., Faraq, F., Reba, M.L., Bellis, E.S., Edwards, J. 2022. Rice yield prediction using unmanned aircraft systems. Abstract. 25th National Conservation Systems Cotton and Rice Conference, January 31 - February 2, 2022. Jonesboro, Arkansas.

Interpretive Summary:

Technical Abstract: Emerging sensor technologies continue to evolve and are essential to monitoring and predicting crop productivity. These sensors can generate various vegetation indices that are indicators of plant health and nutritional status. However, the accuracy of these indices can be affected by rice variety (genetics), soil variability, management practices, and environmental conditions. Imaging of small experimental plots can reveal how the relationship between vegetation indices and plant productivity changes due to these factors, but the extent to which information at plot scale translates to a whole field is poorly known. A field study was conducted at the USDA-ARS Dale Bumpers National Rice Research Center in Stuttgart, AR during 2021 to determine the relationship between agronomic trait observations and multi-spectral imaging. The study included 30 plots, each 4.9 m2, planted with the hybrid XP753. Plots were evaluated for % stand, days to harvest, % lodging, yield, milling yield, and grain dimensions, with plant height, days to heading, chlorophyll content, and leaf temperature measured several times through the season on a subset of the plots. For comparison with the research plot study in Stuttgart, similar agronomic and multi-spectral imaging data were collected in a commercial field planted with the hybrid RiceTec FullPage 7321 near Carlisle, AR. Multispectral imagery was collected weekly in the commercial field and in the research plots and was used to monitor and predict crop productivity. An unmanned aircraft system (UAS) was equipped with the Altum multispectral and thermal sensor to collect five multispectral bands: blue (475 nm), green (560 nm), red (668 nm), red edge (717 nm), and near-IR (842 nm), plus a radiometrically calibrated LWIR thermal infrared band (8-14 µm). Data collection occurred within two hours around noon.
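As a minimal sketch of how a vegetation index is derived from the bands described above, the snippet below computes NDVI from red and near-IR reflectance. The reflectance values are hypothetical placeholders, not data from the study; in practice they would be per-pixel values from the calibrated orthomosaics, summarized per plot.

```python
import numpy as np

# Hypothetical calibrated reflectance values (scaled to [0, 1]) for a few
# pixels of one plot; real values come from the orthomosaic rasters.
red = np.array([0.08, 0.10, 0.07])   # red band (668 nm)
nir = np.array([0.45, 0.50, 0.42])   # near-IR band (842 nm)

# NDVI is one of the vegetation indices computable from these bands.
ndvi = (nir - red) / (nir + red)

# A per-plot summary statistic of the index can serve as a model feature.
plot_ndvi = ndvi.mean()
```

The same pattern (a band-arithmetic formula followed by a per-plot summary) extends to the other indices extracted per plot per flight.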
Prior to each flight, calibration images were collected with a calibrated reflectance panel and Downwelling Light Sensor (DLS) to produce radiometrically calibrated orthomosaics. All flights were conducted at 120 m above ground level, providing a ground sample distance of 5 cm per pixel for the spectral bands and 81 cm per pixel for the thermal band. Flight plans were developed with 75% front/side overlap. Pix4D Mapper was used to stitch the images together and create a GeoTIFF file. A total of 23 vegetation indices and thermal values were developed and extracted per plot per flight. A continuous dataset of 24 outputs per flight was created from twelve fly-overs captured from June 14 until September 14. Agronomic field data were collected across the season, at approximately the flight day/time. To evaluate the potential for spectral data from small experimental plots to inform yield prediction at field scale, we trained models on data from the 2021 research plots in Stuttgart, AR, and evaluated their ability to predict yield for the hybrid variety growing in a commercial field. In addition to the 24 index features, we also included rice genotype, growth stage, nitrogen treatment, and eight image texture features representing the spatial distribution of color intensities as inputs to the model. We compared several machine learning-based approaches, including gradient boosted decision trees and deep neural networks (DNN). Trained models exhibited good performance on a held-out set of small experimental plots (root mean square error: 0.70 ton/ha) and translated well to a commercial field of the same variety. Our results highlight the utility of UAS-captured data from diverse experimental plots to inform accurate rice yield prediction at commercial scales.
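A hedged sketch of the modeling step described above: fitting a gradient boosted regression model on plot-level features and scoring it by RMSE on held-out plots. The feature matrix and yields here are synthetic placeholders standing in for the index, texture, and agronomic features from the study; this illustrates the workflow, not the authors' actual model configuration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the plot-level feature table: in the study the
# columns would be the 24 index/thermal outputs plus texture and
# agronomic features, and y would be plot yield in ton/ha.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 24))
y = 8.0 + 0.5 * X[:, 0] + rng.normal(scale=0.3, size=200)

# Hold out a subset of plots, mirroring the held-out evaluation set.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

# Root mean square error on the held-out plots.
rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
```

Evaluating on a geographically separate commercial field, as the study does, is a stricter test than a random held-out split, since it probes transfer across sites rather than within-site generalization.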