
Research Project: Precision Integrated Weed Management in Conventional and Organic Crop Production Systems

Location: Sustainable Agricultural Systems Laboratory

Title: Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton

item SAPKOTA, BISHWA - Texas A&M University
item POPESCU, SORIN - Texas A&M University
item RAJAN, NITHYA - Texas A&M University
item LEON, RAMON - North Carolina State University
item REBERG-HORTON, CHRIS - North Carolina State University
item Mirsky, Steven

Submitted to: Scientific Reports
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 10/31/2022
Publication Date: 11/15/2022
Citation: Sapkota, B., Popescu, S., Rajan, N., Leon, R., Reberg-Horton, C.S., Mirsky, S.B., Bagavathiannan, M. 2022. Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton. Scientific Reports. 12. Article 19580.

Interpretive Summary: There is increasing interest in being able to spot-treat weeds in the field rather than spraying the full field. Spot-treatment of weeds decreases the amount of herbicide that must be applied, which provides economic and environmental benefits. However, the ability to efficiently spot-treat weeds requires advanced technology that can detect weeds in the field, tell them apart from the cash crop, and estimate their biomass. Current methods of weed detection rely on training algorithms on large amounts of data. This paper discusses a more efficient means of training algorithms. The methodology outlined in this paper is useful to research scientists and brings US farmers one step closer to realizing the ability to spot-treat weeds in the field, decreasing the economic and environmental costs of herbicide use.

Technical Abstract: Site-specific treatment of weeds in agricultural landscapes has gained importance in recent years because of its economic savings and minimal environmental impact. Over the years, advanced precision systems have been developed for site-specific weed treatment, and weed detection is usually the most crucial component of these systems. Different detection methods have been developed and tested, but recent developments in neural networks have offered great prospects. However, a major limitation of neural network models is the large volume of data required for training. The current study explores an alternative to real images to address this issue. Synthetic images were generated with various strategies, using plant instances clipped from UAV-borne real images, to train a powerful convolutional neural network (CNN) known as "Mask R-CNN" for weed detection and segmentation. The study was conducted on morningglories (MG) and grass weeds (Grass) infesting cotton. Biomass for individual weeds was also collected in the field for biomass modeling using the detection and segmentation results. Results showed comparable performance between the synthetic and real images. Around 40-50 plant instances were sufficient to generate synthetic images that yielded optimal performance. Row orientation of cotton in the synthetic images was beneficial compared with random orientation. Synthetic images generated with automatically clipped plant instances performed similarly to those generated with manually clipped instances. Synthetic images built from fake plant instances produced by a generative adversarial network (GAN) did not perform as effectively as those built from real plant instances. The canopy mask area predicted weed biomass with R2 values of 0.66 and 0.46 for MG and Grass, respectively. The findings of this study offer valuable insights for future efforts to use synthetic images for weed detection, segmentation, and biomass estimation in row crops.
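The two core ideas in the abstract — building a synthetic training image by pasting clipped plant instances onto a background, and then regressing weed biomass on canopy mask area — can be illustrated with a minimal sketch. This is a hypothetical toy implementation, not the authors' code: the instance shapes, placement coordinates, and biomass numbers below are invented for illustration, and a real pipeline would use actual UAV imagery and a trained Mask R-CNN to produce the masks.

```python
# Hypothetical sketch (not the authors' pipeline): composite clipped plant
# instances onto a soil-colored background to form a synthetic image, then
# fit a simple linear model of biomass vs. canopy mask area.
import numpy as np

rng = np.random.default_rng(42)

def paste_instance(canvas, mask_canvas, instance, inst_mask, top, left):
    """Paste a clipped plant instance onto the canvas where its mask is set."""
    h, w = inst_mask.shape
    region = canvas[top:top + h, left:left + w]
    region[inst_mask] = instance[inst_mask]           # copy plant pixels only
    mask_canvas[top:top + h, left:left + w] |= inst_mask

# Toy soil background and a toy circular "plant" instance.
H, W = 128, 128
background = np.full((H, W, 3), (120, 90, 60), dtype=np.uint8)
weed_mask_total = np.zeros((H, W), dtype=bool)

yy, xx = np.mgrid[:20, :20]
inst_mask = (yy - 10) ** 2 + (xx - 10) ** 2 <= 81    # circular canopy, r = 9
instance = np.zeros((20, 20, 3), dtype=np.uint8)
instance[inst_mask] = (40, 140, 50)                  # green plant pixels

# Row-oriented placement (the study found row orientation beneficial):
for col in (20, 60, 100):
    paste_instance(background, weed_mask_total, instance, inst_mask, 50, col)

canopy_area = int(weed_mask_total.sum())
print("synthetic canopy pixels:", canopy_area)

# Biomass ~ mask area: ordinary least squares on invented (area, biomass)
# pairs; in the study this relation gave R2 of 0.66 (MG) and 0.46 (Grass).
areas = np.array([120.0, 250.0, 410.0, 630.0, 800.0])
biomass = 0.02 * areas + rng.normal(0, 0.5, areas.size)   # toy data
slope, intercept = np.polyfit(areas, biomass, 1)
pred = slope * areas + intercept
r2 = 1 - ((biomass - pred) ** 2).sum() / ((biomass - biomass.mean()) ** 2).sum()
print(f"R^2 of biomass~area fit: {r2:.2f}")
```

The mask-union step is why per-instance clipping matters: every pasted pixel is known to be weed canopy, so pixel-perfect segmentation labels come for free with each generated image, sidestepping the manual annotation that large real-image datasets require.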