
Research Project: Coordinated Precision Application Technologies for Sustainable Pest Management and Crop Protection

Location: Application Technology Research

Title: Detection and infected area segmentation of apple fire blight using image processing and deep transfer learning for site-specific management

item Mahmud, Md Sultan - Tennessee State University
item He, Long - Pennsylvania State University
item Zahid, Azlan - Texas A&M University
item Heinemann, Paul - Pennsylvania State University
item Choi, Daeun - University of Florida
item Krawczyk, Grzegorz - Pennsylvania State University
item Zhu, Heping

Submitted to: Computers and Electronics in Agriculture
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 4/12/2023
Publication Date: 4/23/2023
Citation: Mahmud, M., He, L., Zahid, A., Heinemann, P., Choi, D., Krawczyk, G., Zhu, H. 2023. Detection and infected area segmentation of apple fire blight using image processing and deep transfer learning for site-specific management. Computers and Electronics in Agriculture. 209. Article #107862.

Interpretive Summary: Fire blight is a devastating bacterial disease that can seriously damage apple and other fruit trees. Accurate detection of symptoms can prevent subsequent spread of the disease and reduce economic losses; however, automatic precision detection techniques for this disease are not yet available. In this research, a hybrid unmanned aerial vehicle (UAV) and ground-based system was investigated for rapid detection and segmentation of infected leaves in apple trees. An advanced deep learning model was trained on combined images acquired from both UAV and ground systems to develop a fire blight disease detector. The UAV-based imaging system provided a quick way to acquire large amounts of data from the orchard; however, the scarcity of fire blight infection at the tops of canopies yielded few infected images, resulting in insufficient data for developing prediction models. In comparison, the ground-based imaging system captured enough images of fire blight infections on the lower side branches. Consequently, the deep learning detector identified fire blight with up to 90% accuracy in complex, uncontrolled orchard environments. This hybrid system thus demonstrated promising potential for future development of precision disease detection and infected-area segmentation technology to assist the apple and tree fruit industries in timely, site-specific disease management.

Technical Abstract: Advanced sensing technologies and deep learning models are needed for automatic recognition of pathogens to protect trees in orchards. This study developed a fire blight disease detection and infected area segmentation system using image processing and deep learning approaches to automate the detection process in a complex apple orchard environment for site-specific management. Two types of images were acquired: multispectral images from an unmanned aerial vehicle (UAV) using a multispectral camera, and red-green-blue (RGB) images from the ground using two different cameras. Multispectral images were preprocessed and used for image feature analysis by calculating vegetation indices, including excessive blue (ExB), normalized difference vegetation index (NDVI), green normalized difference vegetation index (GNDVI), red-edge normalized difference vegetation index (RENDVI), modified ratio vegetation index (RVI), and triangular blueness index (TBI). Vegetation indices were calculated from a total of 60 multispectral images (30 healthy and 30 fire blight infected). Results showed that RVI was the most sensitive to fire blight infection among the six indices. A support vector machine model was used to classify unhealthy tree canopies. A Mask Region-based Convolutional Neural Network (Mask R-CNN) deep learning model was developed from RGB images of infected trees. A total of 880 images were used for training, 220 for validation, and another 110 for testing the trained Mask R-CNN model. A precision of 92.8% and recall of 91.2% were obtained in detecting infected canopies using a ResNet-101 backbone and an intersection over union (IoU) threshold of 0.7. The high precision demonstrates the effectiveness of Mask R-CNN for the identification and segmentation of fire blight infection in images taken under complex orchard conditions.
These results demonstrate the potential of this non-invasive sensing method for detecting disease in commercial fruit production and for site-specific removal of infected canopies.
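The normalized-difference vegetation indices named in the abstract (NDVI, GNDVI, RENDVI) all follow the same standard form, a band difference divided by a band sum. Below is a minimal sketch of those three standard formulas using NumPy; the band arrays and reflectance values are hypothetical, and the paper's exact definitions of ExB, modified RVI, and TBI are not reproduced here.

```python
import numpy as np

EPS = 1e-9  # avoids division by zero on dark pixels

def ndvi(nir, red):
    # NDVI = (NIR - R) / (NIR + R)
    return (nir - red) / (nir + red + EPS)

def gndvi(nir, green):
    # GNDVI = (NIR - G) / (NIR + G)
    return (nir - green) / (nir + green + EPS)

def rendvi(nir, red_edge):
    # RENDVI = (NIR - RE) / (NIR + RE)
    return (nir - red_edge) / (nir + red_edge + EPS)

# Toy per-pixel reflectances: a healthy canopy pixel (high NIR, low red)
# and a stressed pixel (higher red reflectance) -> lower NDVI.
nir = np.array([0.60, 0.55])
red = np.array([0.08, 0.30])
print(ndvi(nir, red))
```

In practice these would be applied per-pixel to the preprocessed multispectral orthomosaic before thresholding or feeding index statistics to the support vector machine classifier.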
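The reported precision (92.8%) and recall (91.2%) are computed by matching predicted detections to ground-truth regions at an IoU threshold of 0.7. A minimal sketch of that evaluation logic, assuming axis-aligned bounding boxes and a simple greedy one-to-one matching (the paper's exact matching procedure is not specified here):

```python
def iou(box_a, box_b):
    # Boxes are (x1, y1, x2, y2); intersection over union of two
    # axis-aligned rectangles.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def precision_recall(preds, gts, thresh=0.7):
    # Greedy matching: each ground-truth region may be claimed by
    # at most one prediction; a match at IoU >= thresh is a true positive.
    matched, tp = set(), 0
    for p in preds:
        for i, g in enumerate(gts):
            if i not in matched and iou(p, g) >= thresh:
                matched.add(i)
                tp += 1
                break
    precision = tp / len(preds) if preds else 0.0
    recall = tp / len(gts) if gts else 0.0
    return precision, recall
```

For instance, one correct detection plus one false alarm against a single ground-truth region yields precision 0.5 and recall 1.0; Mask R-CNN additionally scores mask-level IoU, but the box-level logic is the same.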