
Research Project: Advancement of Sensing Technologies for Food Safety and Security Applications

Location: Environmental Microbial & Food Safety Laboratory

Title: Deep learning-based plant organ segmentation and phenotyping of sorghum plants using LiDAR point cloud

Patel, Ajay Kumar - Chungnam National University
Park, Eun-Sung - Chungnam National University
Lee, Hongseok - Rural Development Administration - Korea
Priya, G.G. Lakshmi - Chungnam National University
Kim, Hangi - Chungnam National University
Joshi, Rahul - Chungnam National University
Arief, Muhammad Akbar - Chungnam National University
Kim, Moon
Baek, Insuck
Cho, Byoung-Kwan - Chungnam National University

Submitted to: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 7/2/2023
Publication Date: N/A
Citation: N/A

Interpretive Summary: Plant phenotyping with big data plays an important role in current efforts by scientists and plant breeders to overcome the threats of global climate change and population growth. In low-rainfall regions, sorghum is a particularly important crop for providing sustenance to human populations. For high-throughput phenotyping of sorghum and other essential crops, LiDAR (light detection and ranging) sensors can rapidly measure traits such as plant height, crown diameter, and compactness to monitor plant health. This study used LiDAR 3D point clouds and four deep-learning models to describe the physical attributes of greenhouse-grown sorghum plants, and the results were validated against manually measured attributes. The results demonstrated that the LiDAR technique is useful for rapid, high-throughput phenotyping of sorghum and may also be an effective strategy for other plants. The method thus shows great potential for helping plant producers and distributors phenotype plants more rapidly and effectively, supporting efforts to maintain and increase crop yields under difficult environmental conditions.

Technical Abstract: Increasing food demand, global climatic variation, and population growth have spurred efforts to increase crop yield through plant phenotyping in the age of big data. High-throughput phenotyping of sorghum at the plant and organ levels is vital in molecular plant breeding to increase crop yield. LiDAR (light detection and ranging) sensors provide 3D point clouds of plants with high precision, high resolution, and rapid measurement. However, robust algorithms are needed to extract the phenotypic traits of sorghum plants from LiDAR 3D point clouds. This study applied four 3D point cloud-based deep learning models, PointNet, PointNet++, PointCNN, and dynamic graph CNN (DGCNN), to the segmentation of sorghum plants, and phenotypic traits were then extracted from the segmentation results. Plant samples were grown under controlled conditions and scanned at various developmental stages. The extracted phenotypic traits were validated against manually measured traits of the sorghum plants. PointNet++ outperformed the other three deep learning models, providing the best segmentation results with a mean accuracy of 91.5%. Six phenotypic traits (plant height, plant crown diameter, plant compactness, stem diameter, panicle length, and panicle width) were calculated from the segmentation results of the PointNet++ model, and the coefficients of determination (R^2) against the manual measurements were 0.97, 0.96, 0.94, 0.90, 0.95, and 0.88, respectively. These results show that LiDAR 3D point clouds, combined with deep learning techniques, have good potential for rapid and accurate measurement of sorghum phenotypic traits.
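The trait-extraction and validation steps described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the trait formulas (plant height as the vertical extent of a segmented plant cloud, crown diameter as the largest horizontal pairwise distance) and the plain R^2 validation are assumptions about how such measurements are commonly computed from point clouds; the paper's six traits would each require organ-level segmentation output.

```python
import numpy as np

def extract_traits(points):
    """Derive two simple phenotypic traits from a segmented plant point cloud.

    points: (N, 3) array of x, y, z coordinates for one plant (hypothetical
    input format). Returns (plant_height, crown_diameter).
    """
    pts = np.asarray(points, dtype=float)
    # Plant height: vertical extent of the cloud along the z axis.
    height = pts[:, 2].max() - pts[:, 2].min()
    # Crown diameter: largest pairwise distance in the horizontal (x, y) plane.
    xy = pts[:, :2]
    diffs = xy[:, None, :] - xy[None, :, :]   # (N, N, 2) pairwise offsets
    diameter = np.sqrt((diffs ** 2).sum(axis=-1)).max()
    return height, diameter

def r_squared(measured, predicted):
    """Coefficient of determination between manually measured and
    LiDAR-derived values of one trait across plants."""
    m = np.asarray(measured, dtype=float)
    p = np.asarray(predicted, dtype=float)
    ss_res = ((m - p) ** 2).sum()             # residual sum of squares
    ss_tot = ((m - m.mean()) ** 2).sum()      # total sum of squares
    return 1.0 - ss_res / ss_tot
```

The O(N^2) pairwise-distance step is fine for downsampled clouds; for dense scans, a convex hull of the horizontal projection would give the same diameter more cheaply.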