
Research Project: Characterizing Antimicrobial Resistance in Poultry Production Environments

Location: Bacterial Epidemiology & Antimicrobial Resistance Research

Title: A machine vision-based method optimized for restoring broiler chicken images occluded by feeding and drinking equipment

Authors
item GUO, YANGYANG - University Of Georgia
item AGGREY, SAMUEL - University Of Georgia
item Oladeinde, Adelumola - Ade
item JOHNSON, JASMINE - University Of Georgia
item ZOCK, GREGORY - University Of Georgia
item CHAI, LILONG - University Of Georgia

Submitted to: Animals
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 1/6/2021
Publication Date: 1/8/2021
Citation: Guo, Y., Aggrey, S.E., Oladeinde, A.A., Johnson, J., Zock, G.S., Chai, L. 2021. A machine vision-based method optimized for restoring broiler chicken images occluded by feeding and drinking equipment. Animals. 11(1):123. https://doi.org/10.3390/ani11010123.
DOI: https://doi.org/10.3390/ani11010123

Interpretive Summary: We previously demonstrated that broiler chickens in broiler houses could be counted and located using a machine vision-based method; however, the accuracy of the method was limited by occlusion from poultry house equipment. Equipment in the poultry house occluded the top-view images of broiler chickens and limited the efficiency of target detection. In this study, we sought to improve the method by restoring broiler chicken images blocked by feeders and drinkers. To do this, we developed and tested linear and elliptical fitting filling methods under different occlusion scenarios to fill the occluded broiler chicken areas. The filling methods correctly restored the broiler chicken image more than 80% of the time. This study provides an approach for enhancing the quality of images collected using vision-based methods and serves as a proof of concept for developing an automatic system to monitor flock health in a commercial production system.
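The elliptical fitting idea can be illustrated with a minimal NumPy sketch: fit a conic to the visible part of a bird's outline, then fill the whole interior of the fitted ellipse, occluded part included. This is a plausible reconstruction under stated assumptions, not the paper's exact algorithm, and the function names are hypothetical:

```python
import numpy as np

def fit_ellipse(xs, ys):
    # Least-squares conic fit: a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1.
    # Hypothetical helper; the paper's actual fitting procedure may differ.
    A = np.column_stack([xs**2, xs * ys, ys**2, xs, ys])
    coef, *_ = np.linalg.lstsq(A, np.ones(len(xs)), rcond=None)
    return coef

def fill_ellipse_mask(coef, shape, inside_point):
    # Mark every pixel on the same side of the fitted conic as a known
    # interior point, giving the restored (filled) bird region.
    a, b, c, d, e = coef
    g = lambda x, y: a * x**2 + b * x * y + c * y**2 + d * x + e * y - 1.0
    yy, xx = np.mgrid[: shape[0], : shape[1]]
    return np.sign(g(xx, yy)) == np.sign(g(*inside_point))

# Toy demo: an elliptical "bird" whose boundary is partly occluded.
t = np.linspace(0.0, 2.0 * np.pi, 200)
cx, cy, rx, ry = 30.0, 20.0, 12.0, 8.0
xs, ys = cx + rx * np.cos(t), cy + ry * np.sin(t)
visible = t < 1.5 * np.pi            # last quarter of the outline is hidden
coef = fit_ellipse(xs[visible], ys[visible])
# The centroid of the visible arc lies inside the ellipse in this demo.
mask = fill_ellipse_mask(coef, (40, 60),
                         (xs[visible].mean(), ys[visible].mean()))
```

Because the visible arc still determines the conic, the resulting mask covers the full elliptical region, including the hidden quarter.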

Technical Abstract: Equipment in the poultry house (e.g., water pipes and feed buckets) can occlude areas of broiler chickens in top-view images. This can affect the analysis of chicken behaviors through vision-based machine learning imaging methods. In our previous study, we developed a machine vision-based method for monitoring broiler chicken floor distribution; here, we processed and restored the areas of broiler chickens that were occluded by equipment. To verify the performance of the developed restoration method, top-view video of broiler chickens was recorded in two research broiler houses (240 birds equally raised in 12 pens per house). First, a target detection algorithm was used to detect the target areas in each image; then, the Hough transform and color features were used to remove the occluding equipment from the detection results. In poultry images, a broiler chicken occluded by equipment is split into either two areas (TA) or one area (OA). To reconstruct the occluded area of a broiler chicken, a linear restoring method and an elliptical fitting restoring method were developed and tested. Three evaluation indices, the overlap rate (OR), false-positive rate (FPR), and false-negative rate (FNR), were used to evaluate the restoration method. From images collected on d2, d9, d16, and d23, about 100 sample images were selected for testing the proposed method, and around 80 high-quality detected broiler areas were then further evaluated for occlusion restoration. According to the results, the average values of OR, FPR, and FNR for TA were 0.8150, 0.0032, and 0.1850, respectively. For OA, the average values of OR, FPR, and FNR were 0.8788, 0.2227, and 0.1212, respectively. The study provides a new method for restoring occluded chicken areas, which can otherwise hamper the success of vision-based machine predictions.
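The three evaluation indices can be computed at the pixel level once a restored mask and a ground-truth mask are available. The definitions below are assumed reconstructions (they are consistent with OR + FNR = 1 in the reported numbers, but the paper's exact formulas may differ):

```python
import numpy as np

def restoration_metrics(restored, truth):
    # Assumed pixel-level definitions (note OR + FNR = 1, matching the
    # reported averages; not necessarily the paper's exact formulas):
    #   OR  = restored-and-true pixels / true pixels
    #   FNR = 1 - OR (true pixels the restoration missed)
    #   FPR = restored-but-false pixels / true pixels
    restored, truth = restored.astype(bool), truth.astype(bool)
    truth_area = truth.sum()
    overlap_rate = (restored & truth).sum() / truth_area
    fpr = (restored & ~truth).sum() / truth_area
    fnr = 1.0 - overlap_rate
    return overlap_rate, fpr, fnr

# Toy check: a 6x6 true region vs. a restoration shifted one pixel right.
truth = np.zeros((10, 10), dtype=bool)
truth[2:8, 2:8] = True
restored = np.zeros_like(truth)
restored[2:8, 3:9] = True
overlap_rate, fpr, fnr = restoration_metrics(restored, truth)
# overlap_rate = 30/36, fpr = 6/36, fnr = 6/36
```

With these definitions, a perfect restoration gives OR = 1 and FPR = FNR = 0, matching the intuition behind the reported TA and OA averages.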