Title: Automatic recognition of lactating sow behaviors through depth image processing

Author
item LAO, FENGDAN - China Agricultural University
item Brown-Brandl, Tami
item STINN, JOHN - Iowa Select Farms
item LIU, K - Iowa State University
item TENG, GUANGHUI - China Agricultural University
item XIN, HONGWEI - Iowa State University

Submitted to: Computers and Electronics in Agriculture
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 4/26/2016
Publication Date: 5/6/2016
Citation: Lao, F., Brown-Brandl, T.M., Stinn, J.P., Liu, K., Teng, G., Xin, H. 2016. Automatic recognition of lactating sow behaviors through depth image processing. Computers and Electronics in Agriculture. 125:56-62.

Interpretive Summary: United States pork producers currently average an approximately 15% pre-weaning mortality rate. Many of these piglets are lost due to interaction with the sow. A modification of current sow crates may help reduce the number lost; however, an evaluation of sow and piglet behavior in the current crates is needed. Behavior observations and posture changes of the sows can be used to assess farrowing crate design and heat lamp placement. A process was developed to automatically analyze 3D images of sows' postures and behaviors in farrowing crates. This process is accurate in classifying the sows' postures and behaviors. These data show that sows spend a much greater amount of time lying (84.0%) than sitting (4.1%) or standing (11.8%). The sows change position more frequently as the piglets age.

Technical Abstract: Manual observation and classification of animal behaviors is laborious, time-consuming, and of limited ability to process large amounts of data. A computer vision-based system was developed that automatically recognizes sow behaviors (lying, sitting, standing, kneeling, feeding, drinking, and shifting) in a farrowing crate. The system consisted of a low-cost 3D camera that simultaneously acquired digital and depth images and a software program that detects and identifies the sow's behaviors. This paper describes the computational algorithm for the analysis of depth images and presents its performance in recognizing the sow's behaviors as compared to manual recognition. The images were acquired at 6 s intervals on three days of a 21-day lactation period. Based on analysis of the 6 s interval images, the algorithm had the following accuracy of behavioral recognition: 99.9% for lying, 96.4% for sitting, 99.2% for standing, 78.1% for kneeling, 97.4% for feeding, 92.7% for drinking, and 63.9% for transitioning between behaviors. The lower classification accuracy for the transitioning category presumably stemmed from the insufficient frequency of image acquisition, which can be readily improved. Hence, the reported system provides an effective way to automatically process and classify the sow's behavioral images. This tool is conducive to investigating the behavioral responses and time budgets of lactating sows and their litters under different farrowing crate designs and management practices.
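To illustrate the general idea of posture classification from depth images, the sketch below shows a minimal, hypothetical approach: measuring a sow's peak height above the crate floor in a single depth frame and thresholding it into lying, sitting, or standing. This is not the paper's actual algorithm; the threshold values, function name, and camera geometry are all assumptions for illustration.

```python
import numpy as np

# Hypothetical height thresholds (metres above the floor); the paper's
# actual decision rules and parameters are not reproduced here.
LYING_MAX_HEIGHT = 0.35
SITTING_MAX_HEIGHT = 0.60

def classify_posture(depth_image, floor_depth):
    """Classify lying / sitting / standing from one depth frame.

    depth_image: 2-D array of camera-to-surface distances (m).
    floor_depth: camera-to-floor distance (m), assumed known.
    """
    # Convert depth to height above the floor; clip sensor noise below zero.
    height = np.clip(floor_depth - depth_image, 0.0, None)
    # Use the highest point on the body as the posture cue.
    peak = height.max()
    if peak <= LYING_MAX_HEIGHT:
        return "lying"
    elif peak <= SITTING_MAX_HEIGHT:
        return "sitting"
    return "standing"

# Synthetic frames: camera mounted 2.0 m above the floor.
floor = 2.0
lying_frame = np.full((4, 4), floor)
lying_frame[1:3, 1:3] = 1.75      # body surface 0.25 m above the floor

standing_frame = np.full((4, 4), floor)
standing_frame[1:3, 1:3] = 1.20   # body surface 0.80 m above the floor
```

A real system would first segment the sow from the crate fixtures and track changes between frames to detect transitional behaviors, which the abstract notes were the hardest category at the 6 s sampling interval.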