
Research Project: Multi-Dimension Phenotyping to Enhance Prediction of Performance in Swine

Location: Genetics and Animal Breeding

Title: Robust piglet nursing behavior monitoring through multi-modal fusion of computer vision and ambient floor vibration

Authors:
Dong, Yiwen - Stanford University
Song, Zihao - Stanford University
Codling, Jesse - University of Michigan
Rohrer, Gary
Miles, Jeremy
Sharma, Sudhendu Raj - University of Nebraska
Brown-Brandl, Tami - University of Nebraska
Zhang, Pei - University of Michigan
Noh, Hae Young - Stanford University

Submitted to: Computers and Electronics in Agriculture
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 9/24/2024
Publication Date: 8/11/2025
Citation: Dong, Y., Song, Z., Codling, J.R., Rohrer, G.A., Miles, J.R., Sharma, S., Brown-Brandl, T.M., Zhang, P., Noh, H. 2025. Robust piglet nursing behavior monitoring through multi-modal fusion of computer vision and ambient floor vibration. Computers and Electronics in Agriculture. 238. Article 110804. https://doi.org/10.1016/j.compag.2025.110804.
DOI: https://doi.org/10.1016/j.compag.2025.110804

Interpretive Summary: Piglet nursing is a critical activity that is associated with piglet survival and future performance. Continuous monitoring of piglet nursing behavior during the lactation period is essential to inform animal caretakers about piglet growth status and whether intervention is necessary to reduce mortality or improve animal welfare. Traditional approaches rely on manual observation, which is labor-intensive. Recent advances in computer vision and structural vibration sensing enable non-contact monitoring of piglet nursing behavior. Computer vision can capture piglet location but provides limited observation of piglet activity types, whereas structural vibration sensors can capture piglet movement but provide limited location information. In this study, we integrate these two complementary technologies for reliable nursing behavior monitoring of groups of piglets. Our method leverages state-of-the-art computer vision software (the Segment Anything Model; SAM) to extract piglet locations and combines them with piglet-induced crate vibration data to predict the nursing pattern and intensity of piglets, enabling accurate nursing monitoring under typical commercial conditions. We conducted a real-world deployment at a pig farm for continuous vision and vibration monitoring of 8 pens over 3 farrowing cycles. The developed method had a 97% accuracy in classifying 5 different nursing activities, an 8% improvement compared to predictions based on vision or vibration data alone. Producers will be able to use this model to predict nursing issues and allow timely resolution of those issues to prevent detrimental outcomes.
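
The summary describes using SAM to extract piglet locations from images before fusing them with vibration data. The publication does not spell out the extraction step here, so the following is only a minimal sketch, assuming the open-source segment_anything package, a downloaded SAM checkpoint, and illustrative area thresholds; the function name and parameters are hypothetical, not the authors' implementation.

```python
# Sketch: turn a crate image into a sparse set of piglet locations using SAM.
# Assumes the `segment_anything` package and a local checkpoint file
# ("sam_vit_b.pth" below is a placeholder path, not from the publication).
import numpy as np
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator

def piglet_centroids(image_rgb: np.ndarray,
                     checkpoint: str = "sam_vit_b.pth",
                     min_area: int = 2000,
                     max_area: int = 40000) -> np.ndarray:
    """Return an (N, 2) array of (x, y) centroids of piglet-sized masks."""
    sam = sam_model_registry["vit_b"](checkpoint=checkpoint)
    mask_generator = SamAutomaticMaskGenerator(sam)
    masks = mask_generator.generate(image_rgb)       # list of mask dicts

    centroids = []
    for m in masks:
        # Keep only piglet-sized regions; area thresholds would be tuned
        # for the camera height and image resolution of the deployment.
        if min_area <= m["area"] <= max_area:
            ys, xs = np.nonzero(m["segmentation"])   # boolean H x W mask
            centroids.append((xs.mean(), ys.mean()))
    return np.asarray(centroids, dtype=np.float32)
```

A sparse list of centroids like this is far smaller than raw video, which is consistent with the lower storage and computing requirements noted in the technical abstract below.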

Technical Abstract: Nursing is a critical activity during the lactation period of swine farming. Continuous monitoring of piglet nursing behavior during the lactation period is essential for informing animal caretakers about the health status of piglets to reduce the mortality rate, maximize lactational growth, and improve animal welfare. Traditional approaches rely on manual observation and wearable devices, which are labor-intensive and can cause discomfort to the animals. Recent advances in computer vision and ambient vibration sensing enable non-contact piglet nursing monitoring: the computer vision approach captures piglet location but provides limited observation of detailed piglet movement due to lighting, resolution, and visual obstruction constraints; the ambient vibration sensing approach captures piglet movement patterns but provides limited location information. In this study, a novel approach that integrates these two complementary sensing modalities is developed for robust piglet nursing behavior monitoring during the lactation period. This study leverages the state-of-the-art Segment Anything Model (SAM) to first convert images into sparse representations of piglet behaviors and then combines them with ambient vibration data to collaboratively infer nursing pattern and intensity. This new approach enables piglet nursing monitoring with much lower computing and storage requirements than conventional computer vision methods, making it more practical for farm settings. Real-world experiments were conducted at a pig farm for continuous vision and vibration monitoring of 8 pens over 3 farrowing cycles. The proposed method achieved 97% accuracy in classifying 5 nursing stages, representing a 3x and 3.8x error reduction compared to baseline methods using only vision or vibration data, respectively. The multi-modal fusion approach leads to an efficient, robust, and accurate piglet nursing model that can immediately inform caretakers of issues that arise during this crucial time point of a piglet’s life.
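
The abstract describes fusing the sparse vision representation with ambient vibration data to classify 5 nursing stages. The published model architecture is not reproduced here; the sketch below only illustrates one simple feature-level fusion under stated assumptions: per-window vision and vibration feature vectors, a random-forest classifier, and randomly generated placeholder data standing in for real sensor features.

```python
# Sketch of a feature-level fusion classifier for 5 nursing stages.
# The feature dimensions, window count, and classifier choice are
# illustrative assumptions, not the authors' published model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder features for 600 time windows: 8 vision-derived values
# (e.g., piglet centroid statistics near the udder) and 16 vibration-derived
# values (e.g., band energies from crate-mounted sensors) per window.
vision_feats = rng.normal(size=(600, 8))
vibration_feats = rng.normal(size=(600, 16))
labels = rng.integers(0, 5, size=600)        # 5 nursing stages (0..4)

# Feature-level fusion: concatenate the two modalities per window.
fused = np.hstack([vision_feats, vibration_feats])

X_tr, X_te, y_tr, y_te = train_test_split(
    fused, labels, test_size=0.25, random_state=0, stratify=labels)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

With real features, each modality compensates for the other's blind spot: vision supplies where piglets are relative to the sow, while vibration supplies how actively they are moving, which is the complementarity the abstract credits for the reported error reduction.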