
Research Project: Multi-Dimension Phenotyping to Enhance Prediction of Performance in Swine

Location: Genetics and Animal Breeding

Title: Classification of sow postures using convolutional neural network and depth images

Author
item RAHMAN, MD TOWFIQUR - University Of Nebraska
item BROWN-BRANDL, TAMI - University Of Nebraska
item Rohrer, Gary
item SHARMA, SUDHENDU - University Of Nebraska
item SHI, YEYIN - University Of Nebraska

Submitted to: American Society of Agricultural and Biological Engineers
Publication Type: Proceedings
Publication Acceptance Date: 5/3/2024
Publication Date: 7/28/2024
Citation: Rahman, Md T., Brown-Brandl, T.M., Rohrer, G.A., Sharma, S.R., Shi, Y. 2024. Classification of sow postures using convolutional neural network and depth images. In: Proceedings of the American Society of Agricultural and Biological Engineers International (ASABE), July 28-31, 2024, Anaheim, CA. Paper 2401533. https://doi.org/10.13031/aim.202401533.
DOI: https://doi.org/10.13031/aim.202401533

Interpretive Summary: Approximately 16% of piglets born die before weaning, with much of this loss due to a sow lying on a piglet while she is changing postures. Studying how sows change postures, specifically from standing to lying, may provide insight into how producers can reduce these preweaning losses. This study used depth video images to develop a machine-learned model that accurately determines the posture of a sow housed in a farrowing crate and monitors posture changes. Because depth images do not rely on visible light, they are well suited to use under any lighting condition. Several types of machine-learning models were applied to the data, and the best model achieved an accuracy of 95%. Prediction of the standing posture was nearly 100% accurate, whereas the kneeling posture was the most difficult to predict. With knowledge of how often a sow changes postures and the amount of time spent in each position, scientists can study these parameters and identify which traits may contribute to piglet death losses. This information will lead to the development of methods to either remove sows with poor maternal ability or design farrowing stalls that mitigate piglet crushing.

Technical Abstract: The US swine industry reports an average preweaning mortality of approximately 16%, with roughly 6% attributed to piglets overlaid by sows. Detecting postural transitions and estimating sows' time budgets for different postures provides valuable information to breeders and to the engineering design of farrowing facilities, with the ultimate goal of reducing piglet deaths. Computer vision tools can help monitor changes in animal posture accurately and efficiently. To create a more robust system and eliminate the varying lighting conditions within a day, including daytime/nighttime differences, depth cameras offer an advantage over digital cameras. In this study, a computer vision system was used for continuous depth image acquisition in several farrowing crates. Images were captured by top-down-view Kinect v2 depth sensors in the crates at 10 frames per minute for 24 h, and the captured depth images were converted into Jet colormap images. A total of 14,277 images from six sows across 18 days were randomly selected and labeled into six posture categories (standing, kneeling, sitting, sternal lying, lying on the right, and lying on the left). Convolutional Neural Network (CNN) architectures, namely ResNet-50 and Inception v3 with 'imagenet' pre-trained weights, were used for model training. The dataset was randomly split into training (75%) and validation (roughly 25%) sets. For testing, a separate dataset of 2,885 images from six different sows (across 12 days) was labeled. On the test dataset, the Inception v3 model outperformed all other models, achieving 95% accuracy in predicting sow postures. F1 scores were between 0.90 and 1.00 for all postures except kneeling (F1 = 0.81), likely because kneeling is a transition posture. These preliminary results indicate the potential of transfer learning models for this specific task and also indicate that depth images are suitable for identifying the postures of sows. The outcome of this study will lead to the identification and generation of posture data at commercial farm scale to study the behavioral differences of sows across different farm facilities, health statuses, mortality rates, and overall production parameters.