
Research Project: Identifying Genomic Solutions to Improve Efficiency of Swine Production

Location: Genetics and Animal Breeding

Title: Lameness detection in sows using few-shot approach

Author
item SHARMA, RAJ - University Of Nebraska
item BROWN-BRANDL, TAMI - University Of Nebraska
item Rohrer, Gary
item Rempel, Lea
item OSTRAND, LEXI - University Of Nebraska

Submitted to: European Conference on Precision Agriculture Proceedings
Publication Type: Proceedings
Publication Acceptance Date: 2/8/2022
Publication Date: N/A
Citation: N/A

Interpretive Summary:

Technical Abstract: Group-housed sows have a higher occurrence of lameness. Continuous monitoring of lameness is necessary to maintain high welfare standards. Some electronic sow feeders are designed with a corridor that holds potential for continuous monitoring. Overhead 3D cameras (color + depth), combined with state-of-the-art machine learning algorithms, have shown promising results in gait analysis and lameness detection in sows. However, sows do not always walk through these corridors at a regular pace or with uniform directionality, which adds complexity to model development that depends on gait-cycle analysis. Therefore, detecting lameness from a few images, instead of longer video streams, is important. In this study, few-shot classification using top-view depth images is explored for lameness detection. Few-shot classification is a machine learning technique for classifying and predicting from limited training data. This study uses a 2-way support set (lame vs. non-lame) with 1, 3, 5, and 10 shots, where each shot is a combination of multiple frames (1, 5, 10, 15, 30, or 60) of depth images derived from a 30 fps video stream. A few-shot classifier was modeled from a total of 1,077 pigs, of which 34 had some level of lameness. The results show a worst-case accuracy of 33% at 1 shot with 1 frame. On the high end, an accuracy of 93%, with both specificity (non-lame) and sensitivity (lame) of 93%, was achieved at 10 shots and 60 frames, demonstrating the effectiveness of the few-shot approach.
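
To illustrate the 2-way (lame vs. non-lame), N-shot setup described above, the following is a minimal sketch of one common few-shot formulation: a prototypical-style nearest-prototype classifier over embedded depth frames. The abstract does not specify the model architecture used in the study, so the encoder, tensor shapes, and hyperparameters below are assumptions for demonstration only.

# Illustrative sketch (not the study's actual model): 2-way few-shot
# classification of top-view depth frames by comparing a query embedding
# to per-class prototypes averaged from the support set.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DepthEncoder(nn.Module):
    """Small CNN that embeds a single-channel top-view depth image."""

    def __init__(self, embed_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def classify_query(encoder: nn.Module,
                   support: torch.Tensor,   # (n_way, n_shot, 1, H, W) depth frames
                   query: torch.Tensor      # (1, H, W) single query depth frame
                   ) -> torch.Tensor:
    """Return class probabilities for the query (index 0 = non-lame, 1 = lame)."""
    n_way, n_shot = support.shape[:2]
    support_emb = encoder(support.flatten(0, 1))                  # (n_way * n_shot, D)
    prototypes = support_emb.view(n_way, n_shot, -1).mean(dim=1)  # (n_way, D)
    query_emb = encoder(query.unsqueeze(0))                       # (1, D)
    dists = torch.cdist(query_emb, prototypes)                    # (1, n_way)
    return F.softmax(-dists, dim=-1).squeeze(0)                   # nearer prototype -> higher probability


if __name__ == "__main__":
    torch.manual_seed(0)
    encoder = DepthEncoder()
    # A 2-way, 5-shot episode on 128x128 depth frames (random data as a stand-in).
    support = torch.randn(2, 5, 1, 128, 128)
    query = torch.randn(1, 128, 128)
    probs = classify_query(encoder, support, query)
    print({"non-lame": float(probs[0]), "lame": float(probs[1])})

In this framing, increasing the number of shots (as in the study's 1, 3, 5, and 10 settings) simply adds more support examples per class before averaging into prototypes, which is consistent with the reported accuracy improving from 1 shot to 10 shots.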