
Research Project: Integrated Pest Management of Cattle Fever Ticks

Location: Cattle Fever Tick Research Unit

Title: Automatic camera trap classification using wildlife-specific transfer learning in nilgai management

Authors:
Kutugata, Matthew - University of Texas Rio Grande Valley
Goolsby, John
Baumgardt, Jeremy - Texas A&M University
Racelis, Alexis - University of Texas Rio Grande Valley

Submitted to: Journal of Fish and Wildlife Management
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 7/21/2021
Publication Date: 7/23/2021
Citation: Kutugata, M.D., Goolsby, J., Baumgardt, J.A., Racelis, A. 2021. Automatic camera trap classification using wildlife-specific transfer learning in nilgai management. Journal of Fish and Wildlife Management. https://doi.org/10.3996/JFWM-20-076.
DOI: https://doi.org/10.3996/JFWM-20-076

Interpretive Summary: Nilgai antelope are implicated in the long-range movement of the southern cattle fever tick (SCFT), Rhipicephalus microplus, especially in Cameron and Willacy Counties in South Texas. Motion-activated game cameras are used to track their movements through fence crossings. Manually processing and classifying the hundreds of thousands of images taken by the cameras can take many months to complete. The computer program developed in this research can process 5,000 images per minute and determine whether an image contains a nilgai. This program has the potential to increase the efficiency of wildlife studies, especially those involving nilgai.

Technical Abstract: 1. Camera traps provide a low-cost approach to collecting data and monitoring wildlife across large scales. Hand-labeling images at a rate that outpaces accumulation, however, becomes increasingly difficult. Various studies have shown that deep learning and convolutional neural networks (CNNs) can automatically classify camera trap images with a high degree of accuracy. Unfortunately, these studies are difficult to replicate for conservation practitioners because they require large amounts of data and advanced knowledge of computer programming. 2. We provide an example of how CNNs can drastically reduce the number of labor hours needed to hand-label camera trap images. 3. We trained a CNN to identify two classes, "nilgai" (a non-native game animal) and "not nilgai," with an overall accuracy of 97%. Our second model was trained to identify 21 classes with an overall accuracy of 89%.
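
The abstract describes transfer learning with a pretrained CNN to separate "nilgai" from "not nilgai" images. The sketch below is a minimal, hypothetical illustration of that general approach, not the authors' code: the ResNet-50 backbone, directory layout, and hyperparameters are assumptions chosen for clarity, and the paper's wildlife-specific pretraining is not reproduced here.

```python
# Minimal sketch (assumptions noted above): fine-tune a pretrained CNN as a
# binary "nilgai" / "not nilgai" classifier on folder-organized camera trap images.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Standard ImageNet-style preprocessing expected by the pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical directory layout: train/nilgai/*.jpg and train/not_nilgai/*.jpg
train_data = datasets.ImageFolder("train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

# Load a pretrained backbone and replace its final layer with a 2-class head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
for param in model.parameters():
    param.requires_grad = False          # freeze backbone; train only the new head
model.fc = nn.Linear(model.fc.in_features, 2)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

for epoch in range(5):                   # small epoch count for illustration only
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Freezing the backbone and training only the final layer is one common way transfer learning reduces the labeled-data and compute requirements that the abstract identifies as barriers for conservation practitioners; unfreezing deeper layers is another option when more labeled images are available.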