
Research Project: Monitoring of Insects Using Deep Learning and Automated Systems

Location: Horticultural Crops Disease and Pest Management Research Unit

Project Number: 2072-22000-044-037-S
Project Type: Non-Assistance Cooperative Agreement

Start Date: Sep 1, 2023
End Date: Oct 31, 2025

Objective:
Develop an automated trap to detect female spotted-wing drosophila (SWD, Drosophila suzukii) by a) performing automated segmentation of SWD anatomical features in camera-trap images and b) automatically identifying male and female SWD from image data using deep learning.

Approach:
Our first aim is to perform automated segmentation of raw images into relevant anatomical features (head, eyes, wings, abdomen, ovipositor) using deep learning. Implementing anatomical segmentation before species identification will help guide imaging and trap parameters toward the best view of the body parts relevant for classification. Anatomical segmentation will also give us a basis for selecting useful images for species identification (e.g., frames in which a particular body part is visible) and will aid in tracking individuals across multiple frames. We will segment anatomical features using DeepLabCut, a widely adopted open-source software package for animal pose estimation. DeepLabCut allows users to generate a training set of marked anatomical points on animal images and then uses transfer learning (refining an already established convolutional neural network) to generate a model that finds those body markers on unlabeled images. We will collect our training set of images in a wind tunnel, using separate releases of SWD of both sexes and of the co-occurring and visually similar D. melanogaster. We will label key anatomical features (wings, eyes, ovipositor, abdomen) for all flies in ~400 randomly selected frames and then train the pose estimator on a subset of these frames, using the ARS HPC system for this computationally demanding step. We will evaluate the performance of the trained model based on the percentage of correctly marked body parts, false positives, and false negatives.

We will similarly use a neural network classifier to perform automated identification of male and female SWD. Our first, proof-of-principle step will train a classifier to identify SWD males from the wing image set of male and female SWD and male and female D. melanogaster collected in Aim 1. We will use Google's Inception V3 model, pre-trained on a standard dataset of millions of images, as the image classifier, again relying on ARS HPC resources for training. After implementing a classifier for SWD male recognition based on wing images, we will extend our approach to identifying SWD females based on a training set that includes labeled wing, abdomen, and ovipositor images. For each classifier, we will take the standard approach of using separate image sets for training and evaluation to avoid overfitting.
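As a rough illustration of the DeepLabCut workflow described in the first aim, the sketch below strings together the package's main project calls; the video paths, project and experimenter names, and body-part list are placeholders for illustration, not the project's actual settings.

```python
# Minimal sketch of a DeepLabCut labeling/training workflow.
# Paths, project name ("swd-anatomy"), experimenter ("labeler"), and
# the body-part list are illustrative placeholders.
import deeplabcut

# Create a project from wind-tunnel videos; returns the path to config.yaml.
config = deeplabcut.create_new_project(
    "swd-anatomy", "labeler",
    ["videos/swd_release_01.mp4", "videos/dmel_release_01.mp4"],
    copy_videos=True,
)

# The body parts to mark are edited into config.yaml, e.g.:
#   bodyparts: [wing_left, wing_right, eye_left, eye_right, abdomen, ovipositor]

# Extract candidate frames, then hand-label them in the DeepLabCut GUI.
deeplabcut.extract_frames(config, mode="automatic", algo="kmeans")
deeplabcut.label_frames(config)

# Build the train/test split and fine-tune a pretrained network
# (transfer learning) on the labeled frames, e.g., on an HPC node.
deeplabcut.create_training_dataset(config)
deeplabcut.train_network(config)

# Report train/test errors on the labeled body parts.
deeplabcut.evaluate_network(config, plotting=True)
```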
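The evaluation criteria named above (percent correctly marked, false positives, false negatives) could be scored along the following lines; the array layout, confidence cutoff, and pixel tolerance here are assumptions for illustration, not the project's actual protocol.

```python
# Hedged sketch of keypoint evaluation: a predicted body part counts as
# correct if it lands within `tol` pixels of the human label; a confident
# prediction for an unlabeled (not visible) or far-off part is a false
# positive; a labeled part with no confident prediction is a false
# negative. Thresholds and array layout are illustrative assumptions.
import numpy as np

def score_keypoints(pred_xy, pred_conf, true_xy, tol=5.0, min_conf=0.6):
    """pred_xy, true_xy: (n_parts, 2) pixel coordinates for one frame;
    rows of NaN in true_xy mean the part was not labeled/visible.
    pred_conf: (n_parts,) network confidence per part."""
    labeled = ~np.isnan(true_xy).any(axis=1)
    confident = pred_conf >= min_conf

    # Distance to the label; set to infinity where no label exists.
    diffs = np.linalg.norm(pred_xy - np.nan_to_num(true_xy), axis=1)
    dist = np.where(labeled, diffs, np.inf)

    correct = labeled & confident & (dist <= tol)
    false_pos = confident & ~correct
    false_neg = labeled & ~confident
    return {
        "pct_correct": 100.0 * correct.sum() / max(labeled.sum(), 1),
        "false_positives": int(false_pos.sum()),
        "false_negatives": int(false_neg.sum()),
    }

# Example: three parts; the ovipositor is hidden (unlabeled), and the
# network misses one labeled part with low confidence.
pred = np.array([[10.0, 12.0], [50.0, 48.0], [90.0, 91.0]])
conf = np.array([0.95, 0.40, 0.88])
true = np.array([[11.0, 12.0], [52.0, 50.0], [np.nan, np.nan]])
print(score_keypoints(pred, conf, true))
# -> 50% correct, 1 false positive, 1 false negative
```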
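For the classification aim, a minimal transfer-learning sketch with Inception V3 might look like the following; the directory layout, image size handling, and hyperparameters are illustrative assumptions. The ImageNet-pretrained base is frozen and only a new binary head (SWD male vs. other) is trained, with a held-out validation set to monitor overfitting.

```python
# Hedged sketch: fine-tune a classification head on Inception V3 for
# SWD-male vs. other wing images. Directory names, batch size, and
# epoch count are illustrative assumptions.
import tensorflow as tf

IMG_SIZE = (299, 299)  # Inception V3's native input resolution

# Separate training and validation image sets, read from class-labeled
# folders (e.g., wing_images/train/swd_male/..., wing_images/train/other/...).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "wing_images/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "wing_images/val", image_size=IMG_SIZE, batch_size=32)

# Load Inception V3 pre-trained on ImageNet, without its original
# classification head, and freeze it so only the new head is trained.
base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, pooling="avg",
    input_shape=IMG_SIZE + (3,))
base.trainable = False

# New binary head: SWD male vs. everything else.
inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.inception_v3.preprocess_input(inputs)
x = base(x, training=False)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)
```

Extending this to female identification would follow the same pattern with additional labeled abdomen and ovipositor image classes in the training directories.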