Research Project: Agri-Eye: Embedding Agricultural Intelligence in UAV-based Frameworks for Monitoring Crops

Location: Crop Production Systems Research

Project Number: 6066-22000-081-007-S
Project Type: Non-Assistance Cooperative Agreement

Start Date: Jun 26, 2023
End Date: Jun 25, 2028

Objective:
Unmanned Aerial Vehicles (UAVs), also known as drones, have been used to monitor large agricultural fields, obtain images of target plant species, and spray herbicides as required. However, cleaning the acquired images and extracting features must currently be done at a central station, which keeps such monitoring frameworks from real-time sensing in spatiotemporal applications. Objective: Develop a low-cost Application-Specific Integrated Circuit (ASIC), Agri-eye; integrate it with unmanned aerial vehicles approved for federal government use; and test the system for agricultural applications, including differentiating between common cover crops grown in Mississippi, distinguishing between row crops, and separating row crops from broadleaf weeds and grasses. The evaluation metrics will be based on how well the system identifies plants with different crop morphologies.
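As a concrete illustration of this kind of evaluation, the sketch below scores per-class identification over hypothetical morphology classes using standard scikit-learn metrics; the class names, labels, and predictions are placeholders rather than project data or results.

```python
# Minimal sketch (not project code): scoring how well a classifier
# separates plant classes with different morphologies.
# Class names and label lists below are hypothetical placeholders.
from sklearn.metrics import confusion_matrix, classification_report

classes = ["cereal_rye", "crimson_clover", "cotton", "soybean",
           "broadleaf_weed", "grass_weed"]

# y_true: ground-truth labels from field scouting; y_pred: model output.
y_true = ["cotton", "soybean", "grass_weed", "cereal_rye", "broadleaf_weed", "cotton"]
y_pred = ["cotton", "soybean", "broadleaf_weed", "cereal_rye", "broadleaf_weed", "cotton"]

print(confusion_matrix(y_true, y_pred, labels=classes))
print(classification_report(y_true, y_pred, labels=classes, zero_division=0))
```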

Approach:
Feature Identification of Crops and Selected Broadleaf and Grass Weeds (Year-1): As the first step in this research, we will evaluate two types of drones that can be integrated with low-cost cameras to run embedded machine learning algorithms. The PI from UT Tyler will work with USDA ARS to evaluate Blue UAS-cleared drones, such as the IF750, and lighter drones, such as RYZE Tello Edu drones. These drones will collect field images with their onboard cameras, and the low-cost cameras and sensors will be integrated as attachments/fixtures to help evaluate trained models for real-time surveillance. We will identify unique morphological and environmental features, such as temperature, humidity, weather, color, and dimensions (height, width, and size), for each crop and selected weeds in the targeted field(s) with the help of imagery acquired by the drones. Feature identification will be achieved through year-round monitoring and image collection, combined with data from the integrated sensors, to help automate identification of crop types and weeds (a hedged sketch of this kind of per-plant feature extraction appears below).

Developing the Agri-eye for the Drones (Year-2): This phase aims to develop a working camera chip with on-device learning capacity for differentiating in-field plant species using metadata collected from the integrated sensors. The proposed ASIC will contain the sensors required to identify the features for its sensing windows. The development phase will include the ASIC design along with a user interface, such as a mobile app or web interface, that can function as an intermediary between existing drone interfaces and the proposed ASIC (a prototyping sketch of the on-device classifier appears below). This will help develop a sustainable ecosystem in which the proposed chip can be integrated into any framework.
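To illustrate the Year-1 step of deriving color and dimension features from drone imagery, the following is a minimal sketch using OpenCV. The file name, vegetation thresholds, size filter, and ground-sampling distance are all assumptions introduced for the example, not project parameters.

```python
# Minimal sketch (assumed values, not project code): extract simple color and
# size features for green objects (plants) in a single drone image.
import cv2
import numpy as np

GSD_CM_PER_PX = 0.5  # hypothetical ground sampling distance (cm per pixel)

img = cv2.imread("field_frame.jpg")            # hypothetical drone frame
assert img is not None, "image not found"
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Rough vegetation mask via a green hue range (placeholder thresholds).
mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    if cv2.contourArea(c) < 200:               # ignore small noise blobs
        continue
    x, y, w, h = cv2.boundingRect(c)
    roi = hsv[y:y + h, x:x + w]
    mean_hue = float(np.mean(roi[:, :, 0]))    # simple color feature
    print({
        "width_cm": w * GSD_CM_PER_PX,          # plant width estimate
        "height_cm": h * GSD_CM_PER_PX,         # plant extent in the image plane
        "area_cm2": cv2.contourArea(c) * GSD_CM_PER_PX ** 2,
        "mean_hue": mean_hue,
    })
```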
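For the Year-2 on-device classification goal, a common prototyping path before committing logic to silicon is to train a small image classifier and export an int8-quantized version sized for embedded targets. The sketch below shows that workflow with TensorFlow Lite as a stand-in; the architecture, input size, class count, and representative data are placeholders and do not describe the actual Agri-eye ASIC design.

```python
# Minimal sketch (assumption: TensorFlow Lite used only as a prototyping
# stand-in for the on-device classifier, not the Agri-eye ASIC itself).
import numpy as np
import tensorflow as tf

NUM_CLASSES = 6                      # hypothetical crop/weed classes

# Tiny CNN sized for embedded inference (placeholder architecture).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(96, 96, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# ... model.fit(...) on labeled field imagery would go here ...

# Post-training quantization for a microcontroller-class target.
def representative_data():
    for _ in range(100):
        yield [np.random.rand(1, 96, 96, 3).astype(np.float32)]  # placeholder samples

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
tflite_model = converter.convert()
open("agri_eye_classifier.tflite", "wb").write(tflite_model)
```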