
Research Project: Development of Enhanced Tools and Management Strategies to Support Sustainable Agricultural Systems and Water Quality

Location: Grassland Soil and Water Research Laboratory

Title: Peanut seed germination detection based on unmanned aerial systems (UAS) and deep learning

Authors:
- MA, SHENGFANG - Oklahoma State University
- ZHOU, YUTING - Oklahoma State University
- Flynn, Kyle
- AAKUR, SATHYA - Oklahoma State University

Submitted to: Meeting Abstract
Publication Type: Abstract Only
Publication Acceptance Date: 10/9/2022
Publication Date: 4/10/2023
Citation: Ma, S., Zhou, Y., Flynn, K.C., Aakur, S. 2023. Peanut seed germination detection based on unmanned aerial systems (UAS) and deep learning [abstract]. Washington, DC, October 11-13, 2022.

Interpretive Summary: Peanut is an important economic oil crop around the world. Accurate, real-time detection of peanut seed germination is essential for peanut field management. With real-time monitoring of seed germination, farmers can evaluate seed quality and promptly replant spots where germination failed, avoiding economic loss. However, traditional monitoring of peanut seedling germination is time-consuming and inefficient, especially in large fields. To reduce time lags in detecting peanut germination failures, this study combines the power of unmanned aerial systems (UAS) and object-based image analysis (OBIA) to identify early in-field peanut germination. We tested two popular object detection models to identify peanut seedlings from five-band UAS imagery obtained with a multispectral camera. The accuracy of deep learning-based peanut seedling detection is comparable to that of human monitoring, but with faster inference. In contrast to common knowledge from the remote sensing literature, the red-edge and near-infrared bands did not significantly improve detection accuracy compared with cheaper, conventional RGB imagery. These findings indicate that a remote sensing setup with a regular RGB camera can detect peanut seedlings as well as a more expensive multispectral camera system. Based on extensive experimentation, we conclude that cheaper remote sensing setups combining rapid acquisition of UAS imagery with efficient OBIA methods offer a practical way to detect early peanut germination.
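The band comparison described above can be made concrete with a minimal sketch, not taken from the original study: the file name is hypothetical and the band ordering is an assumption following the usual MicaSense RedEdge convention (1=blue, 2=green, 3=red, 4=near-infrared, 5=red-edge), which would need to be verified against the actual stacked GeoTIFF.

```python
import numpy as np
import rasterio

# Read the five bands of a stacked multispectral UAS orthomosaic.
# "uas_multispectral.tif" is a placeholder file name.
with rasterio.open("uas_multispectral.tif") as src:
    blue, green, red, nir, rededge = (
        src.read(i).astype(np.float32) for i in range(1, 6)
    )

# Two three-band composites compared in the study:
rgb_composite = np.dstack([red, green, blue])    # conventional RGB input
nre_composite = np.dstack([nir, rededge, red])   # NIR / red-edge / red input
```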

Technical Abstract: Peanut is an important economic oil crop around the world. Accurate, real-time detection of peanut seed germination is essential for peanut field management. With real-time monitoring of seed germination, farmers can evaluate seed quality and promptly replant spots where germination failed, avoiding economic loss. However, traditional monitoring of peanut seedling germination is time-consuming and inefficient, especially in large fields. To reduce time lags in detecting peanut germination failures, this study combines the power of unmanned aerial systems (UAS) and object-based image analysis (OBIA) to identify early in-field peanut germination. We tested two popular object detection models, YOLOv3 [1] and Faster R-CNN [2], to identify peanut seedlings from five-band UAS imagery obtained with a multispectral camera (MicaSense RedEdge). The mean average precision (mAP) of the trained YOLOv3 model is 0.56, while that of Faster R-CNN is 0.76, showing that Faster R-CNN is more suitable for this task. We also compared different backbones within the Faster R-CNN model: the first three backbones tested had similar accuracy, indicating that ResNet-50 performs well in detecting peanut seedlings with lower computational overhead, while deeper models improved performance only marginally at greater computation and inference cost. The average precision (AP50) of Faster R-CNN on RGB imagery is 0.88 when the Intersection over Union (IoU) threshold is set at 50% (IoU >= 0.5). The AP50 based on imagery from the NIR, red-edge, and red bands is 0.89, only 0.01 higher than the RGB result. The accuracy of deep learning-based peanut seedling detection is comparable to that of human monitoring [3], but with faster inference. In contrast to common knowledge from the remote sensing literature [4], the red-edge and near-infrared bands did not significantly improve detection accuracy compared with cheaper, conventional RGB imagery. These findings indicate that a remote sensing setup with a regular RGB camera can detect peanut seedlings as well as a more expensive multispectral camera system. Based on extensive experimentation, we conclude that cheaper remote sensing setups combining rapid acquisition of UAS imagery with efficient OBIA methods offer a practical way to detect early peanut germination.
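The abstract names Faster R-CNN with a ResNet-50 backbone but does not specify an implementation; a minimal sketch using torchvision (an assumption, not the authors' code) of setting up such a detector with a two-class head (background plus seedling) could look like this:

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Faster R-CNN with a ResNet-50 FPN backbone, pre-trained on COCO,
# with the box predictor replaced for two classes: background + seedling.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)

# Inference on one 3-band image tensor (C x H x W, float values in [0, 1]).
model.eval()
image = torch.rand(3, 512, 512)  # placeholder for a UAS image tile
with torch.no_grad():
    prediction = model([image])[0]  # dict with 'boxes', 'labels', 'scores'
```

The same three-channel input shape accommodates either the RGB composite or the NIR/red-edge/red composite compared in the study.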
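For the AP50 metric, a detection counts as a true positive when its IoU with an unmatched ground-truth box is at least 0.5; full AP50 then integrates precision over recall as the confidence threshold varies. A sketch of the matching rule alone, under the common greedy highest-score-first convention:

```python
def iou(a, b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def match_at_iou50(pred_boxes, pred_scores, gt_boxes):
    """Greedily match predictions (highest score first) to ground truth;
    a prediction is a true positive if IoU >= 0.5 with an unmatched box."""
    order = sorted(range(len(pred_boxes)), key=lambda i: -pred_scores[i])
    unmatched = set(range(len(gt_boxes)))
    tp = 0
    for i in order:
        best = max(unmatched, default=None,
                   key=lambda j: iou(pred_boxes[i], gt_boxes[j]))
        if best is not None and iou(pred_boxes[i], gt_boxes[best]) >= 0.5:
            tp += 1
            unmatched.discard(best)
    return tp, len(pred_boxes) - tp, len(unmatched)  # TP, FP, FN
```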