
Research Project: Automated Technologies for Harvesting and Quality Evaluation of Fruits and Vegetables

Location: Sugarbeet and Bean Research

Title: High-precision fruit localization using active laser-camera scanning: Robust laser line extraction for 2D-3D transformation

Authors:
Chu, Pengyu - Michigan State University
Li, Zhaojian - Michigan State University
Zhang, Kaixiang - Michigan State University
Lammers, Kyle - Michigan State University
Lu, Renfu

Submitted to: Smart Agricultural Technology
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 12/26/2023
Publication Date: 3/1/2024
Citation: Chu, P., Li, Z., Zhang, K., Lammers, K., Lu, R. 2024. High-precision fruit localization using active laser-camera scanning: Robust laser line extraction for 2D-3D transformation. Smart Agricultural Technology. 2024(7). Article 100391. https://doi.org/10.1016/j.atech.2023.100391.
DOI: https://doi.org/10.1016/j.atech.2023.100391

Interpretive Summary: Robotic harvesting is crucial to reducing the fruit industry's reliance on manual labor and enhancing harvest productivity. Fruit localization, which determines the spatial position of target fruit on trees, is a critical step in robotic harvesting. Despite significant advances in object localization in recent years, existing techniques still cannot provide the accurate fruit localization needed for robotic apple harvesting, due to variations in fruit orientation and shape, varying light conditions, and occlusions of apples by foliage and branches. In this research, we developed a novel fruit localization technique, called Active LAser-Camera Scanning (ALACS), for accurate and robust fruit localization. The ALACS hardware setup comprises a red line laser, a color (RGB) camera, a linear motion slide, and a color-depth (RGB-D) camera. A laser line extraction method was proposed for robust, high-precision feature matching on apples, and it demonstrated superior ability to extract precise laser patterns under variable lighting and occlusion conditions. In an indoor experiment, ALACS achieved average apple localization accuracies of 6.9-11.2 mm at distances ranging from 1.0 m to 1.6 m, compared with 21.5 mm for a commercial sensor. The ALACS system has been integrated into the harvesting robot developed by our team, and orchard evaluations showed that ALACS achieved a 95% fruit detachment rate versus a 71% rate for the commercial sensor. The new localization technique represents a significant contribution to the advancement of robotic fruit harvesting technology.
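
To illustrate the laser line extraction step mentioned above, the following is a minimal sketch, not the authors' published LLE method. It assumes the red line laser appears as a roughly vertical line in the RGB image and estimates a sub-pixel column position of the laser in each image row from the red-channel response; the function name and threshold are hypothetical.

```python
import numpy as np

def extract_laser_line(image_rgb, min_response=30.0):
    """Sub-pixel laser line extraction (illustrative sketch only).

    Assumes a roughly vertical red laser line, so each image row is expected
    to contain at most one line crossing. The line's column coordinate in a
    row is estimated as the intensity-weighted centroid of the red-channel
    excess; rows with a weak response (e.g., occluded by leaves) are skipped.
    Returns an array of (row, col) points along the detected line.
    """
    img = image_rgb.astype(np.float32)
    # Emphasize the red line laser against foliage and background.
    response = np.clip(img[..., 0] - 0.5 * (img[..., 1] + img[..., 2]), 0.0, None)

    points = []
    cols = np.arange(response.shape[1], dtype=np.float32)
    for row in range(response.shape[0]):
        r = response[row]
        if r.max() < min_response:
            continue  # no reliable laser signal in this row
        centroid = float((cols * r).sum() / r.sum())
        points.append((float(row), centroid))
    return np.array(points)
```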

Technical Abstract: Recent advancements in deep learning-based approaches have led to remarkable progress in fruit detection, enabling robust fruit identification in complex environments. However, much less progress has been made on fruit 3D localization, which is equally crucial for robotic harvesting. Complex fruit shapes and orientations, fruit clustering, varying lighting conditions, and occlusions by leaves and branches have greatly restricted existing sensors from achieving accurate fruit localization in the natural orchard environment. In this paper, we report on the design of a novel localization technique, called Active Laser-Camera Scanning (ALACS), to achieve accurate and robust fruit 3D localization. The ALACS hardware setup comprises a red line laser, an RGB color camera, a linear motion slide, and an external RGB-D camera. Leveraging the principles of dynamic-targeting laser triangulation, ALACS transforms the 2D laser line projected on the apple surface into precise 3D positions. To facilitate laser pattern acquisition, a Laser Line Extraction (LLE) method is proposed for robust, high-precision feature extraction on apples. Comprehensive evaluations demonstrated LLE's ability to extract precise patterns under variable lighting and occlusion conditions. In an indoor experiment, the ALACS system achieved average apple localization accuracies of 6.9-11.2 mm at distances ranging from 1.0 m to 1.6 m, compared with 21.5 mm for a commercial RealSense RGB-D camera. Orchard evaluations demonstrated that ALACS achieved a 95% fruit detachment rate versus a 71% rate with the RealSense camera. By overcoming the challenges of apple 3D localization, this research contributes to the advancement of robotic fruit harvesting technology.
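
To illustrate the laser-triangulation principle behind the 2D-to-3D transformation, the snippet below is a sketch under simplifying assumptions, not the ALACS implementation: it back-projects an extracted laser pixel to a viewing ray and intersects it with a calibrated laser light plane. In a scanning setup such as ALACS, the plane parameters would additionally depend on the linear-slide position, which is omitted here; the intrinsic matrix and plane values are hypothetical.

```python
import numpy as np

def laser_pixel_to_3d(u, v, K, plane_n, plane_d):
    """Ray-plane triangulation for one laser pixel (illustrative sketch).

    K        : 3x3 camera intrinsic matrix.
    plane_n  : unit normal of the laser light plane in the camera frame.
    plane_d  : plane offset, so points X on the plane satisfy plane_n . X + plane_d = 0.
    Returns the 3D point (in camera coordinates) where the viewing ray
    through pixel (u, v) meets the laser plane.
    """
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # viewing-ray direction
    t = -plane_d / float(plane_n @ ray)               # ray-plane intersection parameter
    return t * ray                                    # camera center is at the origin

# Hypothetical calibration values for a quick check.
K = np.array([[900.0, 0.0, 640.0],
              [0.0, 900.0, 360.0],
              [0.0, 0.0, 1.0]])
plane_n = np.array([0.995, 0.0, -0.0998])  # approximate unit normal of the laser plane
plane_d = 0.04                              # meters
print(laser_pixel_to_3d(700.0, 400.0, K, plane_n, plane_d))
```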