Location: Sugarbeet and Bean Research
2023 Annual Report
Objectives
1. Enable a new, efficient, and cost-effective robotic technology, coupled with automated infield sorting and quality tracking technologies, for commercial harvesting of apples.
2. Develop a new imaging technology, based on structured illumination integrated with artificial intelligence and advanced data analytics, with substantially improved capabilities for commercial quality inspection of fruits and vegetables.
Approach
Development of enabling technologies for automated fruit harvesting and nondestructive quality inspection during postharvest handling can provide an effective solution to labor availability and cost issues, and enhance production efficiency, product quality, and thus profitability and sustainability for the specialty crop industries. In recent years, much research has focused on robotic fruit harvesting, but progress has been slow and has fallen short of industry needs, mainly due to several key technical hurdles in robotic perception (identifying and localizing fruit), manipulation (reaching for and picking fruit), and systems integration and coordination. While machine vision technology is widely used for postharvest quality inspection of horticultural products, it still falls short of industry expectations in detecting quality-degrading defects and symptoms. This research is therefore aimed at developing a new, cost-effective robotic technology for automated harvesting of apples and a new-generation imaging technology with substantially enhanced capabilities for quality inspection of fruits and vegetables (e.g., pickling cucumber and tomato) during postharvest handling. Innovative concepts and designs, coupled with artificial intelligence, will be used in the development of the new robotic harvesting system for fruit imaging, detection, localization, and picking. The new robotic system will be integrated with the recently developed apple harvest and infield sorting machine to enable automated harvesting, sorting, grading, and tracking of apples in the orchard. Moreover, a new imaging system, based on our newly developed structured-illumination reflectance imaging technique, will be assembled to enable rapid, real-time inspection of harvested horticultural products for quality-degrading defects caused by bruising, physiological disorders, and disease infection. The new knowledge and technologies generated from this research will enable growers and packers/processors to achieve significant labor and cost savings in harvesting, enhance product marketability, and reduce postharvest product loss.
Progress Report
Objective 1: Detection of target fruit on trees is the first step in robotic harvesting of apples. Accurate and robust detection of apples on trees is challenging due to complex orchard environments, which involve varying light conditions, fruit clustering, and foliage/branch occlusions. Apples in clusters often overlap with each other, which presents a significant challenge for their individual identification. A comprehensive dataset of color images was collected for two apple varieties in commercial orchards under different natural lighting conditions (i.e., direct lighting, back lighting, and overcast) and with varying degrees of apple occlusion. A novel deep learning-based apple detection algorithm using convolutional neural networks was developed for detecting apples under clustering and occlusion conditions. The new algorithm was evaluated on the collected images and achieved 94% overall detection accuracy, outperforming 12 other state-of-the-art deep learning models. A manuscript has been submitted to a journal for publication consideration.
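For illustration, the following minimal Python sketch shows how such a detector is typically applied to an orchard image. It uses a generic pretrained Faster R-CNN from torchvision only as a stand-in, since the custom apple detection network is not reproduced here; the image path and confidence threshold are hypothetical.

    # Minimal sketch: apply a generic pretrained detector to an orchard image.
    # The report's custom CNN is not public; torchvision's Faster R-CNN is used
    # here only as a stand-in (hypothetical file path and threshold).
    import torch
    import torchvision
    from torchvision.transforms.functional import to_tensor
    from PIL import Image

    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    image = to_tensor(Image.open("orchard_image.jpg").convert("RGB"))
    with torch.no_grad():
        prediction = model([image])[0]

    # Keep detections above a confidence threshold; each box is a candidate fruit.
    keep = prediction["scores"] > 0.5
    print(f"{int(keep.sum())} candidate fruits detected")
    print(prediction["boxes"][keep])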
Accurate localization of apples on trees is a key step in robotic harvesting. Different techniques, including stereo vision, light detection and ranging (LiDAR), and time-of-flight imaging, have been used for fruit localization. However, these techniques have proven unsatisfactory when target fruits are occluded by leaves, and their performance also suffers under the varying natural lighting conditions in an orchard. Hence, a new, more robust and accurate perception system is needed for improved localization of apples. We designed and built a novel perception system based on the laser triangulation principle. Calibration procedures were developed for the perception system, which achieved a maximum localization error of less than 4 mm and an average error of less than 1 mm. A patent application is being filed for the new perception system, and a manuscript has been submitted to a journal for publication consideration.
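To illustrate the underlying principle (not the actual system design, which is covered by the pending patent application), the Python sketch below shows basic laser-triangulation geometry, assuming a hypothetical camera/laser arrangement with a known baseline, focal length, and beam angle.

    # Minimal sketch of depth recovery by laser triangulation (illustrative
    # geometry and numbers only; not the reported perception system).
    import math

    def triangulate_depth(u_px, focal_px, baseline_m, beam_angle_rad):
        """Depth (m) of a laser-lit point observed u_px pixels from the
        principal point, for a camera-laser baseline of baseline_m and a
        beam tilted beam_angle_rad toward the optical axis."""
        # The point lies on the beam X = baseline - Z*tan(angle); its image
        # coordinate is u = f*X/Z, which gives Z = f*b / (u + f*tan(angle)).
        return focal_px * baseline_m / (u_px + focal_px * math.tan(beam_angle_rad))

    # Example: 1200 px focal length, 0.10 m baseline, 5-degree beam angle,
    # laser spot seen 150 px from the principal point -> about 0.47 m depth.
    print(triangulate_depth(150.0, 1200.0, 0.10, math.radians(5.0)))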
A new version of the apple harvesting robot was developed. This version is mainly composed of the new perception system, a four-degree-of-freedom manipulator, an improved soft end-effector, and a dropping/catching module that receives and transports harvested fruit. A new perception strategy integrating the new perception system and the deep learning fruit detection algorithm achieved more than 90% apple detection accuracy and precise localization of target apples. The harvesting robot was evaluated in two apple orchards in 2022. In the orchard where trees were young and well pruned, the robot achieved an 82.4% successful harvesting rate, whereas in a second, older orchard with dense, cluttered branches and foliage, it achieved a 65.2% successful harvesting rate. The robot was also able to detach nearly 100% of the apples once they were gripped by its end-effector. Overall, the harvesting robot demonstrated superior performance compared with previous versions and other reported studies.
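As a high-level illustration of how these modules are coordinated, the Python sketch below outlines one harvest cycle; the object interfaces and method names are hypothetical and do not represent the robot's actual control software.

    # Minimal sketch of one harvest cycle coordinating perception, manipulator,
    # end-effector, and the dropping/catching module (hypothetical interfaces).
    def harvest_cycle(perception, manipulator, end_effector, catcher):
        """Attempt to pick every fruit currently visible to the perception system."""
        for fruit in perception.detect_fruit():      # deep-learning fruit detection
            xyz = perception.localize(fruit)         # laser-triangulation 3-D position
            if xyz is None:                          # skip fruit that cannot be localized
                continue
            manipulator.move_to(xyz)                 # four-degree-of-freedom approach
            if end_effector.grip():                  # soft end-effector engages the fruit
                end_effector.detach()                # detach the fruit from the branch
                catcher.receive()                    # drop/catch module transports the fruit
            manipulator.retract()                    # return for the next attempt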
Objective 2: Structured-light imaging (SLI) is an emerging technique for enhanced detection of surface and subsurface defects of horticultural and food products that may otherwise be difficult to detect using conventional imaging techniques. With the conventional SLI technique, three phase-shifted pattern images are needed, making the technique difficult to implement for online inspection applications. A study was conducted to explore a faster methodology for identifying defects in horticultural products. Pattern images were acquired, using three phase-shifted illumination patterns, from orange fruit infected with the fungus Penicillium digitatum, the most serious and devastating pathogen of orange fruit. An efficient image demodulation algorithm developed by our team was used to obtain alternating component (AC) images from one or two of the original pattern images. The AC images were then processed using two brightness adjustment and correction techniques, coupled with a machine learning algorithm and an image segmentation technique, to identify pathogen-infected areas on the orange fruit samples. Three image processing strategies were proposed and evaluated; all of them achieved identification rates greater than 95%. It was found that accurate detection of early fungal infection symptoms, which were not visible at the fruit surface, could be achieved using only one or two phase-shifted pattern images. This finding provides a basis for real-time implementation of the SLI technique for quality inspection of horticultural products. One paper based on this study has been published in a peer-reviewed journal, and a second paper is under review for journal publication.
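For reference, the conventional three-phase demodulation that yields the DC and AC images can be sketched in Python as follows; this is the standard formulation, not the team's faster one- or two-image demodulation algorithm.

    # Minimal sketch of conventional three-phase demodulation for structured-
    # illumination imaging. i1, i2, i3 are pattern images acquired with
    # sinusoidal illumination shifted by 120 degrees between acquisitions.
    import numpy as np

    def demodulate_three_phase(i1, i2, i3):
        """Return (dc, ac): the direct- and alternating-component images."""
        i1, i2, i3 = (np.asarray(x, dtype=float) for x in (i1, i2, i3))
        dc = (i1 + i2 + i3) / 3.0
        ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
            (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2
        )
        return dc, ac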
Improvements to the current imaging system have been made for real-time implementation of the SLI technique for online quality inspection of horticultural products. A new sample conveying system was built to allow samples to be imaged by the SLI system at different conveying speeds, in synchronization with the digital light projector and high-speed camera. Preliminary tests of the digital light projector and imaging system have been carried out. Final integration of the SLI system's software and hardware is being performed for real-time acquisition and processing of pattern images for detection and classification of defective horticultural products.
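The timing constraint that drives this synchronization is straightforward; the sketch below illustrates it with hypothetical conveyor speed and sample spacing values, not the settings of the actual system.

    # Minimal sketch of the trigger-timing arithmetic for imaging samples on a
    # moving conveyor (illustrative numbers only).
    def trigger_interval_s(conveyor_speed_m_s, sample_spacing_m):
        """Time between projector/camera triggers so that each sample is imaged
        once as it passes through the field of view."""
        return sample_spacing_m / conveyor_speed_m_s

    # Example: samples spaced 0.15 m apart on a belt moving at 0.3 m/s
    # -> a trigger every 0.5 s (i.e., 2 samples per second).
    print(trigger_interval_s(0.3, 0.15))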
Accomplishments
1. A new perception system for robotic harvesting of apples achieves superior performance. Robotic harvesting is urgently needed to alleviate the growing shortage and rising cost of labor for the apple and other specialty crop industries. Accurate localization of target fruits on trees is critical to robotic harvesting of apples. Currently, stereo vision and light detection and ranging (LiDAR) systems are commonly used for fruit localization, but their performance is unsatisfactory and susceptible to fruit occlusion by leaves and/or branches and to natural light variation. ARS researchers in East Lansing, Michigan, in collaboration with Michigan State University, designed a new perception system for improved localization of apples. The new system has demonstrated superior performance, with a maximum localization error of no more than 4 mm, and it still provides accurate localization information when target fruits are occluded by leaves. A new version of the robotic harvester integrated with the perception system was able to pick 82.4% of the apples on trees in a high-density orchard. A patent application is being filed for the new perception system. This new system represents a significant step toward the development of a commercially viable robotic apple harvesting technology to help the U.S. apple industry reduce its reliance on manual labor and achieve long-term sustainability and profitability.
Review Publications
Pothula, A., Zhang, Z., Lu, R. 2023. Evaluation of a new apple in-field sorting system for fruit singulation, rotation and imaging. Computers and Electronics in Agriculture. 208. Article 107789. https://doi.org/10.1016/j.compag.2023.107789.
Li, J., Lu, Y., Lu, R. 2022. Detection of early decay in navel oranges by structured-illumination reflectance imaging combined with image enhancement and segmentation. Postharvest Biology and Technology. 196. Article 112162. https://doi.org/10.1016/j.postharvbio.2022.112162.
Lu, R., Dickinson, N., Lammers, K., Zhang, K., Chu, P., Li, Z. 2022. Design and evaluation of end effectors for a vacuum-based robotic apple harvester. Journal of the ASABE. 65(5):963-974. https://doi.org/10.13031/ja.14970.