Location: Water Management and Systems Research
Title: Fusion of deep convolution and shallow features to recognize the severity of wheat fusarium head blight
Author:
ZHANG, DONGYAN - Anhui Agricultural University
WANG, DAOYONG - Anhui Agricultural University
Zhang, Huihui
ZHANG, JIAN - Huazhong Agricultural University
LIANG, DONG - Anhui Agricultural University
GU, CHUNYAN - Anhui Academy Of Agricultural Sciences
Submitted to: Frontiers in Plant Science
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 11/30/2020
Publication Date: 1/21/2021
Citation: Zhang, D., Wang, D., Zhang, H., Zhang, J., Liang, D., Gu, C. 2021. Fusion of deep convolution and shallow features to recognize the severity of wheat fusarium head blight. Frontiers in Plant Science. 11. Article 599886. https://doi.org/10.3389/fpls.2020.599886.
DOI: https://doi.org/10.3389/fpls.2020.599886
Interpretive Summary: A fast and non-destructive method for recognizing the severity of wheat Fusarium head blight (FHB) can effectively reduce fungicide use and associated costs in wheat production. This study proposed a method to recognize wheat FHB at different disease severity levels using high-resolution digital Red-Green-Blue (RGB) images. To test the robustness of the proposed method, RGB images were taken under different influencing factors, including light condition, camera shooting angle, image resolution, and crop growth period. All images were preprocessed to eliminate background noise. The AlexNet model was used to extract the deep convolution feature of wheat FHB, and the color and texture features of wheat ears were extracted as shallow features. The Relief-F algorithm was used to fuse the deep convolution feature and the shallow features into the final FHB features. A random forest algorithm was used for image classification. Results show that the recognition accuracy of the proposed fusion feature model was higher than that of models using other features under all conditions. The highest recognition accuracy of severity levels was obtained when images were taken indoors, at high resolution (12 megapixels), at a 90° shooting angle, during the grain-filling period. The proposed fusion method improved the recognition of wheat FHB severity levels using RGB imaging.
Technical Abstract: A fast and non-destructive method for recognizing the severity of wheat Fusarium head blight (FHB) can effectively reduce fungicide use and associated costs in wheat production. This study proposed a feature fusion method based on a deep convolution feature and shallow features derived from high-resolution digital RGB images of wheat FHB at different disease severity levels. To test the robustness of the proposed method, RGB images were taken under different influencing factors, including light condition, camera shooting angle, image resolution, and crop growth period. All images were preprocessed to eliminate background noise and improve recognition accuracy. AlexNet model parameters trained on the ImageNet 2012 dataset were transferred to the test dataset to extract the deep convolution feature of wheat FHB. Next, the color and texture features of wheat ears were extracted as shallow features. Then, the Relief-F algorithm was used to fuse the deep convolution feature and the shallow features into the final FHB features. Finally, a random forest was used to classify and identify the features of the different FHB severity levels. Results show that the recognition accuracy of the proposed fusion feature model was higher than that of models using other features under all conditions. The highest recognition accuracy of severity levels was obtained when images were taken indoors, at high resolution (12 megapixels), at a 90° shooting angle, during the grain-filling period.
The Relief-F algorithm assigned different weights to the features under different influencing factors, which made the fused feature model more robust and improved the recognition of wheat FHB severity levels from RGB images.
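To make the pipeline described in the abstract concrete, the sketch below shows how a deep convolution feature from an ImageNet-pretrained AlexNet can be combined with shallow color and GLCM texture features and passed to a random forest classifier. This is a minimal illustration assuming a recent PyTorch/torchvision, scikit-image, and scikit-learn; the feature dimensions, GLCM settings, and function names are assumptions, not the authors' exact configuration, and the input is assumed to be a background-removed RGB image of a wheat ear.

```python
# Minimal sketch: AlexNet deep feature + shallow color/texture features + random forest.
# Assumed stack: torch/torchvision (recent "weights" API), scikit-image, scikit-learn.
import numpy as np
import torch
from torchvision import models, transforms
from skimage.color import rgb2gray
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

# AlexNet pretrained on ImageNet; only the convolutional backbone is used (transfer learning).
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
alexnet.eval()
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def deep_feature(pil_image):
    """Flattened activations of AlexNet's convolutional layers (no fine-tuning)."""
    x = preprocess(pil_image).unsqueeze(0)
    with torch.no_grad():
        feat = alexnet.avgpool(alexnet.features(x))   # 256 x 6 x 6 feature maps
    return feat.flatten().numpy()

def shallow_feature(pil_image):
    """Color statistics plus GLCM texture descriptors of the (background-removed) ear image."""
    img = np.asarray(pil_image, dtype=np.float32) / 255.0
    color = np.concatenate([img.reshape(-1, 3).mean(axis=0), img.reshape(-1, 3).std(axis=0)])
    gray = (rgb2gray(img) * 255).astype(np.uint8)
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256, symmetric=True, normed=True)
    texture = np.array([graycoprops(glcm, p)[0, 0]
                        for p in ("contrast", "homogeneity", "energy", "correlation")])
    return np.concatenate([color, texture])

def fused_feature(pil_image):
    """Concatenate deep and shallow features before Relief-F based weighting/selection."""
    return np.concatenate([deep_feature(pil_image), shallow_feature(pil_image)])

# Severity classification on the fused features; X_train/y_train are assumed to be
# fused feature vectors and FHB severity labels prepared elsewhere.
# clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
```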
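For the feature-weighting step, a compact, self-contained version of standard multi-class Relief-F is sketched below; packaged implementations (e.g., the skrebate library) exist, but this plain-NumPy version makes the weighting explicit. It assumes features scaled to [0, 1], that each severity class has more samples than the neighbor count, and is illustrative rather than the authors' implementation.

```python
# Standard multi-class Relief-F weighting sketch (illustrative, not the authors' code).
import numpy as np

def relieff_weights(X, y, n_neighbors=10, n_samples=None, random_state=0):
    rng = np.random.default_rng(random_state)
    n, d = X.shape
    idx = rng.choice(n, size=n_samples or n, replace=False)
    classes, counts = np.unique(y, return_counts=True)
    priors = dict(zip(classes, counts / n))
    w = np.zeros(d)
    for i in idx:
        dists = np.abs(X - X[i]).sum(axis=1)      # Manhattan distance to all samples
        dists[i] = np.inf                         # exclude the sample itself
        # k nearest hits: samples with the same severity level
        hits = np.argsort(np.where(y == y[i], dists, np.inf))[:n_neighbors]
        w -= np.abs(X[hits] - X[i]).mean(axis=0) / len(idx)
        # k nearest misses from each other class, weighted by that class's prior
        for c in classes:
            if c == y[i]:
                continue
            misses = np.argsort(np.where(y == c, dists, np.inf))[:n_neighbors]
            scale = priors[c] / (1.0 - priors[y[i]])
            w += scale * np.abs(X[misses] - X[i]).mean(axis=0) / len(idx)
    return w

# Usage sketch: keep only the highest-weighted fused features before the random forest.
# weights = relieff_weights(X_fused, y_severity)
# selected = X_fused[:, np.argsort(weights)[::-1][:500]]
```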