
Title: Registration for optical multimodal remote sensing images based on FAST detection, window selection and histogram specification

Author
item ZHAO, XIAOYANG - Huazhong Agricultural University
item ZHANG, JIAN - Huazhong Agricultural University
item Yang, Chenghai
item SONG, HUAIBO - Northwest Agricultural & Forestry University
item SHI, YEYIN - Texas A&M University
item ZHOU, XINGEN - Texas A&M Agrilife
item ZHANG, DONGYAN - Anhui Agricultural University
item ZHANG, GUOZHONG - Huazhong Agricultural University

Submitted to: Remote Sensing
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 9/12/2018
Publication Date: 11/10/2018
Citation: Zhao, X., Zhang, J., Yang, C., Song, H., Shi, Y., Zhou, X., Zhang, D., Zhang, G. 2018. Registration for optical multimodal remote sensing images based on FAST detection, window selection and histogram specification. Remote Sensing. 10(5):1-21. https://doi.org/10.3390/rs10050663.
DOI: https://doi.org/10.3390/rs10050663

Interpretive Summary: It is always a challenge to align or register images captured with different cameras or imaging sensor units. This study proposed a novel registration method based on multiple image processing techniques. Images acquired from different imaging systems mounted on manned and unmanned aircraft and ground-based platforms were used to evaluate the performance of the proposed method. Image analysis showed that this method resulted in more consistent image matching and smaller misalignment errors than several existing methods. The proposed method can be effective for registering remote sensing images captured with different imaging sensors.

Technical Abstract: In recent years, digital frame cameras have been increasingly used for remote sensing applications. However, it is always a challenge to align or register images captured with different cameras or different imaging sensor units. In this research, a novel registration method was proposed. Coarse registration was first applied to approximately align the sensed and reference images. Window selection was then used to reduce the search space, and histogram specification was applied to optimize the grayscale similarity between the images. After comparison with other commonly used detectors, the FAST (Features from Accelerated Segment Test) corner detector was selected to extract the feature points. The matching point pairs were then detected between the images, the outliers were eliminated, and geometric transformation was performed. A search determined that the appropriate window size was one-tenth of the image width. Images acquired by a two-camera system, a camera with five imaging sensors, and a camera with replaceable filters, mounted on a manned aircraft, an unmanned aerial vehicle, and a ground-based platform, respectively, were used to evaluate the performance of the proposed method. The image analysis results showed that, with appropriate window selection and histogram specification, the number of correctly matched point pairs increased by 11.30 times and the correct matching rate increased by 36%, compared with the results based on FAST alone. The root mean square error (RMSE) in the x and y directions was generally within 0.5 pixels. Compared with binary robust invariant scalable keypoints (BRISK), curvature scale space (CSS), Harris, speeded-up robust features (SURF), and the commercial software packages ERDAS and ENVI, this method produced larger numbers of correct matching pairs and smaller, more consistent RMSE. Furthermore, it was not necessary to choose any tie control points manually before registration. The results from this study indicate that the proposed method can be effective for registering optical multimodal remote sensing images captured with different imaging sensors.
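The histogram specification step described in the abstract — remapping the sensed image's gray levels so its intensity distribution matches that of the reference image — can be illustrated with a short standalone sketch. This is a generic CDF-matching implementation of the technique, not the authors' code; the function name and lookup-table approach are assumptions.

```python
import numpy as np

def histogram_specification(sensed, reference):
    """Remap the gray levels of `sensed` so its histogram matches `reference`.

    Both inputs are 8-bit grayscale arrays; returns a uint8 array with the
    same shape as `sensed`. Works by matching the empirical CDFs of the two
    images through a 256-entry lookup table.
    """
    # Empirical cumulative distribution of each image over the 256 gray levels
    s_cdf = np.cumsum(np.bincount(sensed.ravel(), minlength=256)) / sensed.size
    r_cdf = np.cumsum(np.bincount(reference.ravel(), minlength=256)) / reference.size
    # For each sensed gray level, pick the reference level whose CDF first
    # reaches the same quantile; clip guards the upper boundary.
    lut = np.searchsorted(r_cdf, s_cdf).clip(0, 255).astype(np.uint8)
    return lut[sensed]

# Example: a dark "sensed" image remapped toward a bright "reference" image.
rng = np.random.default_rng(0)
sensed = rng.integers(0, 100, (64, 64)).astype(np.uint8)      # gray levels 0-99
reference = rng.integers(100, 256, (64, 64)).astype(np.uint8)  # gray levels 100-255
matched = histogram_specification(sensed, reference)
```

After remapping, the matched image occupies the reference's intensity range, so intensity-based feature detection and matching (FAST in the paper's pipeline) sees more similar gray levels in the two images.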