
Title: Operational data fusion framework for building frequent Landsat-like imagery in a cloudy region

Author
item WANG, P - Collaborator
item Gao, Feng
item MASEK, J - Goddard Space Flight Center

Submitted to: IEEE Transactions on Geoscience and Remote Sensing
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 8/6/2013
Publication Date: 4/1/2014
Publication URL: http://handle.nal.usda.gov/10113/59888
Citation: Wang, P., Gao, F.N., Masek, J. 2014. Operational data fusion framework for building frequent Landsat-like imagery in a cloudy region. IEEE Transactions on Geoscience and Remote Sensing. 52(11):7353-7365.

Interpretive Summary: Vegetation and crop condition monitoring requires remote sensing images with high resolution in both time and space. However, it is very expensive to acquire remotely sensed data with both high spatial resolution and frequent coverage. In this study, an operational data fusion framework was built to generate dense time-series Landsat-like images by integrating Moderate Resolution Imaging Spectroradiometer (MODIS) data products and Landsat imagery. The Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) has been integrated into the framework. Case studies focused on monitoring vegetation and crop conditions in the Hindu Kush-Himalayan (HKH) region. In general, spatial and temporal variations of the landscape can be identified with a high level of detail in the fused data. The operational data fusion framework provides a feasible and cost-effective solution for integrating remote sensing data from different satellite sources to monitor crop conditions at the field scale required by the National Agricultural Statistics Service and the Foreign Agricultural Service for more accurate yield assessments and predictions.

Technical Abstract: An operational data fusion framework is built to generate dense time-series Landsat-like images for a cloudy region by fusing Moderate Resolution Imaging Spectroradiometer (MODIS) data products and Landsat imagery. The Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) is integrated into the framework. Compared to earlier implementations of STARFM, several improvements have been incorporated in the operational data fusion framework. These include viewing-angle correction of the MODIS daily bidirectional reflectance, precise and automated co-registration of MODIS and Landsat pair images, and automatic selection of the Landsat and MODIS pair date. Three tests, using MODIS and Landsat data pairs from the same season of the same year, the same season of a different year, and a different season of an adjacent year, were performed over a Landsat scene using the integrated STARFM operational framework. The results show that the accuracy of the predictions depends on the consistency between the MODIS Nadir Bidirectional Reflectance Distribution Function (BRDF)-Adjusted Reflectance (NBAR) and Landsat surface reflectance on both the pair date and the prediction date. Case studies focused on monitoring vegetation condition in central India and the Hindu Kush-Himalayan (HKH) region. In general, spatial and temporal variations of the landscape can be identified with a high level of detail in the fused data. Vegetation index trajectories derived from the fused products can be associated with specific land cover types that occur in the study regions. The operational data fusion framework provides a feasible and cost-effective way to build dense time-series images at Landsat spatial resolution for cloudy regions.
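The core idea behind STARFM-style fusion can be illustrated with a minimal sketch. This is not the published STARFM algorithm (which additionally weights spectrally similar neighboring pixels within a moving search window); it shows only the simplest single-pair case, where the coarse-resolution (MODIS-like) temporal change is added to the fine-resolution (Landsat-like) image at the pair date. All array names and the `fuse_simple` function are illustrative assumptions, not part of the authors' framework.

```python
import numpy as np

def fuse_simple(fine_t0, coarse_t0, coarse_tp):
    """Minimal single-pair fusion sketch (illustrative, not full STARFM).

    Predicts fine-resolution reflectance at prediction date tp by adding
    the coarse-resolution temporal change to the fine image at pair date t0:

        fine_tp ~= fine_t0 + (coarse_tp - coarse_t0)

    Full STARFM further weights contributions from spectrally similar
    pixels in a moving window; this sketch keeps only the temporal term.
    Inputs are assumed already co-registered and resampled to a common grid.
    """
    return fine_t0 + (coarse_tp - coarse_t0)

# Toy 3x3 example with surface reflectance scaled to 0..1
fine_t0 = np.full((3, 3), 0.20)    # Landsat surface reflectance at pair date
coarse_t0 = np.full((3, 3), 0.22)  # MODIS NBAR (resampled) at pair date
coarse_tp = np.full((3, 3), 0.30)  # MODIS NBAR (resampled) at prediction date

pred = fuse_simple(fine_t0, coarse_t0, coarse_tp)
print(round(float(pred[0, 0]), 2))  # 0.28
```

Because the prediction inherits the coarse-resolution temporal signal, its accuracy depends on how consistent the MODIS NBAR and Landsat reflectance are on the pair and prediction dates, which is exactly the dependence the abstract reports.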