Title: A Flexible Spatiotemporal Method for Fusing Satellite Images with Different Resolutions

Authors
item Zhu, Xiaolin - Collaborator
item Lefsky, Michael - Collaborator
item Helmer, Eileen - Collaborator
item Liu, Desheng - Collaborator
item Chen, Jin - Collaborator
item Gao, Feng

Submitted to: Remote Sensing of Environment
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 11/13/2015
Publication Date: 11/21/2015
Publication URL: http://handle.nal.usda.gov/10113/61854
Citation: Zhu, X., Lefsky, M., Helmer, E., Liu, D., Chen, J., Gao, F.N. 2015. A flexible spatiotemporal method for fusing satellite images with different resolutions. Remote Sensing of Environment. 172:165-177.

Interpretive Summary: Vegetation and crop condition monitoring requires remote sensing images with high resolution in both time and space. However, this requirement cannot currently be satisfied by any single Earth observing sensor. Data fusion approaches have been developed to combine remote sensing images from multiple sensors. This paper presents a new flexible spatiotemporal data fusion method that integrates Moderate Resolution Imaging Spectroradiometer (MODIS) data products and Landsat imagery. The new data fusion model requires only one pair of Landsat and MODIS images. Compared with the original Spatial and Temporal Adaptive Reflectance Fusion Model, the new approach captures spatial detail more accurately in highly heterogeneous regions and also captures reflectance changes caused by land cover conversions. It provides an alternative solution for integrating remote sensing data from different satellite sources to monitor crop conditions at the field scale, as required by the National Agricultural Statistics Service and the Foreign Agricultural Service for more accurate yield assessments and predictions.

Technical Abstract: Studies of land surface dynamics in heterogeneous landscapes often require remote sensing data with high acquisition frequency and high spatial resolution. However, no single sensor meets this requirement. This study presents a new spatiotemporal data fusion method, the Flexible Spatiotemporal DAta Fusion (FSDAF) method, which generates synthesized, frequent, high spatial resolution images by blending two types of data: frequent coarse spatial resolution data, such as that from MODIS, and less frequent high spatial resolution data, such as that from Landsat. The proposed method is based on spectral unmixing analysis and a thin plate spline interpolator. Compared with existing spatiotemporal data fusion methods, it has the following strengths: (1) it needs minimal input data; (2) it is suitable for heterogeneous landscapes; and (3) it can predict both gradual change and land cover change. Simulated data and real satellite images were used to test the performance of the proposed method, and its performance was compared with two widely used methods, the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) and an unmixing-based data fusion (UBDF) method. Results show that the new method creates more accurate fused images and preserves more spatial detail than STARFM and UBDF. More importantly, it closely captures reflectance changes caused by land cover conversions, which remains a significant challenge for current spatiotemporal data fusion methods. Because the proposed method uses simple principles and needs only one fine-resolution image as input, it has the potential to increase the availability of high-resolution time-series data that can support studies of rapid land surface dynamics.
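
The abstract describes FSDAF only at a conceptual level (spectral unmixing plus a thin plate spline interpolator), so the sketch below is purely illustrative: it shows how a thin plate spline can spread coarse-pixel temporal change onto a fine-resolution grid, which is just one ingredient of the approach and omits the spectral unmixing and other steps of the published method. The function name, array shapes, scale factor, and the use of SciPy's RBFInterpolator are assumptions for illustration, not part of the paper.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def tps_downscale_change(coarse_t1, coarse_t2, fine_t1, scale):
    """Illustrative sketch only (not the authors' FSDAF implementation):
    interpolate the coarse-pixel temporal change with a thin plate spline
    and add it to the fine-resolution image observed at t1."""
    rows_c, cols_c = coarse_t1.shape
    # Temporal change observed at coarse resolution (one value per coarse pixel).
    delta_c = coarse_t2 - coarse_t1
    # Coarse-pixel centers expressed in fine-pixel coordinates.
    yy, xx = np.meshgrid(np.arange(rows_c), np.arange(cols_c), indexing="ij")
    centers = np.column_stack([(yy.ravel() + 0.5) * scale,
                               (xx.ravel() + 0.5) * scale])
    # Fit a thin plate spline to the coarse change values.
    tps = RBFInterpolator(centers, delta_c.ravel(), kernel="thin_plate_spline")
    # Evaluate the spline at every fine-pixel center.
    rows_f, cols_f = fine_t1.shape
    yf, xf = np.meshgrid(np.arange(rows_f) + 0.5, np.arange(cols_f) + 0.5,
                         indexing="ij")
    query = np.column_stack([yf.ravel(), xf.ravel()])
    delta_f = tps(query).reshape(rows_f, cols_f)
    # Naive prediction at t2: fine image at t1 plus the interpolated change.
    return fine_t1 + delta_f

# Hypothetical usage with synthetic reflectance arrays; a scale factor of 16
# stands in for the coarse-to-fine resolution ratio used in such studies.
coarse_t1 = np.random.rand(10, 10)
coarse_t2 = coarse_t1 + 0.05 * np.random.rand(10, 10)
fine_t1 = np.random.rand(160, 160)
fine_t2_pred = tps_downscale_change(coarse_t1, coarse_t2, fine_t1, scale=16)
```

In the published method the thin plate spline interpolator is used together with spectral unmixing analysis rather than on its own; the full procedure is given in the cited Remote Sensing of Environment paper.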