
Title: Fusing Landsat and MODIS data for vegetation monitoring

Authors:
Gao, Feng
Hilker, Thomas - Oregon State University
Zhu, Xiaolin - Colorado State University
Anderson, Martha
Masek, Jeffery - National Aeronautics and Space Administration (NASA)
Wang, Peijuan - Collaborator
Yang, Yun

Submitted to: IEEE Geoscience and Remote Sensing Magazine
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 5/7/2015
Publication Date: 10/5/2015
Publication URL: http://handle.nal.usda.gov/10113/61892
Citation: Gao, F.N., Hilker, T., Zhu, X., Anderson, M.C., Masek, J., Wang, P., Yang, Y. 2015. Fusing Landsat and MODIS data for vegetation monitoring. IEEE Geoscience and Remote Sensing Magazine. 3:47-60.

Interpretive Summary: Accurate spatiotemporal information about crop progress during the growing season is fundamental to crop yield estimation. Crop progress monitoring at the field scale requires remote sensing data with high resolution in both time and space, a requirement that no single sensor can satisfy at present. Data fusion approaches have been developed to combine remote sensing imagery from the Landsat and MODIS instruments. This paper summarizes three data fusion approaches (STARFM, STAARCH and ESTARFM) that have been widely used in various applications. Applications to forest disturbance mapping, crop condition monitoring and daily evapotranspiration mapping are demonstrated. Data fusion provides an effective means of generating remote sensing data with high spatial and temporal resolution for monitoring crop growth and water use, which is required by the National Agricultural Statistics Service and the Foreign Agricultural Service for crop yield estimation.

Technical Abstract: Crop condition and natural vegetation monitoring require remote sensing imagery with high resolution in both time and space. However, no single sensor currently satisfies these requirements. Remote sensing instruments differ in sensor characteristics, spatial resolution and acquisition frequency. For example, the Moderate Resolution Imaging Spectroradiometer (MODIS) provides daily global observations at 250 m to 1 km spatial resolution, which are well suited to capturing daily surface and atmospheric changes; however, MODIS imagery is too coarse to resolve surface details for many applications. The Landsat series of satellites provides medium spatial resolution (30 m) imagery that captures surface details well, but its long revisit cycle (16 days) limits its use in describing daily surface changes. Data fusion offers an alternative way to combine data from multiple sensors so that the fused results provide more value than either sensor alone. In this paper, we present the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) and two extended data fusion models (STAARCH and ESTARFM) that have been used to fuse MODIS and Landsat data. The fused MODIS-Landsat results inherit the spatial detail of Landsat (30 m) and the temporal frequency of MODIS (daily). The theoretical basis of the approaches is described, and recent applications are presented and discussed. Although these data fusion approaches can produce fused MODIS-Landsat results, they rely on the availability of actual satellite images and the quality of the remote sensing products. They are useful for bridging the gaps between medium-resolution acquisitions but are not designed to replace actual satellite missions.
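As a rough illustration of the core STARFM idea described in the abstract (adding the MODIS-observed change between the base date and the prediction date to the base-date Landsat reflectance, weighted by neighboring pixels), the following Python sketch predicts a Landsat-like reflectance value for one pixel. The array names, window size, and simplified weighting are assumptions for illustration only and are not the authors' implementation.

import numpy as np

def starfm_pixel(landsat_t0, modis_t0, modis_tp, row, col, half_win=25):
    """Sketch of a STARFM-like prediction for one pixel on the prediction date.

    All inputs are assumed to be co-registered surface reflectance arrays on
    the Landsat (30 m) grid: landsat_t0 and modis_t0 from the base date,
    modis_tp from the prediction date (MODIS resampled to 30 m).
    """
    r0, r1 = max(0, row - half_win), row + half_win + 1
    c0, c1 = max(0, col - half_win), col + half_win + 1

    L0 = landsat_t0[r0:r1, c0:c1]
    M0 = modis_t0[r0:r1, c0:c1]
    Mp = modis_tp[r0:r1, c0:c1]

    # Spectral difference (Landsat vs. MODIS at t0) and temporal difference
    # (MODIS at t0 vs. tp): small differences indicate reliable neighbors.
    spec_diff = np.abs(L0 - M0)
    temp_diff = np.abs(Mp - M0)

    # Spatial distance from the central pixel, scaled to be >= 1.
    rr, cc = np.mgrid[r0:r1, c0:c1]
    dist = 1.0 + np.hypot(rr - row, cc - col) / half_win

    # Combined inverse weight: nearby, spectrally and temporally similar
    # pixels contribute more (a simplified form of the STARFM weighting).
    weight = 1.0 / ((spec_diff + 1e-6) * (temp_diff + 1e-6) * dist)
    weight /= weight.sum()

    # Core relation: base-date Landsat plus the MODIS change from t0 to tp.
    return np.sum(weight * (L0 + Mp - M0))

In practice this per-pixel step would be applied band by band across the whole scene, with additional screening of similar pixels and cloud/quality masking as described in the STARFM, STAARCH and ESTARFM papers.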