
Title: Spatial and temporal remote sensing data fusion for vegetation monitoring

Author
item Gao, Feng

Submitted to: Meeting Abstract
Publication Type: Abstract Only
Publication Acceptance Date: 10/8/2017
Publication Date: N/A
Citation: N/A

Interpretive Summary:

Technical Abstract: The suite of available remote sensing instruments varies widely in sensor characteristics, spatial resolution, and acquisition frequency. For example, the Moderate Resolution Imaging Spectroradiometer (MODIS) provides daily global observations at 250 m to 1 km spatial resolution. While imagery from coarse-resolution sensors such as MODIS is typically superior to finer-resolution data in revisit frequency, it lacks the spatial detail needed to capture surface features for many applications. The Landsat satellite series provides medium-resolution (30 m) imagery that is well suited to capturing surface detail, but its long revisit cycle (16 days) has limited its use in describing daily surface changes. Data fusion approaches combine observations from multiple sensors so that the fused results provide higher value than any individual sensor can alone. In this presentation, I will review data fusion models built on the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) concept. Applications in vegetation phenology mapping and crop water use monitoring at 30-m spatial resolution will be presented, and limitations of satellite data fusion will be discussed.
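The core fusion idea described above can be illustrated with a minimal sketch. This is not the full STARFM algorithm (which weights spectrally and spatially similar neighboring pixels); it is a simplified assumption-laden toy: predict the fine-resolution (e.g., Landsat-like) image at a prediction date by adding the coarse-resolution (e.g., MODIS-like) temporal change to the fine image from a base date, with a simple moving-average smoothing standing in for STARFM's neighbor weighting. The function name and window parameter are illustrative, and all arrays are assumed co-registered and resampled to a common grid.

```python
import numpy as np

def fuse_starfm_like(fine_t0, coarse_t0, coarse_tp, window=3):
    """Simplified STARFM-style prediction (illustrative sketch only).

    fine_t0   : fine-resolution reflectance at base date t0
    coarse_t0 : coarse-resolution reflectance at t0 (resampled to fine grid)
    coarse_tp : coarse-resolution reflectance at prediction date tp
    Returns a predicted fine-resolution image at tp.
    """
    # Coarse-scale temporal change, assumed representative at fine scale.
    delta = coarse_tp - coarse_t0

    # Moving-average smoothing of the change image; a stand-in for
    # STARFM's spectral/temporal/spatial weighting of similar pixels.
    pad = window // 2
    padded = np.pad(delta, pad, mode="edge")
    smoothed = np.empty(delta.shape, dtype=float)
    rows, cols = delta.shape
    for i in range(rows):
        for j in range(cols):
            smoothed[i, j] = padded[i:i + window, j:j + window].mean()

    # Apply the smoothed change to the fine-resolution base image.
    return fine_t0 + smoothed
```

For spatially uniform inputs this reduces to adding the coarse temporal change directly: a fine image of 0.20 reflectance with coarse images of 0.25 (t0) and 0.35 (tp) yields a prediction of 0.30 everywhere.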