
Title: Quantification and Reduction of Erroneous Differences Between Images in Remote Sensing

Authors:
Anderson, Gerald
Peleg, Kalman - Technion Israel Inst./Tech.

Submitted to: Environmental and Ecology
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 5/19/2007
Publication Date: 5/19/2007
Citation: Anderson, G.L., Peleg, K. 2007. Quantification and reduction of erroneous differences between images in remote sensing. Environmental and Ecology. 14(2):113-127.

Interpretive Summary: The reliability of image data, and the difficulty of accurately comparing images acquired at different times or from different sensors, is a generic problem in remote sensing. Electronic sensing equipment is subject to several sources of error, all of which make it difficult to analyze the resulting images properly. This paper describes how some of these inherent errors can be corrected or accounted for. Advanced mathematical methods, not typically used in remote sensing, are applied to the data to reduce error and to select the best image bands for analysis. It is important that those developing statistically sound relationships between remotely sensed imagery and other data sources understand these problems and their solutions, to avoid wasting time or developing relationships that are unstable or that lead to faulty conclusions.

Technical Abstract: The reliability of image data, and the difficulty of accurately comparing images acquired at different times or from different sensors, is a generic problem in remote sensing. Measurement repeatability errors occur frequently and can substantially reduce a system's ability to reliably quantify real spectral and spatial change in a target. This paper outlines methodologies for quantifying and reducing erroneous differences between monochrome, multi-spectral, and hyperspectral images. Specifically, we discuss the Pixel Block Transform (PBT), the Spectral Averaging Transform (SAT), and the Wavelet Transform (WAVEL), and briefly discuss sensor fusion. Results indicate that the PBT is a powerful tool for reducing cross-noise and repeatability error, applicable to monochrome, multi-spectral, and hyperspectral images. The SAT is as powerful as the PBT in reducing error, but it is suitable only for hyperspectral imagery. WAVEL can reduce some of the finer-scale noise, but it is not as powerful in reducing cross-noise as the PBT or SAT, and it requires some trial and error to select an appropriate wavelet function. It is important that those developing statistically sound relationships between remotely sensed imagery and other data sources understand these problems and their solutions, to avoid wasting time or developing relationships that are statistically insignificant, unstable, or lead to faulty conclusions.
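The abstract names the transforms without giving implementations. As a minimal NumPy sketch, under our own assumptions (the function names and signatures are ours, not the authors', and the paper's actual formulations may differ), the PBT can be read as non-overlapping spatial block averaging and the SAT as averaging of adjacent spectral bands:

```python
import numpy as np

def pixel_block_transform(image, block=2):
    """Hypothetical PBT sketch: average non-overlapping block x block pixel tiles.

    Averaging the pixels in each tile suppresses per-pixel repeatability
    noise at the cost of spatial resolution. Trailing rows/columns that do
    not fill a complete block are trimmed. Works for 2-D (monochrome) or
    3-D (rows x cols x bands) arrays; bands are averaged independently.
    """
    h = image.shape[0] - image.shape[0] % block
    w = image.shape[1] - image.shape[1] % block
    img = image[:h, :w]
    return img.reshape(h // block, block, w // block, block,
                       *img.shape[2:]).mean(axis=(1, 3))

def spectral_averaging_transform(cube, window=3):
    """Hypothetical SAT sketch: average adjacent bands of a hyperspectral cube.

    Expects a rows x cols x bands array; trailing bands that do not fill a
    complete window are trimmed. Only meaningful for hyperspectral data,
    where neighboring bands are highly correlated.
    """
    b = cube.shape[2] - cube.shape[2] % window
    return cube[:, :, :b].reshape(cube.shape[0], cube.shape[1],
                                  b // window, window).mean(axis=3)
```

The noise-reduction rationale in both cases is the same: averaging n measurements with independent zero-mean noise reduces the noise variance by a factor of n, which is why the SAT can match the PBT's error reduction only when many correlated spectral bands are available to average.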