
Title: Incorporation of texture, intensity, hue, and saturation for rangeland monitoring with unmanned aircraft imagery

Author
item LALIBERTE, ANDREA - NEW MEXICO STATE UNIV
item Rango, Albert

Submitted to: Meeting Proceedings
Publication Type: Proceedings
Publication Acceptance Date: 3/25/2008
Publication Date: 8/6/2008
Citation: Laliberte, A., Rango, A. 2008. Incorporation of texture, intensity, hue, and saturation for rangeland monitoring with unmanned aircraft imagery. Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences: GEOBIA 2008 - Pixels, Objects, Intelligence. August 6-7, 2008, Calgary, AB. ISPRS, Vol. XXXVII-4/C1.


Technical Abstract: Aerial photography acquired with unmanned aerial vehicles (UAVs) has great potential for incorporation into rangeland health monitoring protocols, and object-based image analysis is well suited for this hyperspatial imagery. A major drawback, however, is the low spectral resolution of the imagery, because most lightweight cameras suitable for UAVs acquire imagery only in the red, green, and blue (RGB) bands. Potential solutions include the incorporation of intensity, hue, and saturation (IHS) bands and/or texture measures. The use of texture had improved classification results in a related study, but we wanted to investigate whether IHS would yield similar results, because texture calculations in object-based analysis are computationally intensive. Our objectives were to determine the optimal analysis scale and the optimal band combination: RGB, RGB+texture, RGB+IHS, or RGB+IHS+texture. Eight aerial photos were mosaicked and segmented at 15 consecutively coarser scale parameters (from 10 to 80) using the object-based image analysis program Definiens Professional. Class separation distances, classification accuracies, and the Kappa Index of Agreement were used to assess the classifications. The highest classification accuracies were achieved at segmentation scales between 50 and 70 and were consistently in the high 90% range, regardless of which bands were included. The inclusion of texture measures increased classification accuracies at nearly all segmentation scales, but the use of RGB+IHS alone yielded comparable accuracies at most scales with considerably less computation time. The techniques used in this study offer an objective approach for determining segmentation scale and for selecting bands useful for rangeland mapping with hyperspatial, low spectral resolution imagery.
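The core idea of the RGB+IHS approach, deriving three additional bands from the three camera bands at low computational cost, can be sketched as follows. This is an illustrative stand-in, not the authors' implementation: the abstract does not specify the exact IHS transform, so this sketch uses the standard HSV color model from Python's `colorsys` module as one common choice, with the value channel standing in for intensity.

```python
import colorsys

def rgb_to_ihs(r, g, b):
    """Derive intensity, hue, and saturation bands from 8-bit RGB values.

    Hypothetical sketch: uses the HSV model (value as intensity) as a
    stand-in for the unspecified IHS transform used in the study.
    """
    # Normalize 8-bit values to [0, 1] before conversion
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Return in (intensity, hue, saturation) order to match "IHS"
    return v, h, s

# Example: deriving IHS values for one reddish-brown soil-like pixel
i, h, s = rgb_to_ihs(150, 100, 60)
```

In practice the transform would be applied per pixel across the whole mosaic, producing three extra raster bands that are handed to the segmentation and classification steps alongside the original RGB bands; unlike object-level texture measures, this requires only a single pass over the pixels.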