
Title: Unmanned aerial vehicles for hyperspatial remote sensing of rangelands: object-based classification and field validation

Authors:
Laliberte, Andrea - New Mexico State University
Rango, Albert
Winters, Craig - New Mexico State University
Slaughter, Amalia (Amy)
Maxwell, Connie

Submitted to: American Society for Photogrammetry and Remote Sensing Proceedings
Publication Type: Abstract Only
Publication Acceptance Date: 12/11/2009
Publication Date: 4/30/2010
Citation: Laliberte, A.S., Rango, A., Winters, C., Slaughter, A.L., Maxwell, C.J. 2010. Unmanned aerial vehicles for hyperspatial remote sensing of rangelands: object-based classification and field validation [abstract]. American Society for Photogrammetry and Remote Sensing (ASPRS) Annual Meeting, April 26-30, 2010, San Diego, CA.

Interpretive Summary:

Technical Abstract: UAVs are ideally suited for monitoring and assessing vegetation conditions on remote rangelands because of their relatively low operating costs, rapid deployment, and greater flexibility compared with piloted aircraft. The likelihood of obtaining FAA permission to operate a UAV is also greater in remote, sparsely populated rangeland areas. Researchers at the Jornada Experimental Range have been investigating rangeland remote sensing applications with UAVs for several years, acquire UAV imagery on a regular basis, and have developed reliable techniques for deriving hyperspatial UAV image mosaics. However, several challenges remain. While the spatial resolution of the imagery is high (6-8 cm ground sample distance), its spectral and radiometric resolution is low because of the low-cost digital camera flown on the UAV. Field validation is also complicated by the relatively large positional error of conventional GPS-collected field data in relation to the very high spatial resolution of the imagery. In this study, we used object-based image analysis (OBIA) techniques and tested field data collection methods suited to object-based classification of hyperspatial UAV imagery. The UAV image mosaic was segmented and classified into broad land cover classes, and the segmentation outlines were used in the field as a base for collecting training samples to refine the classification to the species level. The training samples served as input for a decision tree analysis that incorporated spectral, spatial, contextual, and texture features. Collecting the field data digitally allowed easy transfer into the image analysis software. Results indicate that geometric and classification accuracies are suitable for routine rangeland monitoring purposes and that unmanned aircraft are a viable tool for obtaining very high resolution vegetation maps. The field data acquisition methods, in conjunction with OBIA, are also applicable to very high resolution digital aerial imagery acquired with piloted aircraft.
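
The workflow described above (segment the mosaic into image objects, derive spectral, spatial, and texture features per object, then classify the objects with a decision tree trained on field samples) can be sketched as follows. This is a minimal illustrative example only; the library choices (scikit-image, scikit-learn), the SLIC segmentation, the particular features, and all file and variable names are assumptions for illustration and do not represent the authors' actual OBIA software, features, or parameters.

import numpy as np
from skimage.segmentation import slic
from skimage.measure import regionprops
from sklearn.tree import DecisionTreeClassifier

def segment_objects(rgb, n_segments=500):
    # Segment the mosaic into image objects (superpixels); SLIC stands in
    # here for the multi-resolution segmentation used in commercial OBIA software.
    return slic(rgb, n_segments=n_segments, compactness=10, start_label=1)

def object_features(rgb, labels):
    # Per-object mean band values plus simple shape features.
    # Texture (e.g., gray-level co-occurrence statistics) and contextual
    # relations to neighboring objects would be appended analogously.
    feats = []
    for region in regionprops(labels):
        mask = labels == region.label
        mean_bands = rgb[mask].mean(axis=0)          # spectral: mean R, G, B per object
        shape = [region.area, region.eccentricity]   # spatial/shape cues
        feats.append(np.concatenate([mean_bands, shape]))
    return np.array(feats)

# Usage sketch (file name and training labels are hypothetical):
# rgb = imread("uav_mosaic.tif")              # hyperspatial UAV image mosaic
# labels = segment_objects(rgb)
# X = object_features(rgb, labels)
# y_train = ...                               # class labels for training objects from field samples
# clf = DecisionTreeClassifier(max_depth=8).fit(X[train_idx], y_train)
# predicted = clf.predict(X)                  # species-level class per image object

The key design point mirrored here is that classification operates on image objects rather than individual pixels, so field-collected training samples can be tied to segment outlines rather than to GPS points whose positional error exceeds the pixel size.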