UAS for Remote Sensing
Unmanned aircraft systems (UAS), or unmanned aerial vehicles (UAVs), are widely used for military purposes, but civilian applications are a rapidly growing area for UAS due to their greater availability and the miniaturization of sensors, GPS receivers, and associated hardware. Unmanned aircraft have been used to obtain remotely sensed imagery for fire and natural disaster monitoring, wildlife observations, and vegetation measurements in vineyards, crops, forests, and rangelands. The ability to deploy a UAS quickly, repeatedly, and at low altitudes allows the acquisition of very high resolution remotely sensed data that can capture fine-scale changes in landscape processes and dynamics.
At the Jornada Experimental Range (JER), ongoing research is aimed at determining the utility of UAS for rangeland mapping and monitoring. Over the last 5 years, we have developed a workflow for acquisition, processing, and analysis of fine detail UAS imagery, and for relating remotely sensed information to ground-based measurements. UAS imagery acquired at the JER is currently being used to adapt field sampling approaches to very high resolution imagery, derive parameters for hydrologic models, support repetitive data analysis for a phenology pilot study, assist in evaluation of disturbance experiments, and contribute high resolution information for archeological studies.
In terms of remote sensing, UAS are most similar to piloted aircraft used for acquisition of digital aerial imagery, and there are advantages and limitations to using UAS for these purposes. UAS show great promise for remote sensing of rangelands due to their low flying heights, the resulting high-resolution imagery, and lower per-image acquisition costs compared to piloted aircraft. Current limitations include the initial acquisition cost of the UAS, crew training requirements, the limited availability of high-quality lightweight sensors, and FAA regulations for operating a UAS in the national airspace.
This website describes our UAS remote sensing project, including operation of our unmanned aircraft in restricted and national airspace, processing of imagery into orthorectified image mosaics, and creation of geospatial products such as vegetation classifications and terrain models.
Unmanned Aircraft & Sensors
We operate two MLB BAT 3 unmanned aircraft. The BAT system consists of a fully autonomous GPS-guided UAS with a 6-ft wingspan, a catapult launcher, a ground station with mission planning and flight software, and a telemetry system. The BAT is launched off a catapult and is capable of manual or autonomous landing. Videos of launch and landing are available.
One of the BATs is configured for true color image acquisition; the other acquires multispectral and true color imagery simultaneously. Sensors include a color video camera with in-flight optical zoom and a live video downlink to the ground station, a Canon SD 900 10-megapixel digital camera, and, for multispectral image acquisition, a Tetracam MCA multispectral camera, which acquires imagery in six narrow bands spanning the blue to near-infrared range.
True color image acquisition
Multispectral + true color image acquisition
We generally fly at an altitude of 700 ft above ground, which results in a nominal ground resolved distance of 6 cm for the Canon imagery and 13 cm for the multispectral imagery. We acquire images with a forward lap of 75% and sidelap of 40% for photogrammetric processing into seamless image mosaics. The onboard computer records a timestamp, GPS location, elevation, pitch, roll, and heading for each image.
True color image, footprint 213 m x 160 m
Multispectral image, footprint 182 m x 145 m
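The overlap figures translate directly into exposure and flight-line spacing. A minimal sketch using the true-color footprint above (function names are ours, and we assume the 213 m dimension lies along the flight direction):

```python
def exposure_spacing(footprint_along_m, forward_lap):
    """Distance flown between successive exposures for a given forward lap."""
    return footprint_along_m * (1.0 - forward_lap)

def flightline_spacing(footprint_across_m, side_lap):
    """Distance between adjacent flight lines for a given sidelap."""
    return footprint_across_m * (1.0 - side_lap)

# True-color footprint: 213 m (assumed along track) x 160 m (across track)
print(exposure_spacing(213, 0.75))    # ~53 m between exposures
print(flightline_spacing(160, 0.40))  # ~96 m between flight lines
```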
Challenges associated with orthorectification of this type of imagery include the relatively small image footprints, considerable image distortion due to the use of a low-cost digital camera, difficulty in locating ground control points and in automatic tie point generation, and relatively large errors in the exterior orientation parameters (X, Y, Z, roll, pitch, heading). We have developed a semi-automated orthorectification approach suitable for handling large numbers of individual small-footprint UAS images without the need for manual tie points and ground control points. Mosaics created with this process have a positional accuracy of approximately 1-2 m for mosaics consisting of 100-300 images. To date, we have acquired over 30,000 UAS images, which have been processed into 100 image mosaics.
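Under a flat-terrain, nadir-view simplification (i.e., ignoring the roll and pitch errors discussed above), the recorded position, altitude, and heading determine an image's ground footprint. A hypothetical sketch (function name and field-of-view parameters are ours, not part of the actual processing chain):

```python
import math

def footprint_corners(x, y, agl_m, heading_deg, fov_along_deg, fov_across_deg):
    """Project the four ground corners of a nadir image onto flat terrain.

    Assumes zero roll and pitch; x, y are the camera position in a projected
    coordinate system (e.g., UTM metres); heading is clockwise from grid north.
    """
    half_along = agl_m * math.tan(math.radians(fov_along_deg) / 2)
    half_across = agl_m * math.tan(math.radians(fov_across_deg) / 2)
    h = math.radians(heading_deg)
    ax, ay = math.sin(h), math.cos(h)    # unit vector along track (E, N)
    cx, cy = math.cos(h), -math.sin(h)   # unit vector across track, to the right
    corners = []
    for sa, sc in ((1, 1), (1, -1), (-1, -1), (-1, 1)):
        corners.append((x + sa * half_along * ax + sc * half_across * cx,
                        y + sa * half_along * ay + sc * half_across * cy))
    return corners
```

In practice the neglected roll and pitch shift these footprints substantially, which is one reason automated tie points are needed on top of the recorded exterior orientation.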
338 image mosaic, JER
Image mosaic over JER headquarters
257 image mosaic over Ace Tank, JER
Mosaic of Yarbrough dam
Three image mosaics acquired in southwestern Idaho
Multispectral image mosaic, Ace Tank
We use object-based image analysis (OBIA) (eCognition software) for image classifications; OBIA is more suitable than pixel-based classification for high and very high resolution imagery. The classification scheme is hierarchical and employs a rule-based masking approach: an image is first classified into easy-to-define classes (bare/vegetated, shadow/non-shadow) using rules, followed by further classification to the species level with a nearest neighbor classifier. Decision trees can be used to select suitable features for classification. We use texture features and hue, saturation, and intensity to compensate for the low spectral and radiometric resolution of the Canon imagery. Future classifications of the multispectral imagery will take advantage of its greater spectral and radiometric resolution.
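The masking order can be illustrated at the pixel level with a toy rule set. The thresholds and input layers here are illustrative only, and the actual eCognition rule sets operate on image segments rather than pixels:

```python
import numpy as np

def hierarchical_classify(intensity, greenness):
    """Toy rule-based masking in the spirit of an OBIA hierarchy.

    intensity, greenness: 2-D float arrays scaled to [0, 1]. Thresholds are
    made up for illustration, not taken from the actual rule sets.
    """
    labels = np.full(intensity.shape, "unclassified", dtype=object)
    shadow = intensity < 0.2                      # rule 1: mask shadow first
    labels[shadow] = "shadow"
    vegetated = (~shadow) & (greenness > 0.4)     # rule 2: vegetated vs. bare
    labels[vegetated] = "vegetated"
    labels[(~shadow) & (~vegetated)] = "bare"
    # A real workflow would then split "vegetated" to the species level with a
    # nearest neighbor classifier on segment features (texture, hue, saturation).
    return labels
```

Classifying the easy classes first means each subsequent rule only has to discriminate within an already-masked subset, which is the point of the hierarchical design.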
In a study conducted in southwestern Idaho, we used OBIA to classify UAS imagery of 50 m x 50 m plots measured concurrently on the ground using standard rangeland monitoring procedures.
UAS images and classifications of six 50 m x 50 m plots
Correlations between image- and ground-based estimates of percent cover for bare ground, shrubs, and grass/forbs resulted in r-squared values ranging from 0.86 to 0.98. Time estimates indicated a greater efficiency for the image-based method. Classification accuracies for rangeland vegetation maps commonly range from 78% to 92%, depending on the number of classes.
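The image-to-ground comparison amounts to correlating image-based cover estimates with ground-based ones and reporting the coefficient of determination. A sketch with made-up cover values (not the study data):

```python
import numpy as np

def r_squared(ground, image):
    """Coefficient of determination (r-squared) between ground- and
    image-based percent-cover estimates; for a simple linear fit this is
    the squared Pearson correlation."""
    ground = np.asarray(ground, dtype=float)
    image = np.asarray(image, dtype=float)
    r = np.corrcoef(ground, image)[0, 1]
    return r * r

# Hypothetical percent-cover pairs for one cover class
ground_cover = [12.0, 25.0, 40.0, 55.0]
image_cover = [14.0, 23.0, 42.0, 53.0]
print(round(r_squared(ground_cover, image_cover), 3))
```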
UAS image mosaic (center), enlarged area of mosaic (left), classification of image (right)
Due to the relatively large file size of the image mosaics, we develop the classification rule sets (process trees) on small areas and then transfer the rule base to the larger mosaic.
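One generic way to keep memory bounded when applying a rule base developed on a small area to a full mosaic is windowed (tiled) processing. This sketch is ours, not the eCognition workflow itself:

```python
import numpy as np

def iter_tiles(rows, cols, tile):
    """Yield (row_slice, col_slice) windows covering a rows x cols raster."""
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            yield slice(r, min(r + tile, rows)), slice(c, min(c + tile, cols))

# Apply a placeholder rule (stand-in for the transferred rule base) tile by
# tile, so only one window of the mosaic is in memory at a time.
mosaic = np.random.rand(10, 10)
out = np.empty_like(mosaic)
for rs, cs in iter_tiles(*mosaic.shape, tile=4):
    out[rs, cs] = mosaic[rs, cs] > 0.5
```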