
Research Project: Aerial Application Technology for Sustainable Crop Production

Location: Aerial Application Technology Research

Title: Hyperspectral image classification for mapping agricultural tillage practices

Authors:
Qiong Ran - Beijing University of Chemical Technology
Wei Li - Beijing University of Chemical Technology
Qian Du - Mississippi State University
Chenghai Yang

Submitted to: Journal of Applied Remote Sensing (JARS)
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 3/28/2015
Publication Date: 4/10/2015
Citation: Ran, Q., Li, W., Du, Q., Yang, C. 2015. Hyperspectral image classification for mapping agricultural tillage practices. Journal of Applied Remote Sensing (JARS). 9:097298.

Interpretive Summary: Current ground-based methods for mapping crop tillage practices are labor-intensive, time-consuming, and difficult to use for generating large-scale survey data. Remote sensing technology provides a more rapid, accurate, and objective solution. This study proposed an efficient classification framework for mapping agricultural tillage practices using hyperspectral imagery. Compared to previously used classification methods, the framework is data-independent and strikes a balance between classification accuracy and computational complexity. Our results indicate that the framework has the potential to be implemented practically to provide timely survey data for precision farm management.

Technical Abstract: An efficient classification framework for mapping agricultural tillage practices using hyperspectral remote sensing imagery is proposed. The framework has the potential to be implemented practically to provide rapid, accurate, and objective survey data for precision agricultural management and appraisal from large-scale remote sensing images. It includes a local region filter, i.e., a Gaussian low-pass filter (GLF), to extract spatial/spectral features, a dimensionality reduction step, i.e., local Fisher's discriminant analysis (LFDA), and the traditional k-nearest-neighbor (KNN) classifier; the combination is denoted GLF-LFDA-KNN. Like our previously used local average filter (LAF) and adaptive weighted filter (AWF), the GLF considers spatial features in a small neighborhood, but it emphasizes the central pixel itself and is data-independent; it therefore strikes a balance between classification accuracy and computational complexity. The KNN classifier also has lower computational complexity than the traditional support vector machine (SVM). After class separability is enhanced by the GLF and LFDA, the simpler KNN can outperform the SVM while the overall computational cost stays lower. The proposed framework also outperforms the support vector machine with a composite kernel (SVM-CK) that uses spatial-spectral features.
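The three-stage pipeline described above (spatial smoothing, supervised dimensionality reduction, then KNN classification) can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the image cube and labels are made up, the filter sigma and neighbor count are arbitrary, and plain linear discriminant analysis from scikit-learn stands in for LFDA, which scikit-learn does not provide.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical hyperspectral cube (rows x cols x bands) with per-pixel labels.
rng = np.random.default_rng(0)
cube = rng.random((40, 40, 30))
labels = rng.integers(0, 3, size=(40, 40))

# Stage 1: Gaussian low-pass filter (GLF), applied per band, blends each
# pixel with its small spatial neighborhood while still weighting the
# central pixel most heavily.
smoothed = np.stack(
    [gaussian_filter(cube[:, :, b], sigma=1.0) for b in range(cube.shape[2])],
    axis=2,
)

# Flatten to a (pixels, bands) matrix for the pixel-wise stages.
X = smoothed.reshape(-1, cube.shape[2])
y = labels.ravel()

# Stage 2: supervised dimensionality reduction. The paper uses LFDA;
# plain LDA is used here only as a stand-in.
X_red = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

# Stage 3: k-nearest-neighbor classification in the reduced space.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_red, y)
pred = knn.predict(X_red)
```

Because the GLF kernel is fixed in advance rather than estimated from the image, the filtering stage is data-independent, which is the source of the computational savings claimed over the adaptive filters.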