Improved Transformer Net for Hyperspectral Image Classification
Field | Value |
---|---|
Content Provider | MDPI |
Author | Qing, Yuhao; Liu, Wenyi; Feng, Liuyan; Gao, Wanjia |
Copyright Year | 2021 |
Description | In recent years, deep learning has been successfully applied to hyperspectral image (HSI) classification problems, with several convolutional neural network (CNN) based models achieving an appealing classification performance. However, due to the multi-band nature and the data redundancy of hyperspectral data, the CNN model underperforms in such a continuous data domain. Thus, in this article, we propose an end-to-end transformer model entitled SAT Net that is appropriate for HSI classification and relies on the self-attention mechanism. The proposed model uses the spectral attention mechanism and the self-attention mechanism to extract the spectral and spatial features of the HSI, respectively. Initially, the original HSI data are remapped into multiple vectors containing a series of planar 2D patches after passing through the spectral attention module. Each vector is then compressed by a linear transformation to the required sequence length. During this process, we add a position-coding vector and a learnable embedding vector to capture the long-distance relationships in the continuous spectrum of the HSI. Then, we employ multiple multi-head self-attention modules to extract the image features and complete the proposed network with a residual structure that alleviates gradient dispersion and over-fitting. Finally, we employ a multilayer perceptron for the HSI classification. We evaluate SAT Net on three publicly available hyperspectral datasets and compare its classification performance against five current classification methods using several metrics, i.e., overall accuracy, average accuracy, and the Kappa coefficient. Our trials demonstrate that SAT Net attains competitive classification accuracy, highlighting that the Self-Attention Transformer network is appealing for HSI classification. (An illustrative sketch of this pipeline appears after the record below.) |
Starting Page | 2216 |
e-ISSN | 2072-4292 |
DOI | 10.3390/rs13112216 |
Journal | Remote Sensing |
Issue Number | 11 |
Volume Number | 13 |
Language | English |
Publisher | MDPI |
Publisher Date | 2021-06-05 |
Access Restriction | Open |
Subject Keyword | Remote Sensing; Imaging Science; Deep Learning; Hyperspectral Image (HSI) Classification; Long-Distance Dependence; Self-Attention; Transformer |
Content Type | Text |
Resource Type | Article |
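
The Description above outlines the SAT Net pipeline: a spectral attention module reweights the bands, the cube is split into planar 2D patches that are linearly projected into tokens, a position-coding vector and a learnable embedding (class) token are added, several residual multi-head self-attention blocks extract features, and a multilayer perceptron produces the class scores. Below is a minimal, illustrative PyTorch sketch of such a pipeline; it is not the authors' released code, and the module names, patch size, spatial window, embedding dimension, and the squeeze-and-excitation form assumed for the spectral attention are hypothetical choices made only for illustration.

```python
import torch
import torch.nn as nn


class SpectralAttention(nn.Module):
    """Band reweighting in a squeeze-and-excitation style (assumed form of the spectral attention module)."""

    def __init__(self, bands, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(bands, bands // reduction), nn.ReLU(),
            nn.Linear(bands // reduction, bands), nn.Sigmoid())

    def forward(self, x):                        # x: (batch, bands, H, W)
        w = self.fc(x.mean(dim=(2, 3)))          # global average pool over the spatial dims
        return x * w[:, :, None, None]           # reweight each spectral band


class TransformerHSIClassifier(nn.Module):
    """Spectral attention -> 2D patch tokens -> class token + position coding -> self-attention encoder -> MLP head."""

    def __init__(self, bands, spatial=8, patch=4, dim=64, depth=4, heads=4, classes=16):
        super().__init__()
        self.spectral_att = SpectralAttention(bands)
        # split the band-weighted cube into planar patches and linearly project each one to `dim`
        self.to_tokens = nn.Conv2d(bands, dim, kernel_size=patch, stride=patch)
        n_tokens = (spatial // patch) ** 2
        self.cls = nn.Parameter(torch.zeros(1, 1, dim))               # learnable embedding vector
        self.pos = nn.Parameter(torch.zeros(1, n_tokens + 1, dim))    # position-coding vector
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           dim_feedforward=2 * dim, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)  # residual multi-head self-attention blocks
        self.head = nn.Sequential(nn.LayerNorm(dim), nn.Linear(dim, classes))  # multilayer-perceptron classifier

    def forward(self, x):                        # x: (batch, bands, spatial, spatial) HSI cube around a pixel
        x = self.spectral_att(x)
        t = self.to_tokens(x).flatten(2).transpose(1, 2)               # (batch, n_tokens, dim)
        t = torch.cat([self.cls.expand(len(t), -1, -1), t], dim=1) + self.pos
        t = self.encoder(t)
        return self.head(t[:, 0])                # classify from the class token


# Usage with hypothetical dataset dimensions (e.g. a 200-band cube, 16 land-cover classes):
model = TransformerHSIClassifier(bands=200, classes=16)
logits = model(torch.randn(2, 200, 8, 8))        # -> shape (2, 16)
```

Reusing `nn.TransformerEncoderLayer` supplies the residual connections and layer normalization that the abstract attributes to the multi-head self-attention blocks; a faithful reproduction of SAT Net would instead follow the layer sizes, patch dimensions, and training details reported in the paper.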