Please use this identifier to cite or link to this item: http://hdl.handle.net/10662/20329
Title: Visual attention-driven hyperspectral image classification
Authors: Haut Hurtado, Juan Mario
Paoletti Ávila, Mercedes Eugenia
Plaza Miguel, Javier
Plaza, Antonio
Li, Jun
Keywords: Hyperspectral image classification;Visual attention;Feature extraction;Deep learning;Residual neural networks
Issue Date: 2019
Publisher: IEEE
Abstract: Deep neural networks (DNNs), including convolutional (CNNs) and residual (ResNets) models, are able to learn abstract representations from the input data by considering a deep hierarchy of layers performing advanced feature extraction. Combining these models with visual attention techniques can help identify the most representative parts of the data from a visual standpoint, obtained through a more detailed filtering of the features extracted by the operational layers of the network. This is of significant interest for analyzing remotely sensed hyperspectral images (HSIs), which are characterized by their very high spectral dimensionality. However, few efforts have been made in the literature to adapt visual attention methods to remotely sensed HSI data analysis. In this paper, we introduce a new visual attention-driven technique for HSI classification. Specifically, we incorporate attention mechanisms into a ResNet in order to better characterize the spectral-spatial information contained in the data. Our newly proposed method calculates a mask that is applied to the features obtained by the network in order to identify those most desirable for classification purposes. Our experiments, conducted on four widely used HSI datasets, reveal that the proposed deep attention model provides competitive advantages in classification accuracy over other state-of-the-art methods.
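As a rough illustration of the mask-based attention idea summarized in the abstract, the sketch below gates the features of a residual block with a learned sigmoid mask in PyTorch. All layer sizes, the 1x1-convolution mask branch, and the input shape (103 bands, as in the Pavia University scene) are illustrative assumptions and do not reproduce the architecture of the paper.

    import torch
    import torch.nn as nn

    class AttentionResidualBlock(nn.Module):
        """Residual block whose features are reweighted by a learned
        attention mask before the skip connection. A minimal sketch of
        the masking idea, not the paper's architecture."""

        def __init__(self, channels: int):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
            )
            # Assumed mask branch: a 1x1 convolution followed by a sigmoid,
            # producing a per-pixel, per-channel weight in (0, 1).
            self.mask = nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=1),
                nn.Sigmoid(),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            features = self.body(x)
            # Apply the attention mask to the extracted features, then
            # add the identity skip connection.
            attended = features * self.mask(features)
            return torch.relu(x + attended)

    # Usage: a batch of 11x11 spatial patches with 103 spectral bands
    # (a plausible HSI input shape, assumed for illustration).
    patches = torch.randn(8, 103, 11, 11)
    block = AttentionResidualBlock(channels=103)
    print(block(patches).shape)  # torch.Size([8, 103, 11, 11])

The design choice shown here, multiplying features by a bounded mask rather than replacing them, keeps gradients flowing through the residual path while letting the network suppress uninformative spectral-spatial features.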
URI: http://hdl.handle.net/10662/20329
ISSN: 0196-2892
DOI: 10.1109/TGRS.2019.2918080
Appears in Collections: DTCYC - Articles

Files in This Item:
File: TGRS_2019_2918080.pdf | Size: 2.36 MB | Format: Adobe PDF


This item is licensed under a Creative Commons License.