Persistent identifier to cite or link this item: http://hdl.handle.net/10662/20329
Title: Visual attention-driven hyperspectral image classification
Authors: Haut Hurtado, Juan Mario
Paoletti Ávila, Mercedes Eugenia
Plaza Miguel, Javier
Plaza, Antonio
Li, Jun
Keywords: Hyperspectral image classification; Visual attention; Feature extraction; Deep learning; Residual neural networks
Publication date: 2019
Publisher: IEEE
Abstract: Deep neural networks (DNNs), including convolutional (CNNs) and residual (ResNets) models, are able to learn abstract representations from the input data by considering a deep hierarchy of layers that performs advanced feature extraction. The combination of these models with visual attention techniques can assist with the identification of the most representative parts of the data from a visual standpoint, obtained through a more detailed filtering of the features extracted by the operational layers of the network. This is of significant interest for analyzing remotely sensed hyperspectral images (HSIs), characterized by their very high spectral dimensionality. However, few efforts in the literature have adapted visual attention methods to remotely sensed HSI data analysis. In this paper, we introduce a new visual attention-driven technique for HSI classification. Specifically, we incorporate attention mechanisms into a ResNet in order to better characterize the spectral-spatial information contained in the data. Our newly proposed method calculates a mask that is applied to the features obtained by the network in order to identify the most relevant ones for classification purposes. Our experiments, conducted using four widely used HSI datasets, reveal that the proposed deep attention model provides competitive advantages in terms of classification accuracy when compared to other state-of-the-art methods.
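
For illustration, below is a minimal PyTorch sketch of the general mechanism the abstract describes: a learned mask computed from the feature maps and applied to them by element-wise reweighting inside a residual network. The class name, tensor shapes, and the 1x1-convolution mask design are assumptions made for this sketch; they are not the paper's exact architecture.

import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """Hypothetical attention block: computes a soft mask over feature
    maps and reweights them, in the spirit of the abstract's description.
    The mask design (1x1 convolution + sigmoid) is an assumption."""
    def __init__(self, channels):
        super().__init__()
        # 1x1 convolution produces one attention weight per feature and
        # spatial position; the sigmoid squashes weights into [0, 1]
        self.mask = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, features):
        # Element-wise reweighting: features judged salient by the mask
        # are kept, less informative ones are attenuated
        return features * self.mask(features)

# Toy usage on a batch of feature maps extracted from HSI patches
# (batch, channels, height, width); shapes are illustrative only
x = torch.randn(8, 64, 9, 9)
attended = AttentionBlock(64)(x)
print(attended.shape)  # torch.Size([8, 64, 9, 9])

In a ResNet, such a block would typically sit inside a residual unit, so that the masked features are added back to the identity shortcut rather than replacing it outright.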
URI: http://hdl.handle.net/10662/20329
ISSN: 0196-2892
DOI: 10.1109/TGRS.2019.2918080
Collection: DTCYC - Articles

Files
File: TGRS_2019_2918080.pdf (2.36 MB, Adobe PDF)


This item is licensed under a Creative Commons License.