Persistent identifier for citing or linking this item: http://hdl.handle.net/10662/20332
Full Metadata Record

DC Field   Value   Language
dc.contributor.author   Paoletti Ávila, Mercedes Eugenia   -
dc.contributor.author   Moreno Álvarez, Sergio   -
dc.contributor.author   Haut Hurtado, Juan Mario   -
dc.date.accessioned   2024-02-07T12:50:38Z   -
dc.date.available   2024-02-07T12:50:38Z   -
dc.date.issued   2021   -
dc.identifier.issn   0196-2892   -
dc.identifier.uri   http://hdl.handle.net/10662/20332   -
dc.description.abstract   The profound impact of deep learning, and particularly of convolutional neural networks (CNNs), in automatic image processing has been decisive for the progress and evolution of remote sensing (RS) hyperspectral imaging (HSI) processing. Indeed, CNNs have established themselves as the current state of the art, reaching unparalleled results in HSI classification. However, most CNNs were designed for RGB images, and their direct application to HSI data analysis could lead to nonoptimal solutions. Moreover, CNNs perform classification based on the identification of specific features, neglecting the spatial relationships between different features (i.e., their arrangement) due to pooling techniques. The capsule network (CapsNet) architecture is an attempt to overcome this drawback by nesting several neural layers within a capsule, connected by dynamic routing, both to identify not only the presence of a feature but also its instantiation parameters, and to learn the relationships between different features. Although this mechanism improves the data representations, enhancing the classification of HSI data, it still acts as a black box, without control of the most relevant features for classification purposes; indeed, important features could be discarded. In this paper, a new multiple attention-guided CapsNet is proposed to improve feature processing for RS HSI classification, both to improve computational efficiency (in terms of parameters) and to increase accuracy. Hence, the most representative visual parts of the images are identified using a detailed feature extractor coupled with attention mechanisms. Extensive experimental results have been obtained on five real datasets, demonstrating the great potential of the proposed method compared to other state-of-the-art classifiers.   en_US
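The capsule mechanism the abstract describes rests on the "squash" nonlinearity introduced with CapsNet: a capsule's raw output vector is rescaled so that its length can be read as a presence probability while its direction encodes the instantiation parameters. The NumPy sketch below is illustrative only; it shows the standard squash function, not the authors' multi-attention model, and the names are our own:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # CapsNet squash: shrinks short vectors toward zero and pushes long
    # vectors toward (but never past) unit length, so the output length
    # acts as a presence score while the direction is preserved.
    sq_norm = np.sum(s * s, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / np.sqrt(sq_norm + eps)

# A raw capsule output of length 5 is squashed to length 25/26 ~= 0.9615,
# with the same direction (the instantiation parameters) as the input.
v = squash(np.array([0.0, 3.0, 4.0]))
print(np.round(np.linalg.norm(v), 4))  # -> 0.9615
```

Because the squashed length always stays below 1, it can be compared across capsules to decide which feature is present, which is the quantity the dynamic routing and the attention mechanisms operate on.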
dc.description.sponsorship   This work was supported in part by Junta de Extremadura FEDER under Grants GR18060 and GR21040, and in part by a 2021 Leonardo Grant for Researchers and Cultural Creators, BBVA Foundation.   -
dc.format.extent   20 p.   -
dc.format.mimetype   application/pdf   en_US
dc.language.iso   eng   es_ES
dc.publisher   IEEE   -
dc.rights   Attribution 4.0 International   -
dc.rights.uri   https://creativecommons.org/licenses/by/4.0/   -
dc.subject   Red de cápsulas   es_ES
dc.subject   Redes neuronales convolucionales   es_ES
dc.subject   Reportaje   es_ES
dc.subject   HSI   es_ES
dc.subject   Atención   es_ES
dc.subject   Feature   en_US
dc.subject   Attention   en_US
dc.subject   Capsule network (CapsNet)   en_US
dc.subject   Convolutional neural networks (CNNs)   en_US
dc.title   Multiple attention-guided capsule networks for hyperspectral image classification   es_ES
dc.type   article   es_ES
dc.description.version   peerReviewed   es_ES
europeana.type   TEXT   en_US
dc.rights.accessRights   openAccess   es_ES
dc.subject.unesco   2490 Neurociencias   -
dc.subject.unesco   3304 Tecnología de los Ordenadores   -
europeana.dataProvider   Universidad de Extremadura. España   es_ES
dc.identifier.bibliographicCitation   M. E. Paoletti, S. Moreno-Álvarez and J. M. Haut, "Multiple Attention-Guided Capsule Networks for Hyperspectral Image Classification," in IEEE Transactions on Geoscience and Remote Sensing, vol. 60, pp. 1-20, 2022, Art no. 5520420, doi: 10.1109/TGRS.2021.3135506.   -
dc.type.version   publishedVersion   -
dc.contributor.affiliation   Universidad de Extremadura. Departamento de Tecnología de los Computadores y de las Comunicaciones   es_ES
dc.relation.publisherversion   https://ieeexplore.ieee.org/document/9650856   -
dc.identifier.doi   10.1109/TGRS.2021.3135506   -
dc.identifier.publicationtitle   IEEE Transactions on Geoscience and Remote Sensing   -
dc.identifier.publicationfirstpage   5520420-1   -
dc.identifier.publicationlastpage   5520420-20   -
dc.identifier.publicationvolume   60   es_ES
dc.identifier.e-issn   1558-0644   -
dc.identifier.orcid   0000-0003-1030-3729   es_ES
dc.identifier.orcid   0000-0001-6701-961X   es_ES
dc.identifier.orcid   0000-0002-1858-9920   es_ES
Collections: DIEEA - Artículos
DTCYC - Artículos

Files

File   Description   Size   Format
TGRS_2021_3135506.pdf      6,34 MB   Adobe PDF


This item is licensed under a Creative Commons License.