Persistent identifier to cite or link this item: http://hdl.handle.net/10662/20324
Full metadata record
DC Field | Value | Language
dc.contributor.author | Paoletti Ávila, Mercedes Eugenia | -
dc.contributor.author | Haut Hurtado, Juan Mario | -
dc.contributor.author | Fernández Beltrán, Rubén | -
dc.contributor.author | Plaza Miguel, Javier | -
dc.contributor.author | Plaza, Antonio | -
dc.contributor.author | Pla, Filiberto | -
dc.date.accessioned | 2024-02-07T12:37:44Z | -
dc.date.available | 2024-02-07T12:37:44Z | -
dc.date.issued | 2018 | -
dc.identifier.issn | 0196-2892 | -
dc.identifier.uri | http://hdl.handle.net/10662/20324 | -
dc.description.abstract | Convolutional neural networks (CNNs) have recently exhibited excellent performance in hyperspectral image classification tasks. However, straightforward CNN-based network architectures still find obstacles when effectively exploiting the relationships between hyperspectral imaging (HSI) features in the spectral-spatial domain, which is a key factor in dealing with the high level of complexity present in remotely sensed HSI data. Although deeper architectures try to mitigate these limitations, they also face challenges with the convergence of the network parameters, which eventually limits the classification performance under highly demanding scenarios. In this paper, we propose a new CNN architecture based on spectral-spatial capsule networks in order to achieve a highly accurate classification of HSIs while significantly reducing the network design complexity. Specifically, based on Hinton's capsule networks, we develop a CNN model extension that redefines the concept of capsule units to become spectral-spatial units specialized in classifying remotely sensed HSI data. The proposed model is composed of several building blocks, called spectral-spatial capsules, which are able to learn HSI spectral-spatial features considering their corresponding spatial positions in the scene, their associated spectral signatures, and also their possible transformations. Our experiments, conducted using five well-known HSI data sets and several state-of-the-art classification methods, reveal that our HSI classification approach based on spectral-spatial capsules provides competitive advantages in terms of both classification accuracy and computational time. | en_US
dc.format.extent | 17 p. | es_ES
dc.format.mimetype | application/pdf | en_US
dc.language.iso | eng | es_ES
dc.publisher | IEEE | -
dc.rights | Attribution 4.0 International | -
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | -
dc.subject | Imagen hiperespectral | es_ES
dc.subject | Red neuronal convolucional | es_ES
dc.subject | Redes de cápsulas | es_ES
dc.subject | Hyperspectral imaging | en_US
dc.subject | Convolutional neural networks | en_US
dc.subject | Capsule networks | en_US
dc.title | Capsule networks for hyperspectral image classification | en_US
dc.type | article | es_ES
dc.description.version | peerReviewed | es_ES
europeana.type | TEXT | en_US
dc.rights.accessRights | openAccess | -
dc.subject.unesco | 3304 Tecnología de los Ordenadores | -
europeana.dataProvider | Universidad de Extremadura. España | es_ES
dc.identifier.bibliographicCitation | M. E. Paoletti et al., "Capsule Networks for Hyperspectral Image Classification," in IEEE Transactions on Geoscience and Remote Sensing, vol. 57, no. 4, pp. 2145-2160, April 2019, doi: 10.1109/TGRS.2018.2871782 | -
dc.type.version | publishedVersion | -
dc.contributor.affiliation | Universidad de Extremadura. Departamento de Tecnología de los Computadores y de las Comunicaciones | -
dc.contributor.affiliation | Universidad Jaume I | -
dc.relation.publisherversion | https://ieeexplore.ieee.org/document/8509610 | -
dc.identifier.publicationtitle | IEEE Transactions on Geoscience and Remote Sensing | es_ES
dc.identifier.publicationissue | 4 | -
dc.identifier.publicationfirstpage | 2145 | es_ES
dc.identifier.publicationlastpage | 2160 | es_ES
dc.identifier.publicationvolume | 57 | es_ES
dc.identifier.e-issn | 1558-0644 | -
dc.identifier.orcid | 0000-0003-1030-3729 | es_ES
dc.identifier.orcid | 0000-0001-6701-961X | -
dc.identifier.orcid | 0000-0002-2384-9141 | -
dc.identifier.orcid | 0000-0002-9613-1659 | -
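
The abstract above builds on Hinton-style capsule networks, in which each capsule emits an activity vector whose length encodes the probability that the entity it represents is present. Purely for orientation (the paper's own architecture is not reproduced in this record), here is a minimal NumPy sketch of the standard "squash" nonlinearity from Sabour, Frosst, and Hinton (2017) that such networks apply to capsule outputs; the array shapes and variable names below are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Capsule 'squash' nonlinearity: v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||).
    Rescales each capsule's output vector to a length in [0, 1) while
    preserving its direction, so vector length can act as a probability."""
    sq_norm = np.sum(np.square(s), axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm)
    return scale * s / np.sqrt(sq_norm + eps)  # eps avoids division by zero

# Illustrative shapes only: a batch of 8 capsules, each emitting a
# 16-dimensional activity vector.
capsules = np.random.randn(8, 16)
squashed = squash(capsules)
print(np.linalg.norm(squashed, axis=-1))  # every resulting length is < 1.0
```

In the spectral-spatial capsules the abstract describes, vectors of this kind are what let a unit jointly represent a feature's spatial position, spectral signature, and possible transformations, rather than the single scalar activation of a plain CNN neuron.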
Collection: DTCYC - Artículos

Files
File | Description | Size | Format
TGRS_2018_2871782.pdf |  | 2.01 MB | Adobe PDF


This item is subject to a Creative Commons license.