Show simple item record
dc.contributor.author
Tan, Ying
dc.contributor.author
Sun, Zhe
dc.contributor.author
Duan, Feng
dc.contributor.author
Solé Casals, Jordi
dc.contributor.author
Caiafa, César Federico
dc.date.available
2021-11-04T15:17:03Z
dc.date.issued
2021-09
dc.identifier.citation
Tan, Ying; Sun, Zhe; Duan, Feng; Solé Casals, Jordi; Caiafa, César Federico; A multimodal emotion recognition method based on facial expressions and electroencephalography; Elsevier; Biomedical Signal Processing and Control; 70; 9-2021; 103029, 1-11
dc.identifier.issn
1746-8094
dc.identifier.uri
http://hdl.handle.net/11336/146001
dc.description.abstract
Human-robot interaction (HRI) systems play a critical role in society. However, most HRI systems still face the challenge of disharmony, resulting in inefficient communication between the human and the robot. In this paper, a multimodal emotion recognition method is proposed to establish an HRI system with a low sense of disharmony. The method is based on facial expressions and electroencephalography (EEG). An image classification method for facial expressions and a suitable feature extraction method for EEG were investigated on public datasets, and these methods were then applied to images and EEG data that we acquired ourselves. In addition, the Monte Carlo method was used to merge the results and to address the problem of having a small dataset. The multimodal emotion recognition method was combined with the HRI system, where it achieved a recognition rate of 83.33%. Furthermore, in order to evaluate the HRI system from the user's point of view, a perceptual assessment method was proposed in which participants scored the system based on their experience; it achieved an average score of 7 (scores ranged from 0 to 10). Experimental results demonstrate the effectiveness and feasibility of the multimodal emotion recognition method, which can help reduce the sense of disharmony of HRI systems.
dc.format
application/pdf
dc.language.iso
eng
dc.publisher
Elsevier
dc.rights
info:eu-repo/semantics/restrictedAccess
dc.rights.uri
https://creativecommons.org/licenses/by-nc-sa/2.5/ar/
dc.subject
Human-robot interaction
dc.subject
EEG
dc.subject
facial expression
dc.subject.classification
Information Science and Bioinformatics
dc.subject.classification
Computer and Information Sciences
dc.subject.classification
NATURAL AND EXACT SCIENCES
dc.title
A multimodal emotion recognition method based on facial expressions and electroencephalography
dc.type
info:eu-repo/semantics/article
dc.type
info:ar-repo/semantics/artículo
dc.type
info:eu-repo/semantics/publishedVersion
dc.date.updated
2021-11-04T13:18:20Z
dc.journal.volume
70
dc.journal.pagination
103029, 1-11
dc.journal.pais
Netherlands
dc.description.fil
Fil: Tan, Ying. Nankai University; China
dc.description.fil
Fil: Sun, Zhe. Riken. Brain Science Institute; Japan
dc.description.fil
Fil: Duan, Feng. Nankai University; China
dc.description.fil
Fil: Solé Casals, Jordi. Central University of Catalonia; Spain
dc.description.fil
Fil: Caiafa, César Federico. Provincia de Buenos Aires. Gobernación. Comisión de Investigaciones Científicas. Instituto Argentino de Radioastronomía. Consejo Nacional de Investigaciones Científicas y Técnicas. Centro Científico Tecnológico Conicet - La Plata. Instituto Argentino de Radioastronomía; Argentina
dc.journal.title
Biomedical Signal Processing and Control
dc.relation.alternativeid
info:eu-repo/semantics/altIdentifier/url/https://linkinghub.elsevier.com/retrieve/pii/S1746809421006261
dc.relation.alternativeid
info:eu-repo/semantics/altIdentifier/doi/http://dx.doi.org/10.1016/j.bspc.2021.103029
Associated files