dc.contributor.author
Mesz, Bruno  
dc.contributor.author
Sigman, Mariano  
dc.contributor.author
Trevisan, Marcos Alberto  
dc.date.available
2025-08-13T11:55:53Z  
dc.date.issued
2012-04  
dc.identifier.citation
Mesz, Bruno; Sigman, Mariano; Trevisan, Marcos Alberto; A composition algorithm based on crossmodal taste-music correspondences; Frontiers Media; Frontiers In Human Neuroscience; 6; 4-2012; 1-6  
dc.identifier.issn
1662-5161  
dc.identifier.uri
http://hdl.handle.net/11336/268854  
dc.description.abstract
While there is broad consensus about the structural similarities between language and music, comparably less attention has been devoted to semantic correspondences between these two ubiquitous manifestations of human culture. We have investigated the relations between music and a narrow and bounded domain of semantics: the words and concepts referring to taste sensations. In a recent work, we found that taste words were consistently mapped to musical parameters. Bitter is associated with low-pitched and continuous music (legato), salty is characterized by silences between notes (staccato), sour is high-pitched, dissonant and fast, and sweet is consonant, slow and soft (Mesz et al., 2011). Here we extended these ideas, in a synergistic dialog between music and science, investigating whether music can be algorithmically generated from taste-words. We developed and implemented an algorithm that exploits a large corpus of classic and popular songs. New musical pieces were produced by choosing fragments from the corpus and modifying them to minimize their distance to the region in musical space that characterizes each taste. In order to test the capability of the produced music to elicit significant associations with the different tastes, musical pieces were produced and judged by a group of non-musicians. Results showed that participants could decode well above chance the taste-word of the composition. We also discuss how our findings can be expressed in a performance bridging music and cognitive science.  
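The selection step the abstract describes (choosing corpus fragments that lie closest to the region of musical space characterizing a taste) can be sketched as a nearest-neighbor search. The parameter axes, target values, and fragment data below are illustrative assumptions loosely following the abstract's taste-parameter mappings, not the paper's actual feature set or corpus:

```python
import math

# Hypothetical taste targets in a toy musical-parameter space:
# (pitch, legato-ness, tempo, consonance), each normalized to [0, 1].
# Values only loosely mirror the abstract (e.g. "bitter" = low-pitched, legato).
TASTE_TARGETS = {
    "bitter": (0.2, 0.9, 0.4, 0.5),   # low pitch, continuous (legato)
    "salty":  (0.5, 0.1, 0.5, 0.5),   # staccato: silences between notes
    "sour":   (0.9, 0.5, 0.9, 0.2),   # high-pitched, fast, dissonant
    "sweet":  (0.5, 0.8, 0.2, 0.9),   # slow, soft, consonant
}

def distance(a, b):
    """Euclidean distance between two parameter vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_fragment(taste, corpus):
    """Return the corpus fragment whose parameters are closest to the
    target region for the given taste-word."""
    target = TASTE_TARGETS[taste]
    return min(corpus, key=lambda frag: distance(frag["params"], target))

# Two made-up corpus fragments with pre-computed parameter vectors.
corpus = [
    {"name": "fragment_a", "params": (0.85, 0.4, 0.95, 0.15)},
    {"name": "fragment_b", "params": (0.25, 0.85, 0.35, 0.55)},
]
print(nearest_fragment("sour", corpus)["name"])    # fragment_a
print(nearest_fragment("bitter", corpus)["name"])  # fragment_b
```

In the paper's full pipeline the chosen fragment is then also modified to move it further toward the target region; this sketch covers only the selection-by-distance step.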
dc.format
application/pdf  
dc.language.iso
eng  
dc.publisher
Frontiers Media  
dc.rights
info:eu-repo/semantics/openAccess  
dc.rights.uri
https://creativecommons.org/licenses/by/2.5/ar/  
dc.subject
Music taste  
dc.subject
Musical algorithm  
dc.subject
Semantics  
dc.subject
Cross-modal associations  
dc.subject.classification
Other Natural and Exact Sciences  
dc.subject.classification
NATURAL AND EXACT SCIENCES  
dc.title
A composition algorithm based on crossmodal taste-music correspondences  
dc.type
info:eu-repo/semantics/article  
dc.type
info:ar-repo/semantics/artículo  
dc.type
info:eu-repo/semantics/publishedVersion  
dc.date.updated
2025-08-12T12:06:55Z  
dc.journal.volume
6  
dc.journal.pagination
1-6  
dc.journal.pais
Switzerland  
dc.journal.ciudad
Lausanne  
dc.description.fil
Fil: Mesz, Bruno. Universidad Nacional de Quilmes. Departamento de Ciencia y Tecnología. Laboratorio de Acústica y Percepción Sonora; Argentina  
dc.description.fil
Fil: Sigman, Mariano. Consejo Nacional de Investigaciones Científicas y Técnicas; Argentina. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Neurociencia Integrativa; Argentina  
dc.description.fil
Fil: Trevisan, Marcos Alberto. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Ciudad Universitaria. Instituto de Física de Buenos Aires. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Instituto de Física de Buenos Aires; Argentina. Universidad de Buenos Aires. Facultad de Ciencias Exactas y Naturales. Departamento de Física. Laboratorio de Sistemas Dinámicos; Argentina  
dc.journal.title
Frontiers In Human Neuroscience  
dc.relation.alternativeid
info:eu-repo/semantics/altIdentifier/url/https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2012.00071/full  
dc.relation.alternativeid
info:eu-repo/semantics/altIdentifier/doi/http://dx.doi.org/10.3389/fnhum.2012.00071