
dc.contributor.author Phillips, Holly N.
dc.contributor.author Blenkmann, Alejandro Omar
dc.contributor.author Hughes, Laura E.
dc.contributor.author Kochen, Sara Silvia
dc.contributor.author Bekinschtein, Tristán Andrés
dc.contributor.author Cambridge Centre for Ageing and Neuroscience
dc.contributor.author Rowe, James B.
dc.date.available 2018-06-11T14:51:05Z
dc.date.issued 2016-09
dc.identifier.citation Phillips, Holly N.; Blenkmann, Alejandro Omar; Hughes, Laura E.; Kochen, Sara Silvia; Bekinschtein, Tristán Andrés; et al.; Convergent evidence for hierarchical prediction networks from human electrocorticography and magnetoencephalography; Elsevier Masson; Cortex; 82; 9-2016; 192-205
dc.identifier.issn 0010-9452
dc.identifier.uri http://hdl.handle.net/11336/48054
dc.description.abstract We propose that sensory inputs are processed in terms of optimised predictions and prediction error signals within hierarchical neurocognitive models. The combination of non-invasive brain imaging and generative network models has provided support for hierarchical frontotemporal interactions in oddball tasks, including recent identification of a temporal expectancy signal acting on prefrontal cortex. However, these studies are limited by the need to invert magnetoencephalographic or electroencephalographic sensor signals to localise activity from cortical ‘nodes’ in the network, or to infer neural responses from indirect measures such as the fMRI BOLD signal. To overcome this limitation, we examined frontotemporal interactions estimated from direct cortical recordings from two human participants with cortical electrode grids (electrocorticography – ECoG). Their frontotemporal network dynamics were compared to those identified by magnetoencephalography (MEG) in forty healthy adults. All participants performed the same auditory oddball task with standard tones interspersed with five deviant tone types. We normalised post-operative electrode locations to standardised anatomic space, to compare across modalities, and inverted the MEG to cortical sources using the estimated lead field from subject-specific head models. A mismatch negativity signal in frontal and temporal cortex was identified in all subjects. Generative models of the electrocorticographic and magnetoencephalographic data were separately compared using the free-energy estimate of the model evidence. Model comparison confirmed the same critical features of hierarchical frontotemporal networks in each patient as in the group-wise MEG analysis. These features included bilateral, feedforward and feedback frontotemporal modulated connectivity, in addition to an asymmetric expectancy driving input on left frontal cortex. The invasive ECoG provides an important step in construct validation of the use of neural generative models of MEG, which in turn enables generalisation to larger populations. Together, they give convergent evidence for the hierarchical interactions in frontotemporal networks for expectation and processing of sensory inputs.
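The free-energy model comparison and MEG source inversion mentioned in the abstract follow standard forms; as an illustrative sketch (these equations are not part of the record and simply restate the usual dynamic causal modelling conventions), the variational free energy F bounds the log evidence for each candidate network model m,
\[
\ln p(y \mid m) \;\ge\; F = \big\langle \ln p(y \mid \theta, m) \big\rangle_{q(\theta)} - \mathrm{KL}\big[\, q(\theta) \,\|\, p(\theta \mid m) \big],
\]
where q(\theta) is the approximate posterior over model parameters, and the MEG sensor data y(t) are related to cortical source activity j(t) through the subject-specific lead field L estimated from each head model,
\[
y(t) = L\, j(t) + \varepsilon(t), \qquad \varepsilon(t) \sim \mathcal{N}(0, \Sigma_\varepsilon).
\]
Candidate frontotemporal network models are then compared by their free energies, with differences in F read as log Bayes factors.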
dc.format application/pdf
dc.language.iso eng
dc.publisher Elsevier Masson
dc.rights info:eu-repo/semantics/restrictedAccess
dc.rights.uri https://creativecommons.org/licenses/by-nc-sa/2.5/ar/
dc.subject Dynamic causal modelling
dc.subject Mismatch negativity
dc.subject Electrocorticography
dc.subject Cognition
dc.subject.classification Salud Ocupacional
dc.subject.classification Ciencias de la Salud
dc.subject.classification CIENCIAS MÉDICAS Y DE LA SALUD
dc.title Convergent evidence for hierarchical prediction networks from human electrocorticography and magnetoencephalography
dc.type info:eu-repo/semantics/article
dc.type info:ar-repo/semantics/artículo
dc.type info:eu-repo/semantics/publishedVersion
dc.date.updated 2018-06-08T14:27:09Z
dc.journal.volume 82
dc.journal.pagination 192-205
dc.journal.pais Argentina
dc.journal.ciudad Paris
dc.description.fil Fil: Phillips, Holly N.. Cognition and Brain Sciences Unit; Reino Unido. University of Cambridge; Reino Unido
dc.description.fil Fil: Blenkmann, Alejandro Omar. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Houssay. Instituto de Biología Celular y Neurociencia ; Argentina. Provincia de Buenos Aires. Ministerio de Salud. Hospital Alta Complejidad en Red El Cruce Dr. Néstor Carlos Kirchner Samic; Argentina. Gobierno de la Ciudad de Buenos Aires. Hospital General de Agudos ; Argentina
dc.description.fil Fil: Hughes, Laura E.. Cognition and Brain Sciences Unit; Reino Unido. University of Cambridge; Reino Unido
dc.description.fil Fil: Kochen, Sara Silvia. Gobierno de la Ciudad de Buenos Aires. Hospital General de Agudos ; Argentina. Provincia de Buenos Aires. Ministerio de Salud. Hospital Alta Complejidad en Red El Cruce Dr. Néstor Carlos Kirchner Samic; Argentina. Consejo Nacional de Investigaciones Científicas y Técnicas. Oficina de Coordinación Administrativa Houssay. Instituto de Biología Celular y Neurociencia ; Argentina
dc.description.fil Fil: Bekinschtein, Tristán Andrés. University of Cambridge; Reino Unido. Cognition and Brain Sciences Unit; Reino Unido
dc.description.fil Fil: Cambridge Centre for Ageing and Neuroscience. No especifica;
dc.description.fil Fil: Rowe, James B.. Cognition and Brain Sciences Unit; Reino Unido. University of Cambridge; Reino Unido
dc.journal.title Cortex
dc.relation.alternativeid info:eu-repo/semantics/altIdentifier/doi/https://dx.doi.org/10.1016/j.cortex.2016.05.001
dc.relation.alternativeid info:eu-repo/semantics/altIdentifier/url/https://www.sciencedirect.com/science/article/pii/S0010945216301058
dc.relation.alternativeid info:eu-repo/semantics/altIdentifier/url/https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4981429/
dc.conicet.fuente individual


Associated files

Access not available

This item appears in the following Collection(s)

  • Articulos (ENYS) [12]
    Articles from the UNIDAD EJECUTORA DE ESTUDIOS EN NEUROCIENCIAS Y SISTEMAS COMPLEJOS


info:eu-repo/semantics/restrictedAccess Except where explicitly stated otherwise, this item is published under the following licence: Creative Commons Attribution-NonCommercial-ShareAlike 2.5 Argentina (CC BY-NC-SA 2.5 AR)