Cereb Cortex. 2024 Aug 1;34(8):bhae323. doi: 10.1093/cercor/bhae323.
ABSTRACT
Speech perception requires the binding of spatiotemporally disjoint auditory-visual cues. The corresponding brain network-level information processing can be characterized by two complementary mechanisms: functional segregation, which refers to the localization of processing in either isolated or distributed modules across the brain, and integration, which pertains to cooperation among relevant functional modules. Here, we demonstrate using functional magnetic resonance imaging recordings that the subjective perceptual experience of multisensory speech stimuli, real and illusory, is represented in differential states of segregation-integration. We controlled the inter-subject variability of illusory/cross-modal perception parametrically by introducing temporal lags in the incongruent auditory-visual articulations of speech sounds within the McGurk paradigm. The states of segregation-integration balance were captured using two alternative computational approaches. First, the module responsible for cross-modal binding of sensory signals, defined as the perceptual binding network (PBN), was identified using standardized parametric statistical approaches, and its temporal correlations with all other brain areas were computed. With increasing illusory perception, the majority of the nodes of the PBN showed decreased cooperation with the rest of the brain, reflecting states of high segregation but reduced global integration. Second, the altered patterns of segregation-integration were cross-validated using graph-theoretic measures.
PMID:39110411 | DOI:10.1093/cercor/bhae323