Source: http://linkedlifedata.com/resource/pubmed/id/17067640
| Predicate | Object |
|---|---|
| rdf:type | |
| lifeskim:mentions | |
| pubmed:issue | 6 |
| pubmed:dateCreated | 2007-1-15 |
| pubmed:abstractText | Using whole-head magnetoencephalography (MEG), audiovisual (AV) interactions during speech perception (/ta/- and /pa/-syllables) were investigated in 20 subjects. Congruent AV events served as the 'standards' of an oddball design. The deviants encompassed incongruent /ta/-/pa/ configurations differing from the standards either in the acoustic or the visual domain. As an auditory non-speech control condition, the same video signals were synchronized with either one of two complex tones. As in natural speech, visual movement onset preceded acoustic signals by about 150 ms. First, the impact of visual information on auditorily evoked fields to non-speech sounds was determined. Larger facial movements (/pa/ versus /ta/) yielded enhanced early responses such as the M100 component, indicating, most presumably, anticipatory pre-activation of auditory cortex by visual motion cues. As a second step of analysis, mismatch fields (MMF) were calculated. Acoustic deviants elicited a typical MMF, peaking ca. 180 ms after stimulus onset, whereas visual deviants gave rise to later responses (220 ms) of a more posterior-medial source location. Finally, a late (275 ms), left-lateralized visually-induced MMF component, resembling the acoustic mismatch response, emerged during the speech condition, presumably reflecting phonetic/linguistic operations. There is mounting functional imaging evidence for an early impact of visual information on auditory cortical regions during speech perception. The present study suggests at least two successive AV interactions in association with syllable recognition tasks: early activation of auditory areas depending upon visual motion cues and a later speech-specific left-lateralized response mediated, conceivably, by backward-projections from multisensory areas. |
| pubmed:language | eng |
| pubmed:journal | |
| pubmed:citationSubset | IM |
| pubmed:status | MEDLINE |
| pubmed:month | Mar |
| pubmed:issn | 0028-3932 |
| pubmed:author | |
| pubmed:issnType | Print |
| pubmed:day | 25 |
| pubmed:volume | 45 |
| pubmed:owner | NLM |
| pubmed:authorsComplete | Y |
| pubmed:pagination | 1342-54 |
| pubmed:dateRevised | 2009-11-11 |
| pubmed:meshHeading | pubmed-meshheading:17067640-Adult, pubmed-meshheading:17067640-Auditory Perception, pubmed-meshheading:17067640-Cues, pubmed-meshheading:17067640-Female, pubmed-meshheading:17067640-Functional Laterality, pubmed-meshheading:17067640-Humans, pubmed-meshheading:17067640-Magnetoencephalography, pubmed-meshheading:17067640-Male, pubmed-meshheading:17067640-Motion Perception, pubmed-meshheading:17067640-Speech Perception, pubmed-meshheading:17067640-Visual Perception |
| pubmed:year | 2007 |
| pubmed:articleTitle | Sequential audiovisual interactions during speech perception: a whole-head MEG study. |
| pubmed:affiliation | Department of General Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Germany. ingo.hertrich@uni-tuebingen.de |
| pubmed:publicationType | Journal Article, Research Support, Non-U.S. Gov't |
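The predicate/object pairs above are RDF triples about the resource URI in the Source line. A minimal sketch of retrieving them programmatically with Python's rdflib, assuming the LinkedLifeData server returns parseable RDF for this URI via content negotiation (not verified here):

```python
from rdflib import Graph, URIRef

# Resource URI from the Source line above.
uri = "http://linkedlifedata.com/resource/pubmed/id/17067640"

# Assumption: the endpoint serves RDF (e.g. RDF/XML) for this URI;
# rdflib selects a parser from the response Content-Type.
g = Graph()
g.parse(uri)

# Print every predicate/object pair for the record,
# mirroring the rows of the table above.
for pred, obj in g.predicate_objects(subject=URIRef(uri)):
    print(pred, obj)
```

If the server instead exposes the data only through a SPARQL endpoint, the same pairs could be fetched with a `SELECT ?p ?o WHERE { <uri> ?p ?o }` query; the triple structure is identical either way.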