Results 1 - 10 of 20
Mini-Symposium Towards a New Neurobiology of Language
"... Theoretical advances in language research and the availability of increasingly high-resolution experimental techniques in the cognitive neurosciences are profoundly changing how we investigate and conceive of the neural basis of speech and language processing. Recent work closely aligns language res ..."
Abstract
-
Cited by 9 (0 self)
- Add to MetaCart
Theoretical advances in language research and the availability of increasingly high-resolution experimental techniques in the cognitive neurosciences are profoundly changing how we investigate and conceive of the neural basis of speech and language processing. Recent work closely aligns language research with issues at the core of systems neuroscience, ranging from neurophysiological and neuroanatomic characterizations to questions about neural coding. Here we highlight, across different aspects of language processing (perception, production, sign language, meaning construction), new insights and approaches to the neurobiology of language, aiming to describe promising new areas of investigation in which the neurosciences intersect with linguistic research more closely than before. This paper summarizes in brief some of the issues that constitute the background for talks presented in a symposium at the Annual Meeting of the Society for Neuroscience. It is not a comprehensive review of any of the issues that are discussed in the symposium.
Phase-Locked Responses to Speech in Human Auditory Cortex are Enhanced During Comprehension
"... A growing body of evidence shows that ongoing oscillations in auditory cortex modulate their phase to match the rhythm of tempo-rally regular acoustic stimuli, increasing sensitivity to relevant envi-ronmental cues and improving detection accuracy. In the current study, we test the hypothesis that n ..."
Abstract
-
Cited by 6 (1 self)
- Add to MetaCart
A growing body of evidence shows that ongoing oscillations in auditory cortex modulate their phase to match the rhythm of temporally regular acoustic stimuli, increasing sensitivity to relevant environmental cues and improving detection accuracy. In the current study, we test the hypothesis that nonsensory information provided by linguistic content enhances phase-locked responses to intelligible speech in the human brain. Sixteen adults listened to meaningful sentences while we recorded neural activity using magnetoencephalography. Stimuli were processed using a noise-vocoding technique to vary intelligibility while keeping the temporal acoustic envelope consistent. We show that the acoustic envelopes of sentences contain most power between 4 and 7 Hz and that it is in this frequency band that phase locking between neural activity and envelopes is strongest. Bilateral oscillatory neural activity phase-locked to unintelligible speech, but this cerebro-acoustic phase locking was enhanced when speech was intelligible. This enhanced phase locking was left lateralized and localized to left temporal cortex. Together, our results demonstrate that entrainment to connected speech depends not only on acoustic characteristics but is also affected by listeners' ability to extract linguistic information. This suggests a biological framework for speech comprehension in which acoustic and linguistic cues reciprocally aid in stimulus prediction.
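The cerebro-acoustic phase locking described above can be sketched numerically: band-pass both the neural signal and the speech envelope in the 4-7 Hz band, extract instantaneous phase with the Hilbert transform, and take the length of the mean phase-difference vector. This is a minimal illustration with synthetic signals, not the authors' MEG pipeline; the signal parameters and the use of a phase-locking value as the coupling measure are assumptions for demonstration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def cerebro_acoustic_plv(neural, envelope, fs, lo=4.0, hi=7.0):
    """Phase-locking value between a neural signal and a speech envelope
    in the lo-hi Hz band (1 = perfect phase locking, 0 = none)."""
    ph_n = np.angle(hilbert(bandpass(neural, lo, hi, fs)))
    ph_e = np.angle(hilbert(bandpass(envelope, lo, hi, fs)))
    return np.abs(np.mean(np.exp(1j * (ph_n - ph_e))))

# Toy check: a neural signal that follows a 5 Hz envelope at a fixed lag
fs = 250
t = np.arange(0, 10, 1 / fs)
env = 1 + np.sin(2 * np.pi * 5 * t)        # 5 Hz "syllable rhythm"
neural = np.sin(2 * np.pi * 5 * t - 0.8)   # entrained, lagged response
noise = np.random.default_rng(0).standard_normal(t.size)
print(cerebro_acoustic_plv(neural, env, fs))   # near 1: strong locking
print(cerebro_acoustic_plv(noise, env, fs))    # much lower
```

Enhanced locking for intelligible speech would show up as a higher value of this statistic for the same acoustic envelope.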
Review
"... Human immunodeficiency virus type 1 genetic diversity in the nervous system: Evolutionary epiphenomenon or disease determinant? ..."
Abstract
-
Cited by 4 (0 self)
- Add to MetaCart
(Show Context)
Human immunodeficiency virus type 1 genetic diversity in the nervous system: Evolutionary epiphenomenon or disease determinant?
Temporal Structure in Audiovisual Sensory Selection
, 2012
"... In natural environments, sensory information is embedded in temporally contiguous streams of events. This is typically the case when seeing and listening to a speaker or when engaged in scene analysis. In such contexts, two mechanisms are needed to single out and build a reliable representation of a ..."
Abstract
-
Cited by 1 (0 self)
- Add to MetaCart
In natural environments, sensory information is embedded in temporally contiguous streams of events. This is typically the case when seeing and listening to a speaker or when engaged in scene analysis. In such contexts, two mechanisms are needed to single out and build a reliable representation of an event (or object): the temporal parsing of information and the selection of relevant information in the stream. It has previously been shown that rhythmic events naturally build temporal expectations that improve sensory processing at predictable points in time. Here, we asked to what extent temporal regularities can improve the detection and identification of events across sensory modalities. To do so, we used a dynamic visual conjunction search task accompanied by auditory cues synchronized or not with the color change of the target (horizontal or vertical bar). Sounds synchronized with the visual target improved search efficiency for temporal rates below 1.4 Hz but did not affect efficiency above that stimulation rate. Desynchronized auditory cues consistently impaired visual search below 3.3 Hz. Our results are interpreted in the context of the Dynamic Attending Theory: specifically, we suggest that a cognitive operation structures events in time irrespective of the sensory modality of input. Our results further support and specify recent neurophysiological findings by showing strong temporal selectivity for audiovisual integration in the ...
Task-dependent changes in cross-level coupling between single neurons and oscillatory activity in multiscale networks. PLoS Comput. Biol. 8:e1002809. doi: 10.1371/journal.pcbi.1002809
- PLoS Comput. Biol.
, 2012
"... Understanding the principles governing the dynamic coordination of functional brain networks remains an important unmet goal within neuroscience. How do distributed ensembles of neurons transiently coordinate their activity across a variety of spatial and temporal scales? While a complete mechanisti ..."
Abstract
-
Cited by 1 (0 self)
- Add to MetaCart
Understanding the principles governing the dynamic coordination of functional brain networks remains an important unmet goal within neuroscience. How do distributed ensembles of neurons transiently coordinate their activity across a variety of spatial and temporal scales? While a complete mechanistic account of this process remains elusive, evidence suggests that neuronal oscillations may play a key role in this process, with different rhythms influencing both local computation and long-range communication. To investigate this question, we recorded multiple single unit and local field potential (LFP) activity from microelectrode arrays implanted bilaterally in macaque motor areas. Monkeys performed a delayed center-out reach task either manually using their natural arm (Manual Control, MC) or under direct neural control through a brain-machine interface (Brain Control, BC). In accord with prior work, we found that the spiking activity of individual neurons is coupled to multiple aspects of the ongoing motor beta rhythm (10–45 Hz) during both MC and BC, with neurons exhibiting a diversity of coupling preferences. However, here we show that for identified single neurons, this beta-to-rate mapping can change in a reversible and task-dependent way. For example, as beta power increases, a given neuron may increase spiking during MC but decrease spiking during BC, or exhibit a reversible shift in the preferred phase of firing. The within-task stability of coupling, combined with the reversible cross-task changes in coupling, suggest that task-dependent changes in the beta-to-rate mapping play a role in the transient functional reorganization of neural ...
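Spike-to-beta phase coupling of the kind measured here is commonly quantified by reading out the band-limited LFP phase at each spike time and taking the circular mean: its angle is the neuron's preferred phase and its length the coupling strength. The sketch below uses synthetic data and an assumed 15-30 Hz beta band (the paper's broader 10-45 Hz range works the same way); it illustrates the general method, not the authors' specific analysis.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def spike_lfp_coupling(lfp, spike_idx, fs, lo=15.0, hi=30.0):
    """Return (preferred_phase, strength): the angle and length of the
    circular mean of the band-limited LFP phase sampled at spike times."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    phase = np.angle(hilbert(filtfilt(b, a, lfp)))
    vec = np.mean(np.exp(1j * phase[spike_idx]))
    return np.angle(vec), np.abs(vec)

# Toy check: a unit that fires at the trough of a 20 Hz beta oscillation
fs = 1000
t = np.arange(0, 5, 1 / fs)
lfp = np.cos(2 * np.pi * 20 * t)
troughs = np.flatnonzero(np.isclose(lfp, -1.0, atol=1e-3))  # trough samples
preferred, strength = spike_lfp_coupling(lfp, troughs, fs)
print(strength)        # near 1: tightly phase-coupled
print(abs(preferred))  # near pi: fires at the trough
```

A reversible, task-dependent change in the beta-to-rate mapping would appear here as a shift in `preferred` (or a sign flip of the rate modulation) between the MC and BC conditions for the same neuron.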
Differential Entrainment of Neuroelectric Delta Oscillations in Developmental Dyslexia
, 2013
"... Oscillatory entrainment to the speech signal is important for language processing, but has not yet been studied in developmental disorders of language. Developmental dyslexia, a difficulty in acquiring efficient reading skills linked to difficulties with phonology (the sound structure of language), ..."
Abstract
-
Cited by 1 (1 self)
- Add to MetaCart
(Show Context)
Oscillatory entrainment to the speech signal is important for language processing, but has not yet been studied in developmental disorders of language. Developmental dyslexia, a difficulty in acquiring efficient reading skills linked to difficulties with phonology (the sound structure of language), has been associated with behavioural entrainment deficits. It has been proposed that the phonological ‘deficit’ that characterises dyslexia across languages is related to impaired auditory entrainment to speech at lower frequencies via neuroelectric oscillations (<10 Hz, ‘temporal sampling theory’). Impaired entrainment to temporal modulations at lower frequencies would affect the recovery of the prosodic and syllabic structure of speech. Here we investigated event-related oscillatory EEG activity and the contingent negative variation (CNV) to auditory rhythmic tone streams delivered at frequencies within the delta band (2 Hz, 1.5 Hz), relevant to sampling stressed syllables in speech. Given prior behavioural entrainment findings at these rates, we predicted functionally atypical entrainment of delta oscillations in dyslexia. Participants performed a rhythmic expectancy task, detecting occasional white noise targets interspersed with tones occurring regularly at rates of 2 Hz or 1.5 Hz. Both groups showed significant entrainment of delta oscillations to the rhythmic stimulus stream; however, the strength of inter-trial delta phase coherence (ITC, ‘phase locking’) and the CNV were both significantly weaker in dyslexics, suggestive of weaker entrainment and less preparatory brain activity. Both ITC strength and CNV amplitude were significantly related to individual differences in ...
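Inter-trial phase coherence (ITC), the ‘phase locking’ measure referred to above, can be illustrated in a few lines: estimate each trial's phase at the stimulation frequency and measure how tightly those phases cluster across trials. The single-frequency Fourier projection below is a simplified stand-in for the wavelet or FFT estimators typically used with EEG; all signal parameters are invented for the demonstration.

```python
import numpy as np

def itc(trials, fs, freq):
    """Inter-trial phase coherence: project each trial onto a complex
    exponential at `freq`, keep only the phase, and return the length
    of the mean phase vector (1 = identical phase on every trial)."""
    t = np.arange(trials.shape[1]) / fs
    basis = np.exp(-2j * np.pi * freq * t)
    coeffs = trials @ basis            # one complex coefficient per trial
    phases = coeffs / np.abs(coeffs)   # unit phase vectors
    return np.abs(np.mean(phases))

# Toy check: 2 Hz phase-locked trials vs. random-phase trials
rng = np.random.default_rng(1)
fs, dur, n_trials = 100, 2.0, 50
t = np.arange(0, dur, 1 / fs)
locked = np.array([np.sin(2*np.pi*2*t) + 0.5*rng.standard_normal(t.size)
                   for _ in range(n_trials)])
jittered = np.array([np.sin(2*np.pi*2*t + rng.uniform(0, 2*np.pi))
                     for _ in range(n_trials)])
print(itc(locked, fs, 2.0))    # near 1: consistent delta phase
print(itc(jittered, fs, 2.0))  # near 0: no consistent phase
```

The group difference reported above corresponds to dyslexic participants showing reliably lower values of this statistic at 2 Hz and 1.5 Hz.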
No, There Is No 150 ms Lead of Visual Speech on Auditory Speech, but a Range of Audiovisual Asynchronies Varying from Small Audio Lead to Large Audio Lag
, 2013
"... An increasing number of neuroscience papers capitalize on the assumption published in this journal that visual speech would be typically 150 ms ahead of auditory speech. It happens that the estimation of audiovisual asynchrony in the reference paper is valid only in very specific cases, for isolated ..."
Abstract
-
Cited by 1 (0 self)
- Add to MetaCart
(Show Context)
An increasing number of neuroscience papers capitalize on the assumption published in this journal that visual speech would be typically 150 ms ahead of auditory speech. It happens that the estimation of audiovisual asynchrony in the reference paper is valid only in very specific cases: for isolated consonant-vowel syllables or at the beginning of a speech utterance, in what we call “preparatory gestures”. However, when syllables are chained in sequences, as they typically are in most parts of a natural speech utterance, asynchrony should be defined in a different way. This is what we call “comodulatory gestures”, providing auditory and visual events more or less in synchrony. We provide audiovisual data on sequences of plosive-vowel syllables (pa, ta, ka, ba, da, ga, ma, na) showing that audiovisual synchrony is actually rather precise, varying between 20 ms audio lead and 70 ms audio lag. We show how more complex speech material should result in a range typically varying between 40 ms audio lead and 200 ms audio lag, and we discuss how this natural coordination is reflected in the so-called temporal integration window for audiovisual speech perception. Finally we present a toy model of auditory and audiovisual predictive coding, showing that visual lead is actually not necessary for visual prediction.
Special issue: Review Prediction and constraint in audiovisual speech perception
"... Audiovisual speech Multisensory integration Predictive coding Predictive timing a b s t r a c t During face-to-face conversational speech listeners must efficiently process a rapid and complex stream of multisensory information. Visual speech can serve as a critical complement to auditory informati ..."
Abstract
- Add to MetaCart
Keywords: Audiovisual speech; Multisensory integration; Predictive coding; Predictive timing
During face-to-face conversational speech, listeners must efficiently process a rapid and complex stream of multisensory information. Visual speech can serve as a critical complement to auditory information because it provides cues to both the timing of the incoming acoustic signal (the amplitude envelope, influencing attention and perceptual sensitivity) and its content (place and manner of articulation, constraining lexical selection). Here we review behavioral and neurophysiological evidence regarding listeners' use of visual speech information. Multisensory integration of audiovisual speech cues improves recognition accuracy, particularly for speech in noise. Even when speech is intelligible based solely on auditory information, adding visual information may reduce the cognitive demands placed on listeners by increasing the precision of prediction. Electrophysiological studies demonstrate that oscillatory cortical entrainment to speech in auditory cortex is enhanced when visual speech is present, increasing sensitivity to important acoustic cues. Neuroimaging studies also suggest increased activity in auditory cortex when congruent visual information is available, but additionally emphasize the involvement of heteromodal regions of posterior superior temporal sulcus in integrative processing. We interpret these findings in a framework of temporally focused lexical competition in which visual speech information affects auditory processing to increase sensitivity to acoustic information through an early integration mechanism, and a late integration stage that incorporates specific information about a speaker's articulators to constrain the number of possible candidates in a spoken utterance.
Ultimately it is words compatible with both auditory and visual information that most strongly determine successful speech perception during everyday listening. Thus, audiovisual speech perception is accomplished through multiple stages of integration, supported by distinct neuroanatomical mechanisms.
Reviewed by:
, 2010
"... An object moving towards an observer is subjectively perceived as longer in duration than the same object that is static or moving away. This “time dilation effect ” has been shown for a number of stimuli that differ from standard events along different feature dimensions (e.g. color, size, and dyna ..."
Abstract
- Add to MetaCart
(Show Context)
An object moving towards an observer is subjectively perceived as longer in duration than the same object that is static or moving away. This “time dilation effect” has been shown for a number of stimuli that differ from standard events along different feature dimensions (e.g. color, size, and dynamics). We performed an event-related functional magnetic resonance imaging (fMRI) study while subjects viewed a stream of five visual events, all of which were static and of identical duration except the fourth one, which was a deviant target consisting of either a looming or a receding disc. The duration of the target was systematically varied and participants judged whether the target was shorter or longer than all other events. A time dilation effect was observed only for looming targets. Relative to the static standards, the looming as well as the receding targets induced increased activation of the anterior insula and anterior cingulate cortices (the “core control network”). The decisive contrast between looming and receding targets representing the time dilation effect showed strong asymmetric activation and, specifically, activation of cortical midline structures (the “default network”). These results provide the first evidence that the illusion of temporal dilation is due to activation of areas that are important for cognitive control and subjective awareness. The involvement of midline ...
Reviewed by: Peter Lakatos, Hungarian Academy of
, 2011
"... Linking speech perception and neurophysiology: speech decoding guided by cascaded oscillators locked to the input rhythm ..."
Abstract
- Add to MetaCart
Linking speech perception and neurophysiology: speech decoding guided by cascaded oscillators locked to the input rhythm