Effects of attention on the strength of lexical influences on speech perception: Behavioral experiments and computational mechanisms. (2008)

by D Mirman, J L McClelland, L L Holt, J S Magnuson
Venue: Cognitive Science

Results 1 - 7 of 7

Speech perception as categorization

by Lori L. Holt, Andrew J. Lotto
"... Speech perception (SP) most commonly refers to the perceptual mapping from the highly variable acoustic speech signal to a linguistic representation, whether it be phonemes, diphones, syllables, or words. This is an example of categorization in that potentially discriminable speech sounds are assign ..."
Abstract - Cited by 10 (1 self)
Speech perception (SP) most commonly refers to the perceptual mapping from the highly variable acoustic speech signal to a linguistic representation, whether it be phonemes, diphones, syllables, or words. This is an example of categorization in that potentially discriminable speech sounds are assigned to functionally equivalent classes. In this tutorial, we present some of the main challenges to our understanding of the categorization of speech sounds and the conceptualization of SP that has resulted from these challenges. We focus here on issues and experiments that define open research questions relevant to phoneme categorization, arguing that SP is best understood as perceptual categorization, a position that places SP in direct contact with research from other areas of perception and cognition.

Mechanisms of Semantic Ambiguity Resolution: Insights from Speech Perception

by Daniel Mirman, 2008
"... The speech signal is inherently ambiguous and all computational and behavioral research on speech perception has implicitly or explicitly investigated the mechanism of resolution of this ambiguity. It is clear that context and prior proba-bility (i.e., frequency) play central roles in resolving ambi ..."
Abstract - Cited by 1 (0 self)
The speech signal is inherently ambiguous and all computational and behavioral research on speech perception has implicitly or explicitly investigated the mechanism of resolution of this ambiguity. It is clear that context and prior probability (i.e., frequency) play central roles in resolving ambiguities between possible speech sounds and spoken words (speech perception) as well as between meanings and senses of a word (semantic ambiguity resolution). However, the mechanisms of these effects are still under debate. Recent advances in understanding context and frequency effects in speech perception suggest promising approaches to investigating semantic ambiguity resolution. This review begins by motivating the use of insights from speech perception to understand the mechanisms of semantic ambiguity resolution. Key to this motivation is the description of the structural similarity between the two domains with a focus on two parallel sets of findings: context strength effects, and an attractor dynamics account for the contrasting patterns of inhibition and facilitation due to ambiguity. The main part of the review then discusses three recent, influential sets of findings in speech perception, which suggest that (1) top-down contextual and bottom-up perceptual information interact to mutually constrain processing of ambiguities, (2) word frequency influences on-line access, rather than response biases or resting levels, and (3) interactive integration of top-down and bottom-up information is optimal given the noisy, yet highly constrained nature of real-world communication, despite the possible consequence of illusory perceptions. These findings and the empirical methods behind them provide auspicious future directions for the study of semantic ambiguity resolution.
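To make the mechanistic contrast in point (2) concrete, the sketch below shows two illustrative ways a word's frequency could enter a simple activation update: as a higher resting level versus as a gain on on-line evidence integration. The update rule, function name, and all parameter values are assumptions for exposition, not the models discussed in the review.

```python
# Illustrative contrast between two hypothetical frequency mechanisms in a
# single-unit, leaky-integration update; the rule and all parameter values are
# assumptions for exposition, not the reviewed models.

def activation_over_time(input_strength, frequency, mechanism, steps=10):
    """Leaky integration of perceptual evidence for one word unit."""
    if mechanism == "resting_level":
        act = 0.1 * frequency       # frequent words start closer to threshold
        gain = 1.0
    elif mechanism == "online_access":
        act = 0.0
        gain = 1.0 + frequency      # frequent words integrate evidence faster
    else:
        raise ValueError(mechanism)
    trajectory = []
    for _ in range(steps):
        act += 0.2 * (gain * input_strength - act)  # move toward scaled input
        trajectory.append(round(act, 3))
    return trajectory

# A frequent word under each account: the trajectories differ in how the
# frequency advantage unfolds over processing time.
print(activation_over_time(0.5, frequency=1.0, mechanism="resting_level"))
print(activation_over_time(0.5, frequency=1.0, mechanism="online_access"))
```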

Effect of Global Context on Homophone Ambiguity Resolution

by Daniel Mirman, James S. Magnuson, Ted J. Strauss, James A. Dixon
"... Effects of context are pervasive throughout perceptual and cognitive processing domains. Many studies have shown context effects in language processing, but these studies have mostly focused on local, linguistic contexts. As a step toward situating language processing in the broad scope of cognitive ..."
Abstract - Cited by 1 (0 self)
Effects of context are pervasive throughout perceptual and cognitive processing domains. Many studies have shown context effects in language processing, but these studies have mostly focused on local, linguistic contexts. As a step toward situating language processing in the broad scope of cognitive processing, we investigated the effect of a global, non-linguistic context on homophone ambiguity resolution. The context was implicitly induced by using only highly imageable target words. This context was predicted to shift attention away from non-imageable meanings, thus reducing activation of non-imageable meanings and consequently reducing ambiguity between meanings. We tracked eye movements as subjects heard spoken words and selected a matching picture from four displayed items. Results were consistent with the predictions: response times were faster for homophones with only one contextually appropriate meaning than for homophones with two contextually appropriate meanings (reflecting reduced ambiguity), and participants were less likely to fixate semantic associates of contextually inappropriate meanings than contextually appropriate meanings (reflecting reduced activation of non-imageable meanings). These results suggest that global, non-linguistic contexts influence language processing by shifting attention away from contextually inappropriate meanings.

Citation Context

...context can influence language processing is through attention. Previous work on attention and language processing suggested that attention works by damping activation of dis-attended representations (Mirman, McClelland, Holt, & Magnuson, 2008). Following this view of attention, we predict that global context will damp activation of the contextually inappropriate meanings of ambiguous words. To test this prediction we used the visual world...
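The damping idea described in this passage can be illustrated with a minimal sketch: treat attention to a representational level as a gain in [0, 1] that scales that level's contribution, so dis-attended representations contribute weaker activation. The function, parameter names, and values below are illustrative assumptions, not the implementation reported in Mirman, McClelland, Holt, and Magnuson (2008).

```python
# A minimal sketch of "attention as damping": attention to the lexical level is
# a gain in [0, 1] that scales lexical feedback onto a phoneme unit. The setup
# and numbers are illustrative assumptions, not the model in the cited paper.

def phoneme_activation(bottom_up, lexical_feedback, lexical_attention):
    """Combine acoustic evidence with lexical feedback scaled by attention."""
    return bottom_up + lexical_attention * lexical_feedback

# Attention directed to the lexical level: feedback boosts the phoneme unit.
print(phoneme_activation(bottom_up=0.5, lexical_feedback=0.3, lexical_attention=1.0))   # 0.8
# Attention shifted away: the same feedback is damped.
print(phoneme_activation(bottom_up=0.5, lexical_feedback=0.3, lexical_attention=0.25))  # 0.575
```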

Representations for Phonotactic Learning in Infancy (Language Learning and Development)

by Kyle E. Chambers, Kristine H. Onishi, Cynthia Fisher
"... ..."
Abstract - Add to MetaCart
Abstract not found

Citation Context

...les were words. In subsequent experiments, the percentages were 42% (Experiment 2: 16.5 mos), 45% (Experiment 2: 10.5 mos), 26% (Experiment 3), and 22% (Experiment 4). Since each test trial could contain both words and nonwords, we were unable to test for effects of lexical status. However, in previous experiments on adult phonotactic learning in perception (Chambers et al., 2010; Onishi et al., 2002), lexical status has not been found to influence phonotactic learning; this suggests that when the task does not require lexical access, lexical status does not strongly govern performance (e.g., Mirman, McClelland, Holt, & Magnuson, 2008). ... during test. Across infants, each sublist, and therefore each syllable, appeared in every part of the design (i.e., familiarization, legal test, illegal test). The two test sublists were used to create eight test strings, four legal and four illegal, each comprising four unique syllables. Across the four syllables in each test string, all eight consonants and both vowels were presented, eliminating segment differences between the legal and illegal test trials. Syllable...

Integration of Pragmatic and Phonetic Cues in Spoken Word Recognition

by Hannah Rohde, Marc Ettlinger
"... Goldrick for helpful discussion and research assistant Ronen Bay for his help during data collection. We also thank Randi Martin, Bob McMurray, and three anonymous reviewers for many helpful suggestions that led to substantial improvements over earlier drafts. Some portions of this work were first p ..."
Abstract
Goldrick for helpful discussion and research assistant Ronen Bay for his help during data collection. We also thank Randi Martin, Bob McMurray, and three anonymous reviewers for many helpful suggestions that led to substantial improvements over earlier drafts. Some portions of this work were first presented in Rohde and Ettlinger (2010).

Edinburgh Research Explorer

by unknown authors
"... Integration of pragmatic and phonetic cues in spoken word recognition Citation for published version: Rohde, H & Ettlinger, M 2012, 'Integration of pragmatic and phonetic cues in spoken word recognition' ..."
Abstract
Integration of pragmatic and phonetic cues in spoken word recognition. Citation for published version: Rohde, H. & Ettlinger, M. 2012, 'Integration of pragmatic and phonetic cues in spoken word recognition'

Interactive Activation and Mutual Constraint Satisfaction in Perception and Cognition

by James L. McClelland, Daniel Mirman, Donald J. Bolger, Pranav Khaitan, 2013
"... In a seminal 1977 article, Rumelhart argued that perception required the simultaneous use of multiple sources of information, allowing perceivers to optimally interpret sensory information at many levels of representation in real time as information arrives. Building on Rumelhart’s argu-ments, we pr ..."
Abstract
In a seminal 1977 article, Rumelhart argued that perception required the simultaneous use of multiple sources of information, allowing perceivers to optimally interpret sensory information at many levels of representation in real time as information arrives. Building on Rumelhart's arguments, we present the Interactive Activation hypothesis—the idea that the mechanism used in perception and comprehension to achieve these feats exploits an interactive activation process implemented through the bidirectional propagation of activation among simple processing units. We then examine the interactive activation model of letter and word perception and the TRACE model of speech perception, as early attempts to explore this hypothesis, and review the experimental evidence relevant to their assumptions and predictions. We consider how well these models address the computational challenge posed by the problem of perception, and we consider how consistent they are with evidence from behavioral experiments. We examine empirical and theoretical controversies surrounding the idea of interactive processing, including a controversy that swirls around the relationship between interactive computation and optimal Bayesian inference. Some of the implementation details of early versions of interactive activation models caused devi-
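As a rough illustration of the interactive activation idea summarized above (bidirectional propagation of activation among simple processing units), the sketch below runs a toy two-layer network in which phoneme units excite consistent word units, word units feed activation back to their phonemes, and words compete through lateral inhibition. The layer sizes, the word-phoneme mapping, and all parameters are arbitrary assumptions, not the published interactive activation or TRACE models.

```python
# Toy interactive-activation network: phoneme and word units exchange activation
# bidirectionally over discrete time steps, with lateral inhibition among words.
# Layer sizes, the word-phoneme mapping, and all parameters are illustrative
# assumptions, not the published interactive activation or TRACE models.
import numpy as np

n_phonemes, n_words = 6, 3
rng = np.random.default_rng(0)

# Hypothetical binary mapping: entry (j, i) = 1 if word j contains phoneme i.
W = rng.integers(0, 2, size=(n_words, n_phonemes)).astype(float)

phoneme_act = np.zeros(n_phonemes)
word_act = np.zeros(n_words)
evidence = np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])  # bottom-up acoustic input

excite, inhibit, decay = 0.2, 0.1, 0.1
for _ in range(50):
    # Bottom-up excitation: active phonemes support consistent words.
    word_net = excite * W @ phoneme_act
    # Lateral inhibition: each word is suppressed by its competitors.
    word_net -= inhibit * (word_act.sum() - word_act)
    # Top-down feedback: active words re-excite their own phonemes.
    phoneme_net = evidence + excite * W.T @ word_act

    word_act = np.clip(word_act + word_net - decay * word_act, 0.0, 1.0)
    phoneme_act = np.clip(phoneme_act + phoneme_net - decay * phoneme_act, 0.0, 1.0)

print("word activations:", np.round(word_act, 2))
```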

Citation Context

...k of trials in a perceptual experiment is relatively high. Specifically, if the non-word proportion is high, the speed advantage for recognition of phonemes in words compared to non-words is reduced (Mirman, McClelland, Holt, & Magnuson, 2008), the word bias in speech errors is reduced (Hartsuiker, Corley, & Martensen, 2005), the short-term memory advantage for words over non-words is reduced (Jefferies, Frankish, & Ralph, 2006), and ther...
