The inductive learning of phonotactic patterns. (2007)

by Jeffrey Heinz

Results 1 - 10 of 24

Aural Pattern Recognition Experiments and the Subregular Hierarchy

by James Rogers, Geoffrey K. Pullum
"... ..."
Abstract - Cited by 26 (4 self) - Add to MetaCart
Abstract not found
(Show Context)

Citation Context

...gnition. There is good reason to suspect that distinctions between human capabilities in this realm and the capabilities of our evolutionary cousins may occur within this range of complexities. Heinz [10], for example, has shown that a wide range of stress patterns in human languages fall strictly within the Regular stringsets. We believe that experiments directed at distinguishing capabilities with r...

On languages piecewise testable in the strict sense

by James Rogers, Jeffrey Heinz, Gil Bailey, Matt Edlefsen, Molly Visscher, David Wellcome, Sean Wibel - In Proceedings of the 11th Meeting of the Association for Mathematics of Language, 2009
"... Abstract. In this paper we explore the class of Strictly Piecewise languages, originally introduced to characterize long-distance phonotactic patterns by Heinz [1] as the Precedence Languages. We provide a series of equivalent abstract characterizations, discuss their basic properties, locate them r ..."
Abstract - Cited by 18 (12 self) - Add to MetaCart
Abstract. In this paper we explore the class of Strictly Piecewise languages, originally introduced to characterize long-distance phonotactic patterns by Heinz [1] as the Precedence Languages. We provide a series of equivalent abstract characterizations, discuss their basic properties, locate them relative to other well-known subregular classes and provide algorithms for translating between the grammars defined here and finite state automata as well as an algorithm for deciding whether a regular language is Strictly Piecewise. 1

Citation Context

...gnitive Science, University of Delaware Abstract. In this paper we explore the class of Strictly Piecewise languages, originally introduced to characterize long-distance phonotactic patterns by Heinz [1] as the Precedence Languages. We provide a series of equivalent abstract characterizations, discuss their basic properties, locate them relative to other well-known subregular classes and provide algo...
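The abstract above characterizes the Strictly Piecewise (Precedence) languages in terms of the subsequences a string is allowed to contain. As a rough illustration of that idea (not of the paper's own characterizations or decision algorithm), the following Python sketch checks words against a hypothetical set of forbidden length-2 subsequences; the 's'-before-'S' constraint is an invented stand-in for a long-distance phonotactic restriction.

```python
from itertools import combinations

def subsequences_k2(word):
    """All length-2 subsequences (precedence pairs) occurring in a word."""
    return {(a, b) for a, b in combinations(word, 2)}

def sp2_accepts(word, forbidden_pairs):
    """A word is accepted by a Strictly Piecewise (SP-2) grammar iff it
    contains none of the forbidden precedence pairs."""
    return subsequences_k2(word).isdisjoint(forbidden_pairs)

# Hypothetical long-distance constraint: no 's' may precede 'S' anywhere
# in the word (a toy stand-in for sibilant harmony, not data from the paper).
FORBIDDEN = {("s", "S")}

print(sp2_accepts("sotas", FORBIDDEN))  # True: no 's'...'S' subsequence
print(sp2_accepts("sotaS", FORBIDDEN))  # False: 's' precedes 'S'
```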

Improving Word Segmentation by Simultaneously Learning Phonotactics

by Daniel Blanchard, Jeffrey Heinz - CoNLL, 2008
"... The most accurate unsupervised word segmentation systems that are currently available (Brent, 1999; Venkataraman, 2001; Goldwater, 2007) use a simple unigram model of phonotactics. While this simplifies some of the calculations, it overlooks cues that infant language acquisition researchers have sho ..."
Abstract - Cited by 9 (3 self) - Add to MetaCart
The most accurate unsupervised word segmentation systems that are currently available (Brent, 1999; Venkataraman, 2001; Goldwater, 2007) use a simple unigram model of phonotactics. While this simplifies some of the calculations, it overlooks cues that infant language acquisition researchers have shown to be useful for segmentation (Mattys et al., 1999; Mattys and Jusczyk, 2001). Here we explore the utility of using bigram and trigram phonotactic models by enhancing Brent’s (1999) MBDP-1 algorithm. The results show the improved MBDP-Phon model outperforms other unsupervised word segmentation systems (e.g., Brent, 1999; Venkataraman, 2001; Goldwater, 2007).

Citation Context

...e reader to Goldwater (2007) for details. 4 In our experiments and those in Goldwater (2007), the segmenter runs through the corpus 1000 times before outputting the final segmentation. humbert, 1997; Heinz, 2007; Hayes and Wilson, 2008). While Hayes and Wilson present a more complex Maximum Entropy phonotactic model in their paper than the one we add to MBDP-1, they also evaluate a simple n-gram phonotactic ...
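The abstract above describes replacing the unigram phonotactic model in Brent's (1999) MBDP-1 segmenter with bigram and trigram models. The sketch below illustrates only the generic bigram idea, scoring a candidate word by the product of phone-to-phone transition probabilities with word-boundary symbols; the boundary symbol, floor probability, and toy data are illustrative assumptions, and this is not the MBDP-Phon model itself.

```python
from collections import defaultdict

BOUNDARY = "#"  # word-boundary symbol (illustrative choice)

def train_bigram(words):
    """Estimate P(next phone | previous phone) from a list of segmented words."""
    counts = defaultdict(lambda: defaultdict(int))
    for w in words:
        phones = [BOUNDARY] + list(w) + [BOUNDARY]
        for prev, nxt in zip(phones, phones[1:]):
            counts[prev][nxt] += 1
    return {prev: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
            for prev, nxts in counts.items()}

def word_score(word, model, floor=1e-6):
    """Product of bigram probabilities for a candidate word; unseen
    transitions receive a small floor probability."""
    phones = [BOUNDARY] + list(word) + [BOUNDARY]
    score = 1.0
    for prev, nxt in zip(phones, phones[1:]):
        score *= model.get(prev, {}).get(nxt, floor)
    return score

model = train_bigram(["kat", "kit", "tak"])
print(word_score("kat", model) > word_score("tka", model))  # True
```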

On the role of locality in learning stress patterns

by Jeffrey Heinz - Phonology, 2009
"... This paper presents a previously unnoticed universal property of stress patterns in the world's languages : they are, for small neighbourhoods, neighbourhooddistinct. Neighbourhood-distinctness is a locality condition defined in automatatheoretic terms. This universal is established by examini ..."
Abstract - Cited by 8 (4 self) - Add to MetaCart
This paper presents a previously unnoticed universal property of stress patterns in the world's languages: they are, for small neighbourhoods, neighbourhood-distinct. Neighbourhood-distinctness is a locality condition defined in automata-theoretic terms. This universal is established by examining stress patterns contained in two typological studies. Strikingly, many logically possible - but unattested - patterns do not have this property. Not only does neighbourhood-distinctness unite the attested patterns in a non-trivial way, it also naturally provides an inductive principle allowing learners to generalise from limited data. A learning algorithm is presented which generalises by failing to distinguish same-neighbourhood environments perceived in the learner's linguistic input - hence learning neighbourhood-distinct patterns - as well as almost every stress pattern in the typology. In this way, this work lends support to the idea that properties of the learner can explain certain properties of the attested typology, an idea not straightforwardly available in optimality-theoretic and Principle and Parameter frameworks.

Citation Context

...ge is called the language’s tail canonical acceptor and typically regular patterns are represented with this acceptor. However, another algebraically equivalent choice is the head canonical acceptor (Heinz 2007). The head canonical acceptor is the smallest reverse deterministic acceptor recognising some pattern. An acceptor is reverse deterministic if and only if there is at most one final state, and for ev...
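The citation context defines reverse determinism directly: at most one final state, and for every state at most one incoming transition per symbol. The snippet below simply operationalizes that definition for an automaton given as a list of transitions; the (source, symbol, target) tuple representation is an assumption made for illustration, and this is not Heinz's construction of the head canonical acceptor itself.

```python
def is_reverse_deterministic(transitions, final_states):
    """transitions: iterable of (source_state, symbol, target_state) tuples.
    Reverse deterministic iff there is at most one final state and no state
    has two incoming transitions labelled with the same symbol."""
    if len(set(final_states)) > 1:
        return False
    seen = set()  # (target_state, symbol) pairs already claimed by an incoming edge
    for _src, sym, tgt in transitions:
        if (tgt, sym) in seen:
            return False
        seen.add((tgt, sym))
    return True

# Two 'a'-transitions into state 2 violate reverse determinism.
print(is_reverse_deterministic([(0, "a", 1), (1, "b", 2)], {2}))  # True
print(is_reverse_deterministic([(0, "a", 2), (1, "a", 2)], {2}))  # False
```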

The Third Factor in Phonology

by Bridget Samuels - Biolinguistics, 2009
"... This article attempts to investigate how much of phonology can be explained by properties of general cognition and the Sensorimotor system — in other words, third-factor principles, in support of the evolutionary scenario posed by Hauser et al. (2002a). It argues against Pinker & Jackendoff’s (2 ..."
Abstract - Cited by 7 (0 self) - Add to MetaCart
This article attempts to investigate how much of phonology can be explained by properties of general cognition and the Sensorimotor system — in other words, third-factor principles, in support of the evolutionary scenario posed by Hauser et al. (2002a). It argues against Pinker & Jackendoff’s (2005: 212) claim that “major characteristics of phonology are specific to language (or to language & music), [and] uniquely human,” and their conclusion that “phonology represents a major counterexample to the recursion-only hypothesis.” Contrary to the statements by Anderson (2004) and Yip (2006a, 2006b) to the effect that phonology has not been tested in animals, it is shown that virtually all the abilities that underlie phonological competence have been shown in other species.

Deciding strictly local (SL) languages

by Matt Edlefsen, Dylan Leeman, Nathan Myers, Nathaniel Smith, Molly Visscher, David Wellcome - In J. Breitenbucher (Ed.), Proceedings of the , 2009
"... We have developed an efficient algorithm for determining if a Finite State Automaton de-scribes a Strictly Local (SL) stringset, the sim-plest class of the Sub-Regular Hierarchy, and to determine if that stringset is a subclass of SL that is learnable by an Inductive Inference Machine. We have used ..."
Abstract - Cited by 6 (0 self) - Add to MetaCart
We have developed an efficient algorithm for determining if a Finite State Automaton de-scribes a Strictly Local (SL) stringset, the sim-plest class of the Sub-Regular Hierarchy, and to determine if that stringset is a subclass of SL that is learnable by an Inductive Inference Machine. We have used this to categorize the phonotactic patterns in a catalog including es-sentially all of the currently attested patterns occurring in natural languages, most of which turn out to be learnable stringsets in this sim-plest class. 1

Citation Context

...and figure out how to combine words and phrases (syntax)? Consider lexicon learning. It cannot be the case that humans master lexicons purely through memorization. As Morris Halle (as quoted in Heinz [4]) points out, English speakers can correctly recognize the English words in the li...
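The paper above decides whether a finite-state automaton describes a Strictly Local stringset; that decision procedure is not reproduced here. As a smaller illustration of what membership in a Strictly Local (SL-2) stringset amounts to, the sketch below accepts a word only if every adjacent pair of symbols, with word-boundary markers added, belongs to a permitted set of 2-factors; the grammar shown is invented for the example, not one of the attested patterns in the paper's catalog.

```python
BOUNDARY = "#"

def sl2_accepts(word, permitted_bigrams):
    """Strictly Local (SL-2) membership: every adjacent pair of symbols,
    including the pairs formed with the word boundaries, must be permitted."""
    padded = BOUNDARY + word + BOUNDARY
    return all(padded[i:i + 2] in permitted_bigrams
               for i in range(len(padded) - 1))

# Hypothetical SL-2 grammar allowing only strictly alternating 'baba...' words.
G = {"#b", "ba", "ab", "a#"}
print(sl2_accepts("baba", G))  # True
print(sl2_accepts("bba", G))   # False: "bb" is not a permitted 2-factor
```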

Three Correlates of the Typological Frequency of Quantity-Insensitive Stress Systems

by Max Bane, Jason Riggle
"... We examine the typology of quantityinsensitive (QI) stress systems and ask to what extent an existing optimality theoretic model of QI stress can predict the observed typological frequencies of stress patterns. We find three significant correlates of pattern attestation and frequency: the trigram en ..."
Abstract - Cited by 6 (1 self) - Add to MetaCart
We examine the typology of quantityinsensitive (QI) stress systems and ask to what extent an existing optimality theoretic model of QI stress can predict the observed typological frequencies of stress patterns. We find three significant correlates of pattern attestation and frequency: the trigram entropy of a pattern, the degree to which it is “confusable” with other patterns predicted by the model, and the number of constraint rankings that specify the pattern. 1

Cognitive and Sub-Regular Complexity

by James Rogers, Jeffrey Heinz, Margaret Fero, Jeremy Hurst, Sean Wibel
"... Abstract. We present a measure of cognitive complexity for subclasses of the regular languages that is based on model-theoretic complexity rather than on description length of particular classes of grammars or automata. Unlikedescriptionlengthapproaches,thiscomplexitymeasure is independent of the im ..."
Abstract - Cited by 3 (2 self) - Add to MetaCart
Abstract. We present a measure of cognitive complexity for subclasses of the regular languages that is based on model-theoretic complexity rather than on description length of particular classes of grammars or automata. Unlikedescriptionlengthapproaches,thiscomplexitymeasure is independent of the implementation details of the cognitive mechanism. Hence, it provides a basis for making inferences about cognitive mechanisms that are valid regardless of how those mechanisms are actually realized. 1

Citation Context

...to illustrate the complexity hierarchy is because the cross-linguistic typology of stress patterns has been well-studied [8,9] and because finite-state representations of these patterns already exist [10,11]. In the next section, we explain why approaches based on minimum description length fail to provide an adequate notion of cognitive complexity in these domains. We then (Section 3) develop a model-th...

Learning left-to-right and right-to-left iterative languages

by Jeffrey Heinz
"... Abstract. The left-to-right and right-to-left iterative languages are previously unnoticed subclasses of the regular languages of infinite size that are identifiable in the limit from positive data. Essentially, these language classes are the ones obtained by merging final states in a prefix tree an ..."
Abstract - Cited by 3 (3 self) - Add to MetaCart
Abstract. The left-to-right and right-to-left iterative languages are previously unnoticed subclasses of the regular languages of infinite size that are identifiable in the limit from positive data. Essentially, these language classes are the ones obtained by merging final states in a prefix tree and initial states in a suffix tree of the observed sample, respectively. Strikingly, these classes are also transparently related to the zero-reversible languages because some algorithms that learn them differ minimally from the ZR algorithm given in Angluin (1982). Second, they are part of the answer to the challenge provided by Muggleton (1990), who proposed mapping the space of language classes obtainable by a general statemerging algorithm IM1. Third, these classes are relevant to a hypothesis of how children can acquire sound patterns of their language—in particular, the hypothesis that all phonotactic patterns found in natural language are neighborhood-distinct (Heinz 2007). 1

Citation Context

... and RLI languages are relevant to phonotactic patterns. When hundreds of phonotactic patterns are collected and placed in the Chomsky Hierarchy, they all fall into the class of the regular languages [21]. More specifically, they are Noncounting, a class known not to be identifiable in the limit from positive data [26, 22]. [21] hypothesizes all phonotactic patterns belong to a smaller class called neighb...
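The abstract describes the left-to-right iterative languages as those obtained by merging the final states of a prefix tree built from the observed sample. The sketch below builds a prefix-tree acceptor from positive data and collapses all of its final states into one, which is the gist of that construction; the dictionary-based automaton representation and the merged-state name are illustrative assumptions rather than the algorithm exactly as given in the paper.

```python
def prefix_tree(sample):
    """Build a prefix-tree acceptor whose states are prefixes of the sample words."""
    states, finals, delta = {""}, set(), {}
    for word in sample:
        for i, sym in enumerate(word):
            src, tgt = word[:i], word[:i + 1]
            states.add(tgt)
            delta[(src, sym)] = tgt
        finals.add(word)
    return states, finals, delta

def merge_final_states(states, finals, delta, merged="FINAL"):
    """Collapse every final state into a single state; transitions into or out
    of a final state are redirected, which is how the learner generalises."""
    rename = lambda q: merged if q in finals else q
    new_delta = {}
    for (src, sym), tgt in delta.items():
        new_delta.setdefault((rename(src), sym), set()).add(rename(tgt))
    return {rename(q) for q in states}, {merged}, new_delta

# From the sample {"ab", "abab"} the merged acceptor recognises (ab)+.
states, finals, delta = prefix_tree(["ab", "abab"])
print(merge_final_states(states, finals, delta))
```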

Learning and learnability in phonology

by Adam Albright, Bruce Hayes
"... A central scientific problem in phonology is how children rapidly and accurately acquire the intricate structures and patterns seen in the phonology of their native language. The solution to this problem lies in part in the discovery of the right formal theory of phonology, but another crucial eleme ..."
Abstract - Cited by 2 (0 self) - Add to MetaCart
A central scientific problem in phonology is how children rapidly and accurately acquire the intricate structures and patterns seen in the phonology of their native language. The solution to this problem lies in part in the discovery of the right formal theory of phonology, but another crucial element is the development of theories of learning, often in the form of machine-implemented models that attempt to mimic human children’s ability. This chapter is a survey of work in this area.