Results 1-10 of 39
Harmonic grammar with linear programming: From linear ...
, 2009
"... Harmonic Grammar (HG) is a model of linguistic constraint interaction in which wellformedness is calculated as the sum of weighted constraint violations. We show how linear programming algorithms can be used to determine whether there is a weighting for a set of constraints that fits a set of ling ..."
Abstract

Cited by 40 (9 self)
 Add to MetaCart
(Show Context)
Harmonic Grammar (HG) is a model of linguistic constraint interaction in which well-formedness is calculated as the sum of weighted constraint violations. We show how linear programming algorithms can be used to determine whether there is a weighting for a set of constraints that fits a set of linguistic data. The associated software package OT-Help provides a practical tool for studying large and complex linguistic systems in the HG framework and comparing the results with those of OT. We first describe the translation from Harmonic Grammars to systems solvable by linear programming algorithms. We then develop an HG analysis of ATR harmony in Lango that is, we argue, superior to the existing OT and rule-based treatments. We further highlight the usefulness of OT-Help, and the analytic power of HG, with a set of studies of the predictions HG makes for phonological typology.
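The weighted-sum evaluation this abstract describes can be sketched in a few lines. This is a toy illustration, not the OT-Help implementation; the candidate names, weights, and violation counts are invented:

```python
def harmony(violations, weights):
    """Harmony of one candidate: negative sum of weighted violation counts."""
    return -sum(w * v for w, v in zip(weights, violations))

def winner(candidates, weights):
    """Return the candidate whose violation profile maximizes harmony."""
    return max(candidates, key=lambda name: harmony(candidates[name], weights))

# Hypothetical two-constraint tableau with illustrative violation counts.
weights = [3.0, 1.0]                 # e.g. a markedness and a faithfulness weight
candidates = {
    "cand_a": [0, 2],                # harmony = -(3*0 + 1*2) = -2
    "cand_b": [1, 0],                # harmony = -(3*1 + 1*0) = -3
}
print(winner(candidates, weights))   # prints "cand_a"
```

Finding weights that make a designated winner beat every rival in every tableau is a system of linear inequalities, which is exactly the feasibility question a linear-programming solver answers.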
Weighted Constraints in Generative Linguistics
 Cognitive Science
, 2009
"... Harmonic Grammar (HG) and Optimality Theory (OT) are closely related formal frameworks for the study of language. In both, the structure of a given language is determined by the relative strengths of a set of constraints. They differ in how these strengths are represented: as numerical weights (HG) ..."
Abstract

Cited by 21 (3 self)
 Add to MetaCart
(Show Context)
Harmonic Grammar (HG) and Optimality Theory (OT) are closely related formal frameworks for the study of language. In both, the structure of a given language is determined by the relative strengths of a set of constraints. They differ in how these strengths are represented: as numerical weights (HG) or as ranks (OT). Weighted constraints have advantages for the construction of accounts of language learning and other cognitive processes, partly because they allow for the adaptation of connectionist and statistical models. HG has been little studied in generative linguistics, however, largely due to influential claims that weighted constraints make incorrect predictions about the typology of natural languages, predictions that are not shared by the more popular OT. This paper makes the case that HG is in fact a promising framework for typological research, and reviews and extends the existing arguments for weighted over ranked constraints.
Natural and Unnatural Constraints in Hungarian Vowel Harmony
 TO APPEAR IN LANGUAGE
, 2009
"... Phonological constraints can, in principle, be classified according to whether they are natural (founded in principles of Universal Grammar (UG)) or unnatural (arbitrary, learned inductively from the language data). Recent work has used this distinction as the basis for arguments about the role of ..."
Abstract

Cited by 18 (1 self)
 Add to MetaCart
Phonological constraints can, in principle, be classified according to whether they are natural (founded in principles of Universal Grammar (UG)) or unnatural (arbitrary, learned inductively from the language data). Recent work has used this distinction as the basis for arguments about the role of UG in learning. Some languages have phonological patterns that arguably reflect unnatural constraints. With experimental testing, one can assess whether such patterns are actually learned by native speakers. Becker, Ketrez, and Nevins (2007), testing speakers of Turkish, suggest that they do indeed go unlearned. They interpret this result with a strong UG position: humans are unable to learn data patterns not backed by UG principles. This article pursues the same research line, locating similarly unnatural data patterns in the vowel harmony system of Hungarian, such as the tendency (among certain stem types) for a final bilabial stop to favor front harmony. Our own test leads to the opposite conclusion to Becker et al.: Hungarians evidently do learn the unnatural patterns. To conclude we consider a bias account—that speakers are able to learn unnatural environments, but devalue them relative to natural ones. We outline a method for testing the strength of constraints as learned by speakers against the strength of the corresponding patterns in the lexicon, and show that it offers tentative support for the hypothesis that unnatural constraints are disfavored by language learners.
Serial Harmonic Grammar and Berber Syllabification
"... version of OT in which the representation is changed ..."
Abstract

Cited by 10 (2 self)
 Add to MetaCart
(Show Context)
version of OT in which the representation is changed
Linking speech errors and phonological grammars: Insights from Harmonic Grammar networks
 Phonology
, 2009
"... Phonological grammars characterize distinctions between relatively wellformed (unmarked) and relatively illformed (marked) phonological structures. We review evidence that markedness influences speech error probabilities. Specifically, although errors result in both unmarked as well as marked stru ..."
Abstract

Cited by 6 (2 self)
 Add to MetaCart
(Show Context)
Phonological grammars characterize distinctions between relatively well-formed (unmarked) and relatively ill-formed (marked) phonological structures. We review evidence that markedness influences speech error probabilities. Specifically, although errors result in both unmarked and marked structures, there is a markedness asymmetry: errors are more likely to produce unmarked outcomes. We show that stochastic disruption to the computational mechanisms realizing a Harmonic Grammar (HG) can account for the broad empirical patterns of speech errors. We demonstrate that our proposal can account for the general markedness asymmetry. We also develop methods for linking particular HG proposals to speech error distributions, and illustrate these methods using a simple HG and a set of initial consonant errors in English.
Some correct error-driven versions of the constraint demotion algorithm
, 2008
"... Abstract. This paper shows that ErrorDriven Constraint Demotion (EDCD), an errordriven learning algorithm proposed by Tesar (1995) for Prince and Smolensky’s (1993) version of Optimality Theory, can fail to converge to a totally ranked hierarchy of constraints, unlike the earlier nonerrordriven l ..."
Abstract

Cited by 6 (1 self)
 Add to MetaCart
Abstract. This paper shows that Error-Driven Constraint Demotion (EDCD), an error-driven learning algorithm proposed by Tesar (1995) for Prince and Smolensky's (1993) version of Optimality Theory, can fail to converge to a totally ranked hierarchy of constraints, unlike the earlier non-error-driven learning algorithms proposed by Tesar and Smolensky (1993). The cause of the problem is found in Tesar's use of "mark-pooling ties", indicating that EDCD can be repaired by assuming Anttila's (1997) "permuting ties" instead. Simulations show that totally ranked hierarchies can indeed be found by ...
Variable affix order: grammar and learning
 Lg
, 2010
"... While affix ordering often reflects general syntactic or semantic principles, it can also be arbitrary or variable. This article develops a theory of morpheme ordering based on local morphotactic restrictions encoded as weighted bigram constraints. I examine the formal properties of morphotactic sys ..."
Abstract

Cited by 4 (1 self)
 Add to MetaCart
While affix ordering often reflects general syntactic or semantic principles, it can also be arbitrary or variable. This article develops a theory of morpheme ordering based on local morphotactic restrictions encoded as weighted bigram constraints. I examine the formal properties of morphotactic systems, including arbitrariness, non-transitivity, context-sensitivity, analogy, and variation. Several variable systems are surveyed before turning to a detailed corpus study of a variable affix in Tagalog. Bigram morphotactics is shown to cover Tagalog and the typology, while other formalisms, such as alignment, precedence, and position classes, undergenerate. Moreover, learning simulations reveal that affix ordering under bigram morphotactics is subject to analogical pressures, providing a learning-theoretic motivation for the specific patterns of variation observed in Tagalog. I raise a different set of objections to rule-based approaches invoking affix movement. Finally, I demonstrate that bigram morphotactics is restrictive, being unable to generate unattested scenarios such as non-local contingency in ordering.
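The core scoring idea behind weighted bigram constraints can be sketched as follows. The affix names and weights here are invented for illustration, not drawn from the paper's Tagalog data:

```python
from itertools import permutations

# Hypothetical weights on adjacent morpheme pairs; higher is better.
bigram_weights = {
    ("root", "affA"): 2.0,
    ("affA", "affB"): 1.5,
    ("root", "affB"): 0.5,
    ("affB", "affA"): 0.2,
}

def score(order):
    """Sum the weights of each adjacent (bigram) pair in a morpheme order."""
    return sum(bigram_weights.get(pair, 0.0) for pair in zip(order, order[1:]))

def best_order(root, affixes):
    """Pick the highest-scoring linearization of the affixes after the root."""
    return max(((root,) + p for p in permutations(affixes)), key=score)

print(best_order("root", ["affA", "affB"]))  # ('root', 'affA', 'affB')
```

Because each constraint inspects only adjacent pairs, the formalism is local by construction, which is one way to see why it cannot express the non-local ordering contingencies the abstract says it rules out.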
The VC dimension of constraint-based grammars
, 2009
"... We analyze the complexity of Harmonic Grammar (HG), a linguistic model in which licit underlyingtosurfaceform mappings are determined by optimization over weighted constraints. We show that the VapnikChervonenkis Dimension of HG grammars with k constraints is k − 1. This establishes a fundamental ..."
Abstract

Cited by 3 (1 self)
 Add to MetaCart
We analyze the complexity of Harmonic Grammar (HG), a linguistic model in which licit underlying-to-surface-form mappings are determined by optimization over weighted constraints. We show that the Vapnik-Chervonenkis dimension of HG grammars with k constraints is k − 1. This establishes a fundamental bound on the complexity of HG in terms of its capacity to classify sets of linguistic data that has significant ramifications for learnability. The VC dimension of HG is the same as that of Optimality Theory (OT), which is similar to HG, but uses ranked rather than weighted constraints in optimization. The parity of the VC dimension in these two models is somewhat surprising because OT defines finite classes of grammars—there are at most k! ways to rank k constraints—while HG can define infinite classes of grammars because the weights associated with constraints are real-valued. The parity is also surprising because HG permits groups of constraints that interact through so-called 'gang effects' to generate languages that cannot be generated in OT. The fact that the VC dimension grows linearly with the number of constraints in both models means that, even in the worst case, the number of randomly chosen training samples needed to weight/rank a known set of constraints is a linear function of k. We conclude that though there may be factors that favor one model or the other, the complexity of learning weightings/rankings is not one of them.
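The 'gang effect' the abstract mentions can be illustrated with a small sketch. The weights and violation profiles are hypothetical; the point is only that cumulative weighting can reverse a pairwise dominance relation, which strict ranking in OT cannot do:

```python
def harmony(violations, weights):
    """Harmony = negative sum of weighted constraint violations."""
    return -sum(w * v for w, v in zip(weights, violations))

def winner(candidates, weights):
    """Candidate with the highest harmony wins."""
    return max(candidates, key=lambda name: harmony(candidates[name], weights))

weights = [3.0, 2.0, 2.0]            # C1 outweighs C2 or C3 individually

# Pairwise, C1 dominates: violating C2 alone beats violating C1 alone.
pairwise = {"viol_C1": [1, 0, 0], "viol_C2": [0, 1, 0]}
print(winner(pairwise, weights))     # viol_C2 (harmony -2 beats -3)

# But C2 and C3 "gang up": violating both is worse than violating C1,
# reversing the outcome, a pattern no strict ranking of C1 over C2 and C3
# can reproduce.
ganged = {"viol_C1": [1, 0, 0], "viol_C2_C3": [0, 1, 1]}
print(winner(ganged, weights))       # viol_C1 (harmony -3 beats -4)
```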
Frequency Biases in Phonological Variation
 TO APPEAR IN NLLT
, 2011
"... In the past two decades, variation has received a lot of attention in mainstream generative phonology, and several different models have been developed to account for variable phonological phenomena. However, all existing generative models of phonological variation account for the overall rate at ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
In the past two decades, variation has received a lot of attention in mainstream generative phonology, and several different models have been developed to account for variable phonological phenomena. However, all existing generative models of phonological variation account for the overall rate at which some process applies in a corpus, and therefore implicitly assume that all words are affected equally by a variable process. In this paper, we show that this is not the case. Many variable phenomena are more likely to apply to frequent than infrequent words. A model that accounts perfectly for the overall rate of application of some variable process therefore does not necessarily account very well for the actual application of the process to individual words. We illustrate this with two examples, English t/d-deletion and Japanese geminate devoicing. We then augment one existing generative model (noisy Harmonic Grammar) to incorporate the contribution of usage frequency to the application of variable processes. In this model, the influence of frequency is incorporated by scaling the weights of faithfulness constraints up or down for words of different frequencies. This augmented model accounts significantly better for variation than existing generative models.
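The frequency-scaling idea can be sketched as a toy simulation. The weights, noise level, and scaling factors below are invented for illustration, not the paper's fitted model:

```python
import random

def applies(w_mark, w_faith, freq_scale, noise_sd=2.0, rng=random):
    """One noisy evaluation: the variable process (e.g. deletion) applies
    when the noisy markedness weight beats the noisy, frequency-scaled
    faithfulness weight."""
    m = w_mark + rng.gauss(0, noise_sd)
    f = w_faith * freq_scale + rng.gauss(0, noise_sd)
    return m > f

def deletion_rate(freq_scale, trials=20_000, seed=1):
    """Estimated probability that the process applies at this scale factor."""
    rng = random.Random(seed)
    hits = sum(applies(5.0, 6.0, freq_scale, rng=rng) for _ in range(trials))
    return hits / trials

# A frequent word (faithfulness scaled down) undergoes the process more
# often than a rare word (faithfulness scaled up).
print(deletion_rate(0.8), deletion_rate(1.2))
```

Scaling only the faithfulness weights keeps the grammar itself word-independent while still letting application rates differ across frequency bands, which is the design choice the abstract describes.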