Results 1 – 10 of 25
An Efficient Probabilistic Context-Free Parsing Algorithm that Computes Prefix Probabilities
Computational Linguistics, 2002
Abstract

Cited by 187 (5 self)
this article can compute solutions to all four of these problems in a single framework, with a number of additional advantages over previously presented isolated solutions
Design of a Linguistic Postprocessor using Variable Memory Length Markov Models
In International Conference on Document Analysis and Recognition, 1995
Abstract

Cited by 48 (1 self)
We present the design of a linguistic postprocessor for character recognizers. The central module of our system is a trainable variable memory length Markov model (VLMM) which predicts the next character given a variable-length window of past characters. The overall system is composed of several finite-state automata, including the main VLMM and a proper noun VLMM. The best model reported in the literature (Brown et al. 1992) achieves 1.75 bits per character on the Brown corpus. On that same corpus, our model, trained on 10 times less data, reaches 2.19 bits per character and is 200 times smaller (about 160,000 parameters). The model was designed for handwriting recognition applications but can be used for other OCR problems and speech recognition.
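As a hypothetical illustration of the core idea (not the paper's implementation), a variable memory length Markov model backs off from the longest observed context to shorter ones when predicting the next character. The `VLMM` class, the `max_order` bound, and the training string below are all invented for this sketch:

```python
from collections import defaultdict

class VLMM:
    """Toy variable memory length Markov model: predicts the next
    character from the longest matching context seen in training."""

    def __init__(self, max_order=4):
        self.max_order = max_order
        # counts[context][char] = frequency of char following context
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, text):
        for i in range(len(text)):
            for k in range(self.max_order + 1):
                if i - k < 0:
                    break
                self.counts[text[i - k:i]][text[i]] += 1

    def predict(self, history):
        # Back off from the longest suffix of the history that was
        # observed in training down to the empty context.
        for k in range(min(self.max_order, len(history)), -1, -1):
            ctx = history[len(history) - k:]
            if ctx in self.counts:
                dist = self.counts[ctx]
                total = sum(dist.values())
                return {c: n / total for c, n in dist.items()}
        return {}

m = VLMM(max_order=3)
m.train("abracadabra")
probs = m.predict("abr")  # "abr" was always followed by "a" in training
```

A real system would add smoothing and prune contexts by predictive value; the point here is only the variable-length backoff.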
Generalized queries on probabilistic context-free grammars
IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998
Abstract

Cited by 33 (2 self)
Abstract—Probabilistic context-free grammars (PCFGs) provide a simple way to represent a particular class of distributions over sentences in a context-free language. Efficient parsing algorithms for answering particular queries about a PCFG (i.e., calculating the probability of a given sentence, or finding the most likely parse) have been developed and applied to a variety of pattern-recognition problems. We extend the class of queries that can be answered in several ways: (1) allowing missing tokens in a sentence or sentence fragment, (2) supporting queries about intermediate structure, such as the presence of particular nonterminals, and (3) flexible conditioning on a variety of types of evidence. Our method works by constructing a Bayesian network to represent the distribution of parse trees induced by a given PCFG. The network structure mirrors that of the chart in a standard parser, and is generated using a similar dynamic-programming approach. We present an algorithm for constructing Bayesian networks from PCFGs, and show how queries or patterns of queries on the network correspond to interesting queries on PCFGs. The network formalism also supports extensions to encode various context sensitivities within the probabilistic dependency structure. Index Terms—Probabilistic context-free grammars, Bayesian networks.
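To illustrate the baseline query the abstract starts from (the probability of a given sentence), here is a minimal sketch of the standard chart-based inside algorithm for a PCFG in Chomsky normal form. The toy grammar, the rule probabilities, and the function name are hypothetical, not taken from the paper:

```python
from collections import defaultdict

def inside_probability(words, lexical, binary, start="S"):
    """CYK-style inside algorithm for a PCFG in Chomsky normal form.
    lexical: {(A, word): p} for rules A -> word
    binary:  {(A, B, C): p} for rules A -> B C"""
    n = len(words)
    chart = defaultdict(float)  # chart[(i, j, A)] = P(A derives words[i:j])
    for i, w in enumerate(words):
        for (A, word), p in lexical.items():
            if word == w:
                chart[(i, i + 1, A)] += p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):  # split point between the two children
                for (A, B, C), p in binary.items():
                    chart[(i, j, A)] += p * chart[(i, k, B)] * chart[(k, j, C)]
    return chart[(0, n, start)]

# Hypothetical toy grammar: S -> NP VP, with one-word NPs and VPs.
lexical = {("NP", "people"): 0.7, ("NP", "fish"): 0.3,
           ("VP", "fish"): 0.4, ("VP", "swim"): 0.6}
binary = {("S", "NP", "VP"): 1.0}
p = inside_probability(["people", "fish"], lexical, binary)
# 1.0 * 0.7 * 0.4 = 0.28
```

The chart built here is exactly the structure the paper's Bayesian network mirrors; the extended queries (missing tokens, evidence on nonterminals) go beyond this sketch.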
Consistency of Stochastic Context-Free Grammars from Probabilistic Estimation based on Growth Transformations
1997
Abstract

Cited by 20 (1 self)
An important problem related to the probabilistic estimation of Stochastic Context-Free Grammars (SCFGs) is guaranteeing the consistency of the estimated model.
Computational Complexity of Problems on Probabilistic Grammars and Transducers.
In Proc. ICGI, 2000
Abstract

Cited by 19 (3 self)
Determinism plays an important role in grammatical inference.
Probabilistic Finite-State Machines, Part I
Abstract

Cited by 15 (1 self)
Probabilistic finite-state machines are used today in a variety of areas in pattern recognition, or in fields to which pattern recognition is linked: computational linguistics, machine learning, time series analysis, circuit testing, computational biology, speech recognition and machine translation are some of them. In Part I of this paper we survey these generative objects and study their definitions and properties. In Part II, we will study the relation of probabilistic finite-state automata with other well-known devices that generate strings, such as hidden Markov models and n-grams, and provide theorems, algorithms and properties that represent a current state of the art of these objects.
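As an illustrative sketch of the generative objects this survey covers, the probability that a probabilistic finite-state automaton generates a given string (and then stops) can be computed with a simple forward pass. The two-state automaton, the dictionary encoding, and the function name below are all invented for the example:

```python
def pfa_string_prob(string, init, trans, final):
    """Forward computation of the probability that a probabilistic
    finite-state automaton generates `string` and then stops.
    init:  {state: start probability}
    trans: {(state, symbol): [(next_state, prob), ...]}
    final: {state: stopping probability}"""
    dist = dict(init)  # current probability distribution over states
    for sym in string:
        nxt = {}
        for state, p in dist.items():
            for ns, tp in trans.get((state, sym), []):
                nxt[ns] = nxt.get(ns, 0.0) + p * tp
        dist = nxt
    # Sum over all states: probability of being there times stopping there.
    return sum(p * final.get(s, 0.0) for s, p in dist.items())

# Hypothetical two-state automaton.
init = {0: 1.0}
trans = {(0, "a"): [(0, 0.5), (1, 0.3)], (1, "b"): [(1, 0.5)]}
final = {0: 0.2, 1: 0.5}
p = pfa_string_prob("a", init, trans, final)
# 0.5*0.2 + 0.3*0.5 = 0.25
```

Summing transition and stopping probabilities out of each state to 1 makes the machine define a distribution over strings, which is the consistency condition several of the listed papers study.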
Automatic Acquisition of Language Models for Speech Recognition
1994
Abstract

Cited by 14 (3 self)
This thesis focuses on the automatic acquisition of language structure and the subsequent use of the learned language structure to improve the performance of a speech recognition system. First, we develop a grammar inference process which is able to learn a grammar describing a large set of training sentences. The process of acquiring this grammar is one of generalization, so that the resulting grammar predicts likely sentences beyond those contained in the training set. From the grammar we construct a novel probabilistic language model called the phrase class n-gram model (pcng), which is a natural generalization of the word class n-gram model [11] to phrase classes. This model utilizes the grammar in such a way that it maintains full coverage of any test set while at the same time reducing the complexity, or number of parameters, of the resulting predictive model. Positive results are shown in terms of perplexity of the acquired phrase class n-gram models and in terms of reduction of ...
Probabilistic Finite-State Machines, Part II
IEEE Trans. Pattern Anal. Mach. Intell.
Abstract

Cited by 14 (1 self)
Abstract—Probabilistic finite-state machines are used today in a variety of areas in pattern recognition, or in fields to which pattern recognition is linked: computational linguistics, machine learning, time series analysis, circuit testing, computational biology, speech recognition, and machine translation are some of them. In Part I of this paper, we survey these generative objects and study their definitions and properties. In Part II, we study the relation of probabilistic finite-state automata with other well-known devices that generate strings, such as hidden Markov models and n-grams, and provide theorems, algorithms, and properties that represent a current state of the art of these objects. Index Terms—Automata, classes defined by grammars or automata, machine learning, language acquisition, language models, language parsing and understanding, machine translation, speech recognition and synthesis, structural pattern recognition, syntactic pattern recognition.
Learning Of Stochastic Context-Free Grammars By Means Of Estimation Algorithms
1999
Abstract

Cited by 6 (2 self)
The use of the Inside-Outside (IO) algorithm for the estimation of the probability distributions of Stochastic Context-Free Grammars is characterized by the use of all the derivations in the learning process. However, its application in real tasks for Language Modeling is restricted due to the time that it needs to converge. Alternatively, several estimation algorithms which consider a certain subset of derivations in the estimation process have been proposed elsewhere. This set of derivations can be chosen according to structural criteria, or by selecting the k-best derivations. These alternatives are studied in this paper, and they are tested on the corpus of the Wall Street Journal processed in the Penn Treebank project.