Context-Sensitive Statistics for Improved Grammatical Language Models
In Proceedings of the Twelfth National Conference on Artificial Intelligence, 1994
Abstract

Cited by 44 (4 self)
We develop a language model using probabilistic context-free grammars (PCFGs) that is "pseudo context-sensitive" in that the probability that a nonterminal N expands using a rule r depends on N's parent. We derive the equations for estimating the necessary probabilities using a variant of the inside-outside algorithm. We give experimental results showing that, beginning with a high-performance PCFG, one can develop a pseudo PCSG that yields significant performance gains. Analysis shows that the benefits from the context-sensitive statistics are localized, suggesting that we can use them to extend the original PCFG. Experimental results confirm both that this is feasible and that the resulting grammar retains the performance gains. This implies that our scheme may be useful as a novel method for PCFG induction.

1 Introduction

Like its non-stochastic brethren, probabilistic parsing has been based upon context-free grammars (CFGs), and for similar reasons: CFGs support a simple and efficien...
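The abstract's central idea — conditioning a rule's probability on the parent of the expanding nonterminal — can be illustrated with a minimal sketch. The toy treebank, labels, and helper functions below are all invented for illustration; the paper estimates these probabilities from unparsed text with an inside-outside variant, whereas this sketch uses simple relative-frequency counts over parsed trees to contrast a plain PCFG estimate P(r | N) with the parent-conditioned estimate P(r | N, parent).

```python
from collections import defaultdict

# Toy parsed corpus: each node is (label, children); terminals are strings.
# This grammar and corpus are invented for illustration only.
trees = [
    ("S", [("NP", [("Det", ["the"]), ("N", ["dog"])]),
           ("VP", [("V", ["barks"])])]),
    ("S", [("NP", [("N", ["dogs"])]),
           ("VP", [("V", ["bark"]),
                   ("NP", [("Det", ["the"]), ("N", ["cat"])])])]),
]

# Plain PCFG counts: P(rule | N)
rule_counts = defaultdict(int)
lhs_counts = defaultdict(int)
# Parent-conditioned ("pseudo context-sensitive") counts: P(rule | N, parent)
ctx_rule_counts = defaultdict(int)
ctx_lhs_counts = defaultdict(int)

def collect(node, parent):
    """Walk a tree, tallying each rule with and without its parent context."""
    label, children = node
    rhs = tuple(c[0] if isinstance(c, tuple) else c for c in children)
    rule_counts[(label, rhs)] += 1
    lhs_counts[label] += 1
    ctx_rule_counts[(parent, label, rhs)] += 1
    ctx_lhs_counts[(parent, label)] += 1
    for c in children:
        if isinstance(c, tuple):
            collect(c, label)

for t in trees:
    collect(t, "TOP")  # "TOP" is a pseudo-parent for the root

def p_pcfg(label, rhs):
    """Relative-frequency estimate of P(label -> rhs | label)."""
    return rule_counts[(label, rhs)] / lhs_counts[label]

def p_pcsg(parent, label, rhs):
    """Parent-conditioned estimate of P(label -> rhs | label, parent)."""
    return ctx_rule_counts[(parent, label, rhs)] / ctx_lhs_counts[(parent, label)]
```

In this toy corpus the plain PCFG gives NP → Det N probability 2/3 regardless of position, while the parent-conditioned model splits it: 1/2 when NP appears under S, but 1 when it appears under VP — the kind of localized context effect the abstract describes.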