School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213
Current address: Dépt. Math., Université Jean Monnet, 23, rue P. Michelon, 42023 Saint-Étienne cedex, France
This paper introduces lattice-based language models, a new language modeling paradigm. These models construct multi-dimensional hierarchies of partitions and select the most promising partitions to generate the estimated distributions. We discuss a specific two-dimensional lattice and propose two primary features to measure the usefulness of each node: the training-set history count and the smoothed entropy of its prediction. Smoothing techniques are reviewed, and a generalization of the conventional backoff strategy to multiple dimensions is proposed. Preliminary experiments on the SWITCHBOARD corpus yield a 6.5% perplexity reduction over a word trigram model.
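The two node-usefulness features above can be sketched concretely. The following is a minimal illustration, not the paper's implementation: the function names (`smoothed_entropy`, `node_features`) are hypothetical, and add-alpha smoothing stands in for whichever smoothing technique the model actually applies to a node's predicted distribution.

```python
import math

def smoothed_entropy(pred_counts, vocab_size, alpha=1.0):
    """Entropy (in bits) of a node's predicted word distribution,
    add-alpha smoothed so unseen words receive nonzero mass.
    pred_counts maps word ids to training counts following this node's history."""
    total = sum(pred_counts.values()) + alpha * vocab_size
    h = 0.0
    for w in range(vocab_size):
        p = (pred_counts.get(w, 0) + alpha) / total
        h -= p * math.log2(p)
    return h

def node_features(pred_counts, vocab_size):
    """The two usefulness features for a lattice node:
    (training-set history count, smoothed entropy of its prediction)."""
    history_count = sum(pred_counts.values())
    return history_count, smoothed_entropy(pred_counts, vocab_size)
```

A node with a frequent history and a low-entropy (peaked) prediction is a strong candidate for inclusion; a rare history with a near-uniform prediction contributes little.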