## The Latent Maximum Entropy Principle (2002)

Venue: Proc. of ISIT

Citations: 17 (3 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Wang02thelatent,
  author    = {Shaojun Wang and Dale Schuurmans and Yunxin Zhao},
  title     = {The Latent Maximum Entropy Principle},
  booktitle = {Proc. of ISIT},
  year      = {2002}
}
```

### Abstract

We present an extension to Jaynes' maximum entropy principle that handles latent variables. The principle of latent maximum entropy we propose is different from both Jaynes' maximum entropy principle and maximum likelihood estimation, but often yields better estimates in the presence of hidden variables and limited training data. We first show that solving for a latent maximum entropy model poses a hard nonlinear constrained optimization problem in general. However, we then show that feasible solutions to this problem can be obtained efficiently for the special case of log-linear models---which forms the basis for an efficient approximation to the latent maximum entropy principle. We derive an algorithm that combines expectation-maximization with iterative scaling to produce feasible log-linear solutions. This algorithm can be interpreted as an alternating minimization algorithm in the information divergence, and reveals an intimate connection between the latent maximum entropy and maximum likelihood principles.
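To make the abstract's EM-plus-iterative-scaling idea concrete, here is a minimal sketch on a toy log-linear model with one observed binary variable `x` and one latent binary variable `y`. The E-step fills in the latent variable from the current posterior, and the scaling step (a GIS-style update) moves the weights toward the completed feature expectations. The feature set, the GIS constant `C`, and all function names below are illustrative assumptions, not the paper's actual experimental setup.

```python
import math
import itertools

# Toy log-linear model: p(x, y) ∝ exp(sum_k lam[k] * f_k(x, y))
# over x (observed) and y (latent), both in {0, 1}.
# These three features are an illustrative assumption.
features = [
    lambda x, y: float(x),
    lambda x, y: float(y),
    lambda x, y: float(x == y),
]
C = 3.0  # GIS constant: max total feature mass over any (x, y)

def joint(lam):
    """Normalized joint p(x, y) for weight vector lam."""
    w = {(x, y): math.exp(sum(l * f(x, y) for l, f in zip(lam, features)))
         for x, y in itertools.product((0, 1), (0, 1))}
    z = sum(w.values())
    return {xy: v / z for xy, v in w.items()}

def em_is(data, iters=200):
    """Sketch of EM combined with iterative scaling:
    E-step computes feature expectations with y completed under
    p(y | x); the scaling step takes one damped GIS update toward
    those targets. Iterating alternates the two until the feasibility
    constraints E_p[f_k] = target_k hold at a fixed point."""
    lam = [0.0] * len(features)
    for _ in range(iters):
        p = joint(lam)
        # E-step: expected feature counts, latent y inferred from p(y | x)
        target = [0.0] * len(features)
        for x in data:
            zx = p[(x, 0)] + p[(x, 1)]
            for y in (0, 1):
                post = p[(x, y)] / zx          # posterior p(y | x)
                for k, f in enumerate(features):
                    target[k] += post * f(x, y) / len(data)
        # Scaling step: one GIS-style update toward the E-step targets
        model = [sum(p[xy] * f(*xy) for xy in p) for f in features]
        lam = [l + math.log(t / m) / C if t > 0 and m > 0 else l
               for l, t, m in zip(lam, target, model)]
    return lam, joint(lam)

# Usage: fit to observations of x alone; at the fixed point the model
# marginal over the observed variable matches the empirical frequency.
lam, p = em_is([1, 1, 1, 0])
print(p[(1, 0)] + p[(1, 1)])  # model P(x = 1), close to 0.75
```

Note the alternating-minimization structure the abstract mentions: each outer pass holds the completed expectations fixed while the scaling update reduces the divergence to them, and the next E-step recomputes the targets under the improved model.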