## A tutorial on energy-based learning (2006)

Venue: Predicting Structured Data

Citations: 42 (6 self)

### BibTeX

    @INPROCEEDINGS{Lecun06atutorial,
      author    = {Yann LeCun and Sumit Chopra and Raia Hadsell and Fu Jie Huang},
      editor    = {G. Bakir and T. Hofmann and B. Sch{\"o}lkopf and A. Smola and B. Taskar},
      title     = {A tutorial on energy-based learning},
      booktitle = {Predicting Structured Data},
      year      = {2006},
      publisher = {MIT Press}
    }



### Abstract

Energy-Based Models (EBMs) capture dependencies between variables by associating a scalar energy to each configuration of the variables. Inference consists in clamping the value of observed variables and finding configurations of the remaining variables that minimize the energy. Learning consists in finding an energy function in which observed configurations of the variables are given lower energies than unobserved ones. The EBM approach provides a common theoretical framework for many learning models, including traditional discriminative and generative approaches, as well as graph-transformer networks, conditional random fields, maximum margin Markov networks, and several manifold learning methods. Probabilistic models must be properly normalized, which sometimes requires evaluating intractable integrals over the space of all possible variable configurations. Since EBMs have no requirement for proper normalization, this problem is naturally circumvented. EBMs can be viewed as a form of non-probabilistic factor graphs, and they provide considerably more flexibility in the design of architectures and training criteria than probabilistic approaches.
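The inference and learning loop described in the abstract can be sketched in a few lines. This is a hypothetical toy example, not the tutorial's implementation: it assumes a linear energy `E(w, x, y) = -y * (w · x)` over binary labels, inference by picking the lowest-energy label, and a perceptron-style update (one of the loss functions the tutorial discusses) that lowers the energy of the observed answer and raises that of the wrong one.

```python
import numpy as np

def energy(w, x, y):
    """Scalar energy of a configuration (x, y) under parameters w.
    Lower energy = more compatible. (Assumed linear form for illustration.)"""
    return -y * np.dot(w, x)

def infer(w, x, labels=(-1, 1)):
    """Inference: clamp the observed x, minimize energy over the labels."""
    return min(labels, key=lambda y: energy(w, x, y))

def perceptron_update(w, x, y_true, lr=0.1):
    """Learning: if the energy minimum is at the wrong answer, push the
    observed configuration's energy down and the offending one's up."""
    y_hat = infer(w, x)
    if y_hat != y_true:
        # Gradient of [E(w, x, y_true) - E(w, x, y_hat)] w.r.t. w
        w = w + lr * (y_true - y_hat) * x
    return w

# Tiny separable dataset: first feature votes +1, second votes -1.
data = [(np.array([1.0, 0.0]), 1), (np.array([0.0, 1.0]), -1)]
w = np.zeros(2)
for _ in range(5):
    for x, y in data:
        w = perceptron_update(w, x, y)
```

After a few passes, `infer` returns the correct label for both training points; the point is only to make the "inference = energy minimization, learning = energy shaping" loop concrete.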