Lipschitz Parametrization of Probabilistic Graphical Models

by Jean Honorio
Citations: 4 (0 self)

BibTeX

@MISC{Honorio_lipschitzparametrization,
    author = {Jean Honorio},
    title = {Lipschitz Parametrization of Probabilistic Graphical Models},
    year = {}
}

Abstract

We show that the log-likelihood of several probabilistic graphical models is Lipschitz continuous with respect to the ℓp-norm of the parameters. We discuss several implications of Lipschitz parametrization. We present an upper bound of the Kullback-Leibler divergence that allows us to interpret methods that penalize the ℓp-norm of parameter differences as minimizing that upper bound. The expected log-likelihood is lower bounded by the negative ℓp-norm, which sheds light on the generalization ability of probabilistic models. The exponential of the negative ℓp-norm appears in a lower bound of the Bayes error rate, which shows that it is reasonable to use parameters as features in algorithms that rely on metric spaces (e.g. classification, dimensionality reduction, clustering). Our results do not rely on specific algorithms for learning the structure or parameters. We show preliminary results for activity recognition and temporal segmentation.

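As a rough illustration (not taken from the paper itself), the Kullback-Leibler bound mentioned in the abstract follows directly from the Lipschitz property; the Lipschitz constant K and the densities p_θ below are notational assumptions made here only for this sketch:

    % Assumption: the log-likelihood of the model p_\theta(x) is K-Lipschitz
    % in \theta with respect to the \ell_p-norm, i.e. for all x,
    %   |\log p_{\theta_1}(x) - \log p_{\theta_2}(x)| \le K \|\theta_1 - \theta_2\|_p .
    % Taking the expectation under p_{\theta_1} gives the upper bound on the
    % KL divergence discussed above:
    \mathrm{KL}(p_{\theta_1} \,\|\, p_{\theta_2})
        = \mathbb{E}_{x \sim p_{\theta_1}}\bigl[\log p_{\theta_1}(x) - \log p_{\theta_2}(x)\bigr]
        \le K \|\theta_1 - \theta_2\|_p ,
    % and rearranging yields the lower bound on the expected log-likelihood:
    \mathbb{E}_{x \sim p_{\theta_1}}\bigl[\log p_{\theta_2}(x)\bigr]
        \ge \mathbb{E}_{x \sim p_{\theta_1}}\bigl[\log p_{\theta_1}(x)\bigr] - K \|\theta_1 - \theta_2\|_p .
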
Keyphrases

lipschitz parametrization, probabilistic graphical model, upper bound, negative ℓp-norm, specific algorithm, several implication, several probabilistic graphical model, probabilistic model, activity recognition, preliminary result, bayes error rate, generalization ability, dimensionality reduction, kullback-leibler divergence, metric space, temporal segmentation, expected log-likelihood
