Results 1–10 of 103,380
Infinite Mixtures of Trees
 ICML, 2007
"... Finite mixtures of tree-structured distributions have been shown to be efficient and effective in modeling multivariate distributions. Using Dirichlet processes, we extend this approach to allow countably many tree-structured mixture components. The resulting Bayesian framework allows us to deal with ..."
Cited by 1 (0 self)
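Several results on this page rest on the Dirichlet process supplying countably many mixture components. As a minimal, generic illustration (not the method of any one paper above), the stick-breaking construction turns i.i.d. Beta draws into an infinite sequence of mixture weights; the truncation level and concentration value here are assumptions for the sketch:

```python
import random

def stick_breaking_weights(alpha, truncation, seed=0):
    """Truncated stick-breaking construction of Dirichlet-process weights.

    w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha):
    each draw takes a Beta-distributed fraction of the remaining stick.
    """
    rng = random.Random(seed)
    weights = []
    remaining = 1.0
    for _ in range(truncation):
        v = rng.betavariate(1.0, alpha)   # fraction of what is left
        weights.append(remaining * v)
        remaining *= (1.0 - v)
    return weights

w = stick_breaking_weights(alpha=2.0, truncation=100)
```

The weights are positive and sum to just under 1; in a truncated approximation the leftover mass is negligible for a truncation level this far above the concentration parameter.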
Infinite Mixtures of Gaussian Process Experts
 In Advances in Neural Information Processing Systems 14, 2001
"... We present an extension to the Mixture of Experts (ME) model, where the individual experts are Gaussian Process (GP) regression models. Using an input-dependent adaptation of the Dirichlet Process, we implement a gating network for an infinite number of Experts. Inference in this model may be do ..."
Cited by 107 (6 self)
Flexible Priors for Infinite Mixture Models
"... Most infinite mixture models in the current literature are based on the Dirichlet process prior. This prior on partitions implies a very specific (a priori) distribution on cluster sizes. A slightly more general prior known as the Pitman-Yor process prior generalizes this to a two-parameter family. ..."
Cited by 4 (0 self)
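The two-parameter family the snippet refers to can be sampled with the generalized Chinese restaurant process: with discount d and strength alpha, an item joins an existing cluster with probability proportional to (its size − d) and opens a new cluster with probability proportional to (alpha + d·K). Setting d = 0 recovers the Dirichlet process. A sketch, with illustrative parameter values:

```python
import random

def pitman_yor_partition(n, alpha, d, seed=1):
    """Sample cluster sizes of a random partition of n items from the
    two-parameter (Pitman-Yor) Chinese restaurant process.

    Existing cluster k: weight (n_k - d); new cluster: weight (alpha + d*K).
    The weights over the choices for item i+1 sum to i + alpha.
    """
    rng = random.Random(seed)
    sizes = []                               # n_k for each existing cluster
    for i in range(n):
        new_mass = alpha + d * len(sizes)
        u = rng.uniform(0.0, i + alpha)      # total mass of all choices
        if u < new_mass:
            sizes.append(1)                  # open a new cluster
        else:
            u -= new_mass
            for k in range(len(sizes)):
                u -= sizes[k] - d
                if u < 0:
                    sizes[k] += 1
                    break
            else:                            # float round-off fallback
                sizes[-1] += 1
    return sizes

sizes = pitman_yor_partition(n=500, alpha=1.0, d=0.5)
```

With d > 0 the sampled cluster sizes are heavier-tailed (power-law-like) than under the Dirichlet process, which is the extra a-priori flexibility the abstract describes.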
An alternative infinite mixture of Gaussian process experts
 In Advances in Neural Information Processing Systems
"... We present an infinite mixture model in which each component comprises a multivariate Gaussian distribution over an input space, and a Gaussian Process model over an output space. Our model is neatly able to deal with non-stationary covariance functions, discontinuities, multimodality and overlappin ..."
Cited by 39 (1 self)
Denoising with infinite mixture of Gaussians
 In Eur. Signal Process. Conf. (EUSIPCO), 2005
"... We show in this paper how an Infinite Mixture of Gaussians model can be used to estimate/denoise non-Gaussian data with local linear estimators based on the Wiener filter. The decomposition of the data in Gaussian components is straightforwardly computed with the Gaussian Transform, previously deriv ..."
Cited by 3 (0 self)
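The "local linear estimators based on the Wiener filter" can be illustrated, in the scalar case, by the standard MMSE estimator under a zero-mean Gaussian-mixture prior with additive Gaussian noise: a posterior-weighted combination of per-component Wiener gains. This is a generic sketch of that combination, not the paper's Gaussian Transform; all parameter values below are assumptions:

```python
import math

def gmm_wiener_denoise(y, weights, variances, noise_var):
    """MMSE estimate of a zero-mean scalar signal x from y = x + noise,
    under a Gaussian-mixture prior on x and Gaussian noise.

    Each component k contributes its Wiener gain s2_k / (s2_k + noise_var),
    weighted by the posterior probability of component k given y.
    """
    post = []
    for w, s2 in zip(weights, variances):
        marg_var = s2 + noise_var            # variance of y under component k
        lik = w * math.exp(-0.5 * y * y / marg_var) / math.sqrt(2 * math.pi * marg_var)
        post.append(lik)
    z = sum(post)
    post = [p / z for p in post]
    return sum(p * (s2 / (s2 + noise_var)) * y
               for p, s2 in zip(post, variances))

x_hat = gmm_wiener_denoise(y=2.0, weights=[0.7, 0.3], variances=[1.0, 10.0], noise_var=1.0)
```

With a single component this reduces exactly to the ordinary Wiener shrinkage; with several, the estimator is locally linear in y but globally nonlinear, which is what makes it suitable for non-Gaussian data.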
Expectation propagation for infinite mixtures (Extended abstract)
 2003
"... This note describes a method for approximate inference in infinite models that uses deterministic Expectation Propagation instead of Monte Carlo. For infinite Gaussian mixtures, the algorithm provides cluster parameter estimates, cluster memberships, and model evidence. Model parameters, such as the ..."
Infinite mixtures for multi-relational categorical data
Cited by 2 (1 self)
Large relational datasets are prevalent in many fields. We propose an unsupervised component model for relational data, i.e., for heterogeneous collections of categorical co-occurrences. The co-occurrences can be dyadic or n-adic, and over the same or different categorical variables. Graphs are a special case, as collections of dyadic co-occurrences (edges) over a set of vertices. The model is simple, with only one latent variable. This allows wide applicability as long as a global latent component solution is preferred, and the generative process fits the application. Estimation with a collapsed Gibbs sampler is straightforward. We demonstrate the model with graphs enriched with multinomial vertex properties, or more concretely, with two sets of scientific papers, with both content and citation information available.
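A collapsed Gibbs sampler of the kind this abstract mentions can be sketched for the simplest setting: a CRP (infinite) mixture of single categorical observations, with the per-cluster category distributions integrated out under a symmetric Dirichlet prior. This is a generic sketch, not the paper's multi-relational model, and the hyperparameter values are illustrative assumptions:

```python
import random

def crp_categorical_gibbs(data, n_vals, alpha=1.0, beta=0.5, sweeps=20, seed=0):
    """Collapsed Gibbs sampling for an infinite mixture of categorical values.

    Cluster parameters are marginalized: the predictive of value v in
    cluster k is (counts[k][v] + beta) / (sizes[k] + n_vals * beta).
    """
    rng = random.Random(seed)
    z = [0] * len(data)                  # start with everything in one cluster
    counts = [[0] * n_vals]              # counts[k][v]: value v in cluster k
    sizes = [len(data)]
    for v in data:
        counts[0][v] += 1
    for _ in range(sweeps):
        for i, v in enumerate(data):
            k_old = z[i]                 # remove item i from its cluster
            sizes[k_old] -= 1
            counts[k_old][v] -= 1
            if sizes[k_old] == 0:        # drop the emptied cluster, relabel
                sizes.pop(k_old)
                counts.pop(k_old)
                for j in range(len(z)):
                    if z[j] > k_old:
                        z[j] -= 1
            # CRP prior weight times collapsed predictive, per cluster
            w = [sizes[k] * (counts[k][v] + beta) / (sizes[k] + n_vals * beta)
                 for k in range(len(sizes))]
            w.append(alpha / n_vals)     # new cluster: prior predictive of v
            u = rng.uniform(0.0, sum(w))
            k_new = 0
            while k_new < len(w) - 1 and u > w[k_new]:
                u -= w[k_new]
                k_new += 1
            if k_new == len(sizes):      # open a new cluster
                sizes.append(0)
                counts.append([0] * n_vals)
            z[i] = k_new
            sizes[k_new] += 1
            counts[k_new][v] += 1
    return z

labels = crp_categorical_gibbs([0] * 20 + [3] * 20, n_vals=4)
```

Because the component distributions are integrated out, each update only touches count tables, which is why such samplers stay straightforward even when the number of clusters is unbounded.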
Two-level infinite mixture for multi-domain data
The combined, unsupervised analysis of coupled data sources is an open problem in machine learning. A particularly important example from the biological domain is the analysis of mRNA and protein profiles derived from the same set of genes (either over time or under different conditions). Such analysis has the potential to provide a far more comprehensive picture of the mechanisms of
Convergence of latent mixing measures in finite and infinite mixture models
 2013
"... This paper studies convergence behavior of latent mixing measures that arise in finite and infinite mixture models, using transportation distances (i.e., Wasserstein metrics). The relationship between Wasserstein distances on the space of mixing measures and f-divergence functionals such as Hellinge ..."
Cited by 8 (0 self)
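The paper works with general transportation distances between mixing measures. As a minimal self-contained illustration (the example atoms are assumptions), in the special case of two uniform discrete measures on the real line with equally many atoms, the optimal W1 coupling simply matches sorted atoms:

```python
def wasserstein1_equal_atoms(atoms_p, atoms_q):
    """W1 distance between two uniform discrete measures on the line with
    the same number of atoms: for convex cost on the line, the monotone
    (sorted) matching is optimal, so W1 is the mean sorted-atom gap."""
    assert len(atoms_p) == len(atoms_q)
    a = sorted(atoms_p)
    b = sorted(atoms_q)
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

d = wasserstein1_equal_atoms([0.0, 1.0, 2.0], [0.5, 1.5, 2.5])
# every atom moves by 0.5, so W1 = 0.5
```

Here the atoms play the role of mixture-component parameters, so this distance compares two mixing measures directly rather than the mixture densities they induce.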