Results 1 – 10 of 95
High-Dimensional Regression with Gaussian Mixtures and Partially-Latent Response Variables
"... In this work we address the problem of approximating high-dimensional data with a low-dimensional representation. We make the following contributions. We propose an inverse regression method which exchanges the roles of input and ..."
"... with noise models. The proposed probabilistic formulation could be viewed as a latent-variable augmentation of regression. We devise expectation-maximization (EM) procedures based on a data augmentation strategy which facilitates the maximum-likelihood search over the model parameters. We propose two ..."
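The inverse-regression idea in this snippet can be illustrated with a toy model: treat the low-dimensional variable as the regression input, map it to the observed variable through a mixture of affine-Gaussian components, and recover the forward prediction by Bayesian inversion. All parameters below are hand-picked for illustration, not taken from the paper:

```python
import math

# Toy "inverse regression": model the low-dimensional variable t as the
# INPUT of an affine-Gaussian map to the observed variable y, then invert
# with Bayes' rule to predict t from y. Two mixture components; every
# parameter here is illustrative.
components = [
    # (prior weight, t-prior mean, t-prior var, slope a, intercept b, noise var)
    (0.5, -1.0, 0.5, 2.0, 0.0, 0.1),
    (0.5,  1.0, 0.5, -1.0, 3.0, 0.1),
]

def predict_t_given_y(y):
    """Posterior mean E[t | y] under the joint Gaussian-mixture model."""
    num, den = 0.0, 0.0
    for w, mu, v, a, b, s2 in components:
        # Marginal of y under this component: N(a*mu + b, a^2 * v + s2)
        my, vy = a * mu + b, a * a * v + s2
        lik = w * math.exp(-0.5 * (y - my) ** 2 / vy) / math.sqrt(2 * math.pi * vy)
        # Conditional mean of t given y (standard Gaussian conditioning)
        ct = mu + a * v / vy * (y - my)
        num += lik * ct
        den += lik
    return num / den

print(predict_t_given_y(-2.0))   # close to the first component's t-prior mean
```

Because the low-dimensional side is the regression input, each component's inverse step is ordinary Gaussian conditioning, which is the computational appeal of the formulation.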
Reduction
, 2014
"... The problem of approximating high-dimensional data with a low-dimensional representation is addressed. The article makes the following contributions. An inverse regression framework is proposed, which exchanges the roles of input and response, such that the low-dimensional variable becomes ..."
"... could be viewed as a latent-variable augmentation of regression. Expectation-maximization (EM) procedures are introduced, based on a data augmentation strategy which facilitates the maximum-likelihood search over the model parameters. Two augmentation schemes are proposed and the associated EM inference ..."
The Art of Data Augmentation
, 2001
"... The term data augmentation refers to methods for constructing iterative optimization or sampling algorithms via the introduction of unobserved data or latent variables. For deterministic algorithms, the method was popularized in the general statistical community by the seminal article by Dempster, Lai ..."
Cited by 58 (4 self)
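On the deterministic side, the canonical instance of this idea is EM: augment the observed data with latent variables so the complete-data problem becomes easy. A minimal sketch for a two-component Gaussian mixture, with the component labels as the unobserved data (unit variances and equal weights are fixed for brevity; this is not the paper's general treatment):

```python
import math

def em_two_gaussians(xs, mu0=-1.0, mu1=1.0, iters=50):
    """EM for a 1D mixture 0.5*N(mu0, 1) + 0.5*N(mu1, 1).

    The 'augmented' data are the unobserved component labels: the E-step
    computes their posterior (responsibilities), the M-step maximizes the
    easy complete-data likelihood.
    """
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = []
        for x in xs:
            p0 = math.exp(-0.5 * (x - mu0) ** 2)
            p1 = math.exp(-0.5 * (x - mu1) ** 2)
            r.append(p1 / (p0 + p1))
        # M-step: responsibility-weighted means
        n1 = sum(r)
        n0 = len(xs) - n1
        mu1 = sum(ri * x for ri, x in zip(r, xs)) / n1
        mu0 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n0
    return mu0, mu1

data = [-2.1, -1.9, -2.0, 1.8, 2.2, 2.0]
print(em_two_gaussians(data))   # means near -2 and +2
```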
A Constrained Latent Variable Model for Coreference Resolution
"... Coreference resolution is a well-known clustering task in Natural Language Processing. In this paper, we describe the Latent Left Linking model (L3M), a novel, principled, and linguistically motivated latent structured prediction approach to coreference resolution. We show that L3M admits efficien ..."
Cited by 10 (3 self)
Calculating posterior distributions and modal estimates in Markov mixture models
Journal of Econometrics
, 1996
"... This paper is concerned with finite mixture models in which the populations from one observation to the next are selected according to an unobserved Markov process. A new, full Bayesian approach based on the method of Gibbs sampling is developed. Calculations are simplified by data augmentation, ach ..."
Cited by 148 (9 self)
"... achieved by introducing a population index variable into the list of unknown parameters. It is shown that the latent variables, one for each observation, can be simulated from their joint distribution given the data and the remaining parameters. This result serves to accelerate the convergence of the Gibbs ..."
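The augmentation can be sketched as follows. The paper draws the whole index sequence jointly; the simpler single-site Gibbs update below, with all model parameters fixed and hand-picked, only illustrates how a per-observation population index enters the sampler:

```python
import math, random

random.seed(1)

# Two-state Markov mixture: y_t ~ N(mu[s_t], 1), with s_t a hidden Markov
# chain. Parameters are fixed and illustrative; we Gibbs-sample only the
# latent population indices s_1..s_T, one at a time.
mu = [-2.0, 2.0]
P = [[0.9, 0.1], [0.1, 0.9]]   # transition matrix

def gibbs_states(ys, sweeps=100):
    T = len(ys)
    s = [random.randrange(2) for _ in range(T)]
    for _ in range(sweeps):
        for t in range(T):
            w = []
            for k in (0, 1):
                lik = math.exp(-0.5 * (ys[t] - mu[k]) ** 2)
                prior = 1.0
                if t > 0:
                    prior *= P[s[t - 1]][k]      # transition into state k
                if t < T - 1:
                    prior *= P[k][s[t + 1]]      # transition out of state k
                w.append(prior * lik)
            s[t] = 0 if random.random() < w[0] / (w[0] + w[1]) else 1
    return s

ys = [-2.1, -1.8, -2.2, 1.9, 2.1, 2.0]
print(gibbs_states(ys))
```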
Bayesian inference for logistic models using Pólya-Gamma latent variables. arXiv preprint arXiv:1205.0310
, 2012
"... We propose a new data-augmentation strategy for fully Bayesian inference in models with binomial likelihoods. The approach appeals to a new class of Pólya-Gamma distributions, which are constructed in detail. A variety of examples are presented to show the versatility of the method, including logi ..."
Cited by 45 (6 self)
"... of the paper, we provide further details regarding the generation of Pólya-Gamma random variables; the empirical benchmarks reported in the main manuscript; and the extension of the basic data-augmentation framework to contingency tables and multinomial outcomes. ..."
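A minimal sketch of the scheme for a one-coefficient logistic model, using the truncated infinite-sum representation of PG(1, c) rather than the paper's exact sampler (the truncation level and all data below are illustrative):

```python
import math, random

random.seed(0)

def rpg1(c, terms=200):
    """Draw approximately from PG(1, c) via its truncated infinite-sum
    representation; the paper's exact sampler is more sophisticated."""
    c2 = (c / (2 * math.pi)) ** 2
    s = sum(random.expovariate(1.0) / ((k - 0.5) ** 2 + c2)
            for k in range(1, terms + 1))
    return s / (2 * math.pi ** 2)

def gibbs_logistic(xs, ys, sweeps=500, prior_var=10.0):
    """Polya-Gamma Gibbs for logit P(y=1) = sigmoid(beta * x), scalar beta."""
    beta, draws = 0.0, []
    for _ in range(sweeps):
        # Augmentation step: omega_i | beta ~ PG(1, x_i * beta)
        om = [rpg1(x * beta) for x in xs]
        # Conjugate Gaussian update for beta | omega, y (prior N(0, prior_var))
        v = 1.0 / (sum(o * x * x for o, x in zip(om, xs)) + 1.0 / prior_var)
        m = v * sum(x * (y - 0.5) for x, y in zip(xs, ys))
        beta = random.gauss(m, math.sqrt(v))
        draws.append(beta)
    return draws

# Synthetic, perfectly separated data; the Gaussian prior keeps beta finite.
xs = [-2, -1.5, -1, -0.5, 0.5, 1, 1.5, 2]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
post = gibbs_logistic(xs, ys)
print(sum(post[100:]) / len(post[100:]))   # posterior mean of beta
```

The appeal of the augmentation is the second step: conditional on the omegas, the update for beta is an exact Gaussian draw, with no tuning.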
Augmentation Schemes for Particle MCMC
"... Particle MCMC involves using a particle filter within an MCMC algorithm. For inference of a model which involves an unobserved stochastic process, the standard implementation uses the particle filter to propose new values for the stochastic process, and MCMC moves to propose new values for the param ..."
"... for the parameters. We show how particle MCMC can be generalised beyond this. Our key idea is to introduce new latent variables. We then use the MCMC moves to update the latent variables, and the particle filter to propose new values for the parameters and stochastic process given the latent variables. A generic way ..."
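The standard implementation described here is particle marginal Metropolis-Hastings: the particle filter supplies a likelihood estimate that is plugged into an ordinary MH acceptance ratio. A toy sketch for a scalar AR(1) state-space model with Gaussian observations, inferring only the autoregressive coefficient (the model and all tuning choices are illustrative):

```python
import math, random

random.seed(2)

def pf_loglik(phi, ys, n=150):
    """Bootstrap particle filter estimate of log p(y_{1:T} | phi) for
    x_t = phi * x_{t-1} + N(0,1), y_t = x_t + N(0,1)."""
    parts = [random.gauss(0, 1) for _ in range(n)]
    ll = 0.0
    for y in ys:
        parts = [phi * x + random.gauss(0, 1) for x in parts]
        ws = [math.exp(-0.5 * (y - x) ** 2) for x in parts]
        tot = sum(ws)
        ll += math.log(tot / n / math.sqrt(2 * math.pi))
        # Multinomial resampling
        parts = random.choices(parts, weights=ws, k=n)
    return ll

def pmmh(ys, iters=150):
    """Particle marginal MH over phi, with a uniform prior on (-1, 1)."""
    phi, ll = 0.0, pf_loglik(0.0, ys)
    chain = []
    for _ in range(iters):
        prop = phi + random.gauss(0, 0.1)
        if -1 < prop < 1:
            ll_prop = pf_loglik(prop, ys)
            # Estimated likelihoods stand in for exact ones in the MH ratio
            if math.log(random.random()) < ll_prop - ll:
                phi, ll = prop, ll_prop
        chain.append(phi)
    return chain

# Simulate data with true phi = 0.8
x, ys = 0.0, []
for _ in range(60):
    x = 0.8 * x + random.gauss(0, 1)
    ys.append(x + random.gauss(0, 1))
chain = pmmh(ys)
print(sum(chain[40:]) / len(chain[40:]))   # posterior mean, near 0.8
```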
Augmented statistical models: Exploiting generative models in discriminative classifiers
In NIPS workshops
, 2005
"... In recent years, many algorithms have been proposed for discriminative classification of data. Popular examples are support vector machines (SVMs) [1] and conditional random fields (CRFs) [2]. These techniques make extensive use of fixed-dimensional mappings from the observation space to a (often hi ..."
Cited by 4 (1 self)
"... conditional latent-variable generative models, such as Gaussian mixture models (GMMs) and hidden Markov models (HMMs). Bayes' rule is then used to calculate the posterior probability of the class labels. This allows missing data and variable-length sequences to be handled in a simple yet robust manner. However ..."
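The generative-classification step described here, class-conditional latent-variable models combined through Bayes' rule, can be sketched with tiny hand-specified GMMs (parameters are illustrative, not trained):

```python
import math

def gauss(x, mu, var):
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

# One small class-conditional GMM per class; components are (weight, mean, var).
class_models = {
    "A": [(0.5, -1.0, 0.3), (0.5, 0.0, 0.3)],
    "B": [(0.5, 1.0, 0.3), (0.5, 2.0, 0.3)],
}
priors = {"A": 0.5, "B": 0.5}

def posterior(x):
    """P(class | x) by Bayes' rule over the class-conditional GMMs."""
    joint = {c: priors[c] * sum(w * gauss(x, m, v) for w, m, v in comps)
             for c, comps in class_models.items()}
    z = sum(joint.values())
    return {c: p / z for c, p in joint.items()}

print(posterior(-1.0))   # strongly class A
print(posterior(1.5))    # strongly class B
```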
Nonlinear Generative Embeddings for Kernels on Latent Variable Models
"... Generative embeddings use generative probabilistic models to project objects into a vectorial space of reduced dimensionality, where the so-called generative kernels can be defined. Some of these approaches employ generative models on latent variables to project objects into a feature space where t ..."
Cited by 4 (3 self)
"... operation, able to equilibrate the contributions of each latent variable of the model, thus augmenting the entropy of the latent variable vectors. The validity of the idea has been shown in the case of two generative kernels, which have been evaluated with tests on shape recognition and gesture ..."
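A basic generative embedding of this kind can be sketched by mapping each object to its posterior-responsibility vector under a shared GMM and taking an inner product as the generative kernel; the paper's nonlinear equilibration step is not reproduced, and all parameters below are illustrative:

```python
import math

# Shared GMM; each component is (weight, mean, var).
components = [(0.3, -2.0, 1.0), (0.4, 0.0, 1.0), (0.3, 2.0, 1.0)]

def embed(x):
    """Posterior-responsibility vector of x under the shared GMM."""
    ws = [w * math.exp(-0.5 * (x - m) ** 2 / v) / math.sqrt(2 * math.pi * v)
          for w, m, v in components]
    z = sum(ws)
    return [wk / z for wk in ws]

def generative_kernel(x1, x2):
    """Linear kernel in the latent-responsibility space."""
    return sum(a * b for a, b in zip(embed(x1), embed(x2)))

print(generative_kernel(-2.0, -1.9))   # similar objects, large value
print(generative_kernel(-2.0, 2.0))    # dissimilar objects, small value
```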