Results 1–6 of 6
Distance Dependent Infinite Latent Feature Models
, 2011
Abstract

Cited by 10 (0 self)
Latent feature models are widely used to decompose data into a small number of components. Bayesian nonparametric variants of these models, which use the Indian buffet process (IBP) as a prior over latent features, allow the number of features to be determined from the data. We present a generalization of the IBP, the distance dependent Indian buffet process (ddIBP), for modeling nonexchangeable data. It relies on a distance function defined between data points, biasing nearby data to share more features. The choice of distance function allows for many kinds of dependencies, including temporal or spatial. Further, the original IBP is a special case of the ddIBP. In this paper, we develop the ddIBP and theoretically characterize the distribution of how features are shared between data. We derive a Markov chain Monte Carlo sampler for a linear Gaussian model with a ddIBP prior and study its performance on several data sets for which exchangeability is not a reasonable assumption.
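The distance-function ingredient described in this abstract can be illustrated with a toy decay kernel: distances are mapped to affinities in [0, 1], so that nearby data points are biased toward sharing features. This is only a sketch of that one ingredient, not the ddIBP's full connectivity construction; the exponential decay and the temporal example below are assumptions chosen here for illustration.

```python
import numpy as np

def exponential_decay(distances, scale=1.0):
    """Map pairwise distances to [0, 1] affinities: nearby points get
    values close to 1 and are biased to share more latent features."""
    return np.exp(-np.asarray(distances, dtype=float) / scale)

# Illustrative temporal dependence: five points observed at times 0..4.
times = np.arange(5.0)
D = np.abs(times[:, None] - times[None, :])   # pairwise temporal distances
A = exponential_decay(D, scale=2.0)           # affinity decays with time gap
```

With a spatial distance in place of the temporal one, the same decay function induces spatial rather than temporal dependence, which is the flexibility the abstract refers to.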
A survey of nonexchangeable priors for Bayesian nonparametric models
, 2014
Abstract

Cited by 3 (0 self)
Dependent nonparametric processes extend distributions over measures, such as the Dirichlet process and the beta process, to give distributions over collections of measures, typically indexed by values in some covariate space. Such models are appropriate priors when exchangeability assumptions do not hold, and instead we want our model to vary fluidly with some set of covariates. Since the concept of dependent nonparametric processes was formalized by MacEachern [1], there have been a number of models proposed and used in the statistics and machine learning literatures. Many of these models exhibit underlying similarities, an understanding of which, we hope, will help in selecting an appropriate prior, developing new models, and leveraging inference techniques.
A unifying representation for a class of dependent random measures
, 2012
Abstract

Cited by 2 (0 self)
We present a general construction for dependent random measures based on thinning Poisson processes on an augmented space. The framework is not restricted to dependent versions of a specific nonparametric model, but can be applied to all models that can be represented using completely random measures. Several existing dependent random measures can be seen as specific cases of this framework. Interesting properties of the resulting measures are derived and the efficacy of the framework is demonstrated by constructing a covariate-dependent latent feature model and topic model that obtain superior predictive performance.
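The thinning construction described in this abstract can be sketched in a few lines: sample a homogeneous Poisson process, then keep each atom with a covariate-dependent probability. The rate, region, and acceptance function below are illustrative assumptions for a one-dimensional toy case, not the paper's specific choices.

```python
import numpy as np

def thinned_poisson(rate, region_len, covariate, keep_prob, seed=0):
    """Sample a homogeneous Poisson process on [0, region_len], then thin
    it: each point x survives with probability keep_prob(x, covariate),
    which must return a value in [0, 1]."""
    rng = np.random.default_rng(seed)
    n = rng.poisson(rate * region_len)            # total points before thinning
    points = rng.uniform(0.0, region_len, size=n)
    survive = rng.random(n) < np.array([keep_prob(x, covariate) for x in points])
    return np.sort(points[survive])

# Hypothetical covariate dependence: atoms survive with probability that
# falls off with distance to the covariate value 3.0.
atoms = thinned_poisson(rate=5.0, region_len=10.0, covariate=3.0,
                        keep_prob=lambda x, c: np.exp(-abs(x - c)))
```

Running the sketch at several covariate values yields different but correlated sets of surviving atoms, which is the dependence-across-measures behavior the construction is after.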
Metadata Dependent Mondrian Processes
Abstract
Stochastic partition processes in a product space play an important role in modeling relational data. Recent studies on the Mondrian process have introduced more flexibility into the block structure in relational models. A side-effect of such high flexibility is that, in data-sparsity scenarios, the model is prone to overfitting. In reality, relational entities are always associated with meta information, such as user profiles in a social network. In this paper, we propose a metadata dependent Mondrian process (MDMP) to incorporate meta information into the stochastic partition process in the product space and the entity allocation process on the resulting block structure. MDMP can not only encourage homogeneous relational interactions within blocks but also discourage meta-label diversity within blocks. Regularized by meta information, MDMP becomes more robust in data-sparsity scenarios and converges more easily in posterior inference. We apply MDMP to link prediction and rating prediction and demonstrate that MDMP is more effective than the baseline models in prediction accuracy, with a more parsimonious model structure.
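The underlying Mondrian process that this abstract builds on can be sketched as a recursive, budget-limited sequence of axis-aligned cuts: each cut costs an exponential amount of budget with rate equal to the box's total side length, and recursion stops when the budget runs out. This is a plain Mondrian sampler under the standard definition, not the metadata-dependent variant the paper proposes.

```python
import numpy as np

def sample_mondrian(box, budget, rng):
    """Recursively cut `box` (a list of (lo, hi) intervals) with
    axis-aligned cuts. Each cut costs Exponential(sum of side lengths);
    when the remaining budget cannot pay, the box becomes a leaf block."""
    lengths = [hi - lo for lo, hi in box]
    cost = rng.exponential(1.0 / sum(lengths))    # scale = 1 / rate
    if cost > budget:
        return [box]                              # budget exhausted: leaf
    # Pick the cut dimension proportional to side length, then cut uniformly.
    dim = rng.choice(len(box), p=np.array(lengths) / sum(lengths))
    lo, hi = box[dim]
    cut = rng.uniform(lo, hi)
    left = list(box);  left[dim] = (lo, cut)
    right = list(box); right[dim] = (cut, hi)
    remaining = budget - cost
    return (sample_mondrian(left, remaining, rng)
            + sample_mondrian(right, remaining, rng))

rng = np.random.default_rng(0)
blocks = sample_mondrian([(0.0, 1.0), (0.0, 1.0)], budget=2.0, rng=rng)
```

The resulting leaf blocks tile the unit square, which is the block structure that relational models then populate with entities.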
MONDRIAN HIDDEN MARKOV MODEL FOR MUSIC SIGNAL PROCESSING
Abstract
This paper discusses a new extension of hidden Markov models that can capture clusters embedded in transitions between the hidden states. In our model, the state-transition matrices are viewed as representations of relational data reflecting a network structure between the hidden states. We specifically present a nonparametric Bayesian approach to the proposed state-space model, whose network structure is represented by a Mondrian-process-based relational model. We show an application of the proposed model to music signal analysis through some experimental results. Index Terms — Bayesian nonparametrics, hidden Markov model, Mondrian process
Poisson Latent Feature Calculus for Generalized Indian Buffet Processes
, 2014
Abstract
The purpose of this work is to describe a unified, and indeed simple, mechanism for nonparametric Bayesian analysis, construction and generative sampling of a large class of latent feature models which one can describe as generalized notions of Indian Buffet Processes (IBP). This is done via the Poisson process calculus as it now relates to latent feature models. The IBP, first arising in a Bayesian machine learning context, was ingeniously devised by Griffiths and Ghahramani in 2005, and its generative scheme is cast in terms of customers sequentially entering an Indian buffet restaurant and selecting previously sampled dishes as well as new dishes. In this metaphor, dishes correspond to latent features, attributes, or preferences shared by individuals. The IBP and its generalizations represent an exciting class of models well suited to handle the high-dimensional statistical problems now common in this information age. In a survey article, Griffiths and Ghahramani note applications to choice models, modeling protein interactions, independent components analysis, and sparse factor ...
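The generative scheme described in this abstract can be sketched directly: customer i takes each previously sampled dish with probability proportional to its popularity, then tries Poisson(alpha / i) new dishes. This is a minimal sketch of the standard IBP culinary metaphor, with the hyperparameter name `alpha` and the binary-matrix output chosen here for illustration.

```python
import numpy as np

def sample_ibp(num_customers, alpha, seed=0):
    """Sample a binary customer-by-dish matrix from the Indian buffet
    process. counts[k] tracks how many customers have taken dish k."""
    rng = np.random.default_rng(seed)
    counts = []                                   # popularity of each dish
    rows = []                                     # dishes chosen per customer
    for i in range(1, num_customers + 1):
        # Take each existing dish k with probability counts[k] / i.
        chosen = {k for k, m in enumerate(counts) if rng.random() < m / i}
        for k in chosen:
            counts[k] += 1
        # Try a Poisson(alpha / i) number of brand-new dishes.
        for _ in range(rng.poisson(alpha / i)):
            counts.append(1)
            chosen.add(len(counts) - 1)
        rows.append(chosen)
    Z = np.zeros((num_customers, len(counts)), dtype=int)
    for i, chosen in enumerate(rows):
        Z[i, list(chosen)] = 1
    return Z

Z = sample_ibp(20, 2.0, seed=1)                   # rows: customers, cols: dishes
```

Each row of `Z` is a customer's binary feature vector; the number of columns is random, which is exactly the "number of features determined from the data" property the listing's first abstract mentions.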