Results 1–10 of 14
Dependent Hierarchical Beta Process for Image Interpolation and Denoising
Abstract

Cited by 7 (4 self)
A dependent hierarchical beta process (dHBP) is developed as a prior for data that may be represented in terms of a sparse set of latent features, with covariate-dependent feature usage. The dHBP is applicable to general covariates and data models, imposing that signals with similar covariates are likely to be manifested in terms of similar features. Coupling the dHBP with the Bernoulli process, and upon marginalizing out the dHBP, the model may be interpreted as a covariate-dependent hierarchical Indian buffet process. As applications, we consider interpolation and denoising of an image, with covariates defined by the location of image patches within an image. Two types of noise models are considered: (i) typical white Gaussian noise; and (ii) spiky noise of arbitrary amplitude, distributed uniformly at random. In these examples, the features correspond to the atoms of a dictionary, learned based upon the data under test (without a priori training data). State-of-the-art performance is demonstrated, with efficient inference using hybrid Gibbs, Metropolis-Hastings, and slice sampling.
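The beta-Bernoulli connection mentioned here reduces, in the non-covariate case, to the classical Indian buffet process. A minimal sketch of a plain IBP sampler (generic, not the paper's covariate-dependent variant; all names and hyperparameters are illustrative):

```python
import numpy as np

def sample_ibp(num_customers, alpha, rng):
    """Draw a binary feature-usage matrix Z from a plain Indian buffet process.

    Customer n takes each existing dish k with probability m_k / n, where m_k
    is the number of prior customers who took it, then samples
    Poisson(alpha / n) brand-new dishes.
    """
    counts = []            # counts[k] = customers who have taken dish k
    rows = []
    for n in range(1, num_customers + 1):
        row = [rng.random() < m / n for m in counts]
        for k, taken in enumerate(row):
            if taken:
                counts[k] += 1
        new_dishes = rng.poisson(alpha / n)
        counts.extend([1] * new_dishes)
        row.extend([True] * new_dishes)
        rows.append(row)
    Z = np.zeros((num_customers, len(counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, : len(row)] = row
    return Z

rng = np.random.default_rng(0)
Z = sample_ibp(30, alpha=2.0, rng=rng)
```

Each row of Z is one data point's sparse set of latent features; the dHBP additionally makes the per-feature probabilities vary smoothly with the covariates.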
A stick-breaking construction of the beta process. Technical Report, 2009
Abstract

Cited by 5 (4 self)
We present and derive a new stick-breaking construction of the beta process. The construction is closely related to a special case of the stick-breaking construction of the Dirichlet process (Sethuraman, 1994) applied to the beta distribution. We derive an inference procedure that relies on Monte Carlo integration to reduce the number of parameters to be inferred, and present results on synthetic data, the MNIST handwritten digits data set, and a time-evolving gene expression data set.
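The Sethuraman (1994) construction that this report builds on can be sketched in a few lines (this is the classical Dirichlet-process version, not the beta-process construction the report derives):

```python
import numpy as np

def dp_stick_breaking(alpha, num_sticks, rng):
    """Sethuraman (1994) stick-breaking weights for a Dirichlet process.

    Break a unit-length stick: v_k ~ Beta(1, alpha), and the k-th weight is
    v_k times the stick length remaining after the first k-1 breaks.
    """
    v = rng.beta(1.0, alpha, size=num_sticks)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

rng = np.random.default_rng(0)
w = dp_stick_breaking(alpha=3.0, num_sticks=200, rng=rng)
```

The truncated weights sum to just under one; the report's contribution is an analogous construction whose weights need not sum to one, as required for the beta process.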
Nonparametric Bayesian matrix completion. In SAM, 2010
Abstract

Cited by 4 (0 self)
The Beta-Binomial processes are considered for inferring missing values in matrices. The model moves beyond the low-rank assumption, modeling the matrix columns as residing in a nonlinear subspace. Large-scale problems are considered via efficient Gibbs sampling, yielding predictions as well as a measure of confidence in each prediction. Algorithm performance is considered for several datasets, with encouraging performance relative to existing approaches.
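A hedged sketch of the kind of beta-Bernoulli factor model this abstract describes, showing only the generative side (no Gibbs sampler); all dimensions, hyperparameters, and variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, K = 60, 40, 8                      # rows, columns, latent factors

# Beta-Bernoulli feature usage: pi_k ~ Beta(a, b), z_ik ~ Bernoulli(pi_k).
pi = rng.beta(1.0, 1.0, size=K)
Z = (rng.random((n, K)) < pi).astype(float)   # sparse binary factor usage
W = rng.normal(size=(n, K))              # real-valued factor weights
D = rng.normal(size=(K, p))              # dictionary / factor loadings

# Combining binary usage with weights gives columns in a union of subspaces,
# rather than a single low-rank subspace.
X = (Z * W) @ D + 0.05 * rng.normal(size=(n, p))

# Hide 20% of entries uniformly at random, the matrix-completion setting.
mask = rng.random((n, p)) < 0.8          # True = observed
X_obs = np.where(mask, X, np.nan)
```

Inference would then condition on the observed entries of `X_obs` to predict the masked ones, with posterior samples supplying the per-entry confidence the abstract mentions.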
Graphical Models for Biclustering . . . , 2012
Abstract
The cell coordinates its biological response to the environment partly via the selective synthesis of thousands of unique RNA and protein molecules. Understanding the molecular biology of the cell is thus essential to the advancement of areas such as health care, agriculture, and energy production, but requires the ability to simultaneously acquire information about thousands of molecules in a sample. Recent high-throughput measurement technologies address this concern. While useful, they generate a high volume of data and introduce methodological challenges, effectively shifting the bottleneck in molecular biology research from data acquisition to data analysis. In particular, an important challenge is the genome-wide ...
Books, 2013
Abstract
◮ Scientists co-authoring the same paper
◮ Readers reading the same book
◮ Internet users posting a message on the same forum
◮ Customers buying the same item
Bayesian nonparametrics and the probabilistic approach to modelling
Abstract
... be thought of as a representation of possible data one could predict from a system. The probabilistic approach to modelling uses probability theory to express all aspects of uncertainty in the model. The probabilistic approach is synonymous with Bayesian modelling, which simply uses the rules of probability theory in order to make predictions, compare alternative models, and learn model parameters and structure from data. This simple and elegant framework is most powerful when coupled with flexible probabilistic models. Flexibility is achieved through the use of Bayesian nonparametrics. This article provides an overview of probabilistic modelling and an accessible survey of some of the main tools in Bayesian nonparametrics. The survey covers the use of Bayesian nonparametrics for modelling unknown functions, density estimation, clustering, time series modelling, and representing sparsity, hierarchies, and covariance structure. More specifically, it gives brief non-technical overviews of Gaussian processes, Dirichlet processes, infinite hidden Markov models, Indian buffet processes, Kingman’s coalescent, Dirichlet diffusion trees, and Wishart processes. Key words: probabilistic modelling; Bayesian statistics; nonparametrics; machine learning.
Dependent Normalized Random Measures
Abstract
In this paper we propose two constructions of dependent normalized random measures, a class of nonparametric priors over dependent probability measures. Our constructions, which we call mixed normalized random measures (MNRM) and thinned normalized random measures (TNRM), involve (respectively) weighting and thinning parts of a shared underlying Poisson process before combining them together. We show that both MNRM and TNRM are marginally normalized random measures, resulting in well-understood theoretical properties. We develop marginal and slice samplers for both models, the latter necessary for inference in TNRM. In time-varying topic modeling experiments, both models exhibit superior performance over related dependent models such as the hierarchical Dirichlet process and the spatial normalized Gamma process.
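The weighting-versus-thinning distinction can be illustrated on a finite truncation of the shared atoms (a toy sketch with invented names, not the paper's samplers):

```python
import numpy as np

def mixed_measure(weights, gamma):
    """MNRM-style: rescale shared atom masses by weights, then normalize."""
    w = weights * gamma
    return w / w.sum()

def thinned_measure(weights, keep_prob, rng):
    """TNRM-style: Bernoulli-thin shared atoms, then normalize the survivors."""
    keep = rng.random(len(weights)) < keep_prob
    w = weights * keep
    return w / w.sum()

rng = np.random.default_rng(0)
# Truncated masses of the shared underlying random measure.
shared = rng.gamma(shape=0.5, scale=1.0, size=100)
p_mixed = mixed_measure(shared, gamma=rng.gamma(1.0, 1.0, size=100))
p_thin = thinned_measure(shared, keep_prob=0.5, rng=rng)
```

Weighting keeps every atom but reshapes the distribution; thinning deletes atoms outright, which is what makes TNRM inference require the slice sampler.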
Learning Low-dimensional Signal Models: A Bayesian approach based on incomplete measurements, 2011
Abstract
Sampling, coding, and streaming even the most essential data, e.g., in medical imaging and weather-monitoring applications, produce a data deluge that severely stresses the available analog-to-digital converter, communication bandwidth, and digital-storage resources. Surprisingly, while the ambient data dimension is large in many problems, the relevant information in the data can reside in a much lower-dimensional space. This observation has led to several important theoretical and algorithmic developments under different low-dimensional modeling frameworks, such as compressive sensing (CS) [1], [2], matrix completion [3], [4], and general factor-model representations [5], [6]. These approaches have enabled new measurement systems, tools, and methods for information extraction from dimensionality-reduced or incomplete data. A key aspect of maximizing the potential of such techniques is to develop appropriate data models. In this article, we investigate this challenge from the perspective of nonparametric Bayesian analysis.

Before detailing the Bayesian modeling techniques, we review the form of the measurements. Specifically, we consider measurement systems based on dimensionality reduction, where we linearly project the signal of interest into a lower-dimensional space via

y = Ux + ε.  (1)

The signal is x ∈ R^d, the measurements are y ∈ R^{d′}, U is a d′ × d matrix with d′ < d, and ε accounts for noise. Such a projection process loses signal information in general, since U has a nontrivial null space. Hence, there has been significant interest over the last few decades in finding dimensionality reductions that preserve as much information as possible in the incomplete measurements y about certain signals x. One way to preserve information is for U to provide a stable embedding that approximately preserves pairwise distances between all signals in some set of interest. In some cases, this property allows the recovery of x from its measurement y.
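The measurement model above can be sketched with a random Gaussian U, which with high probability approximately preserves norms and pairwise distances (a generic compressive-measurement illustration; the dimensions and noise level are arbitrary choices, not values from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
d, d_prime = 1000, 200                       # ambient and measurement dimensions

# Entries N(0, 1/d') make U an approximate isometry on low-dimensional sets.
U = rng.normal(scale=1.0 / np.sqrt(d_prime), size=(d_prime, d))
x = rng.normal(size=d)                       # signal of interest
eps = 0.01 * rng.normal(size=d_prime)        # measurement noise
y = U @ x + eps                              # the model y = Ux + eps

# Norm is approximately preserved despite the 5x dimensionality reduction.
ratio = np.linalg.norm(U @ x) / np.linalg.norm(x)
```

Because U has a nontrivial null space, recovering x from y still requires a signal model (sparsity, low rank, or the nonparametric Bayesian priors developed in the article) to single out the intended preimage.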