Results 11-20 of 142
Sampling binary contingency tables with a greedy start
 Preprint, 2005
Abstract

Cited by 21 (6 self)
We study the problem of counting and randomly sampling binary contingency tables. For given row and column sums, we are interested in approximately counting (or sampling) 0/1 n×m matrices with the specified row/column sums. We present a simulated annealing algorithm with running time O((nm)^2 D^3 d_max log^5(n + m)) for any row/column sums, where D is the number of nonzero entries and d_max is the maximum row/column sum. In the worst case, the running time of the algorithm is O(n^11 log^5 n) for an n × n matrix. This is the first algorithm to directly solve binary contingency tables for all row/column sums. Previous work reduced the problem to the permanent, or restricted attention to row/column sums that are close to regular. The interesting aspect of our simulated annealing algorithm is that it starts at a nontrivial instance, whose solution relies on the existence of short alternating paths in the graph constructed by a particular greedy algorithm.
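The annealing algorithm itself is involved, but the margin-preserving moves it builds on are easy to sketch. The following is a minimal illustration (our own construction, not the paper's algorithm): a Markov chain on 0/1 matrices that repeatedly flips 2×2 "checkerboard" submatrices, which leaves every row and column sum unchanged.

```python
import random

def swap_step(M, rng):
    """Pick two rows and two columns; if the induced 2x2 submatrix is a
    checkerboard ([[1,0],[0,1]] or [[0,1],[1,0]]), flip it. This move
    preserves all row sums and all column sums."""
    i1, i2 = rng.sample(range(len(M)), 2)
    j1, j2 = rng.sample(range(len(M[0])), 2)
    a, b = M[i1][j1], M[i1][j2]
    c, d = M[i2][j1], M[i2][j2]
    if a == d and b == c and a != b:
        M[i1][j1], M[i1][j2] = b, a
        M[i2][j1], M[i2][j2] = d, c

def sample_table(M, steps=1000, seed=0):
    """Run the swap chain for a number of steps, mutating M in place."""
    rng = random.Random(seed)
    for _ in range(steps):
        swap_step(M, rng)
    return M
```

Without an annealing schedule or a greedy start, this plain swap chain can mix slowly for skewed margins, which is precisely the gap the paper's algorithm addresses.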
Gibbs sampling, exponential families and orthogonal polynomials
 Statistical Science, 2008
Abstract

Cited by 19 (6 self)
We give families of examples where sharp rates of convergence to stationarity of the widely used Gibbs sampler are available. The examples involve standard exponential families and their conjugate priors. In each case, the transition operator is explicitly diagonalizable, with classical orthogonal polynomials as eigenfunctions. Key words and phrases: Gibbs sampler, running time analyses, exponential families, conjugate priors, location families, orthogonal polynomials, singular value decomposition.
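For a concrete instance of the setting this abstract describes, here is a sketch (our construction, not taken from the paper) of a two-component Gibbs sampler for the Beta-Binomial conjugate pair, one of the standard exponential-family/conjugate-prior examples:

```python
import random

def gibbs_beta_binomial(n=10, a=2.0, b=2.0, iters=5000, seed=0):
    """Alternately draw x | p ~ Binomial(n, p) and p | x ~ Beta(x+a, n-x+b).
    The x-marginal of the stationary distribution is Beta-Binomial(n, a, b)."""
    rng = random.Random(seed)
    p, xs = 0.5, []
    for _ in range(iters):
        x = sum(rng.random() < p for _ in range(n))  # Binomial(n, p) draw
        p = rng.betavariate(x + a, n - x + b)        # conjugate Beta update
        xs.append(x)
    return xs
```

For chains like this, the paper's orthogonal-polynomial diagonalization yields exact convergence rates, rather than the empirical check a simulation provides.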
Preconditioning of Markov Chain Monte Carlo simulations using coarse-scale models
 SIAM J. Sci. Comput., 2006
Abstract

Cited by 19 (6 self)
We study the preconditioning of Markov Chain Monte Carlo (MCMC) methods using coarse-scale models, with applications to subsurface characterization. The purpose of preconditioning is to reduce the fine-scale computational cost and increase the acceptance rate in the MCMC sampling. This goal is achieved by generating Markov chains based on two-stage computations. In the first stage, a new proposal is first tested by the coarse-scale model based on the multiscale finite-volume method. The full fine-scale computation will be conducted only if the proposal passes the coarse-scale screening. For more efficient simulations, an approximation of the full fine-scale computation using precomputed multiscale basis functions can also be used. Compared with the regular MCMC method, the preconditioned MCMC method generates a modified Markov chain by incorporating the coarse-scale information of the problem. The conditions under which the modified Markov chain will converge to the correct posterior distribution are stated in the paper. The validity of these assumptions for our application, and the conditions which would guarantee a high acceptance rate, are also discussed. We note that the coarse-scale models used in the simulations need to be inexpensive, but not necessarily very accurate, as our analysis and numerical simulations demonstrate. We present numerical examples for sampling permeability fields using two-point geostatistics. The Karhunen-Loève expansion is used to represent the realizations of the permeability field conditioned to the dynamic data, such as production data, as well as some static data. Our numerical examples show that the acceptance rate can be increased by more than ten times if MCMC simulations are preconditioned using coarse-scale models.
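The two-stage screening idea can be sketched in a few lines. The code below is a generic one-dimensional illustration of delayed-acceptance Metropolis-Hastings, not the paper's multiscale solver; all function names and parameters are our own. A cheap coarse posterior filters proposals, and only survivors pay for a fine-posterior evaluation.

```python
import math, random

def two_stage_mh(log_post_fine, log_post_coarse, x0, n_steps=2000,
                 step=0.5, seed=0):
    """Two-stage (delayed-acceptance) Metropolis-Hastings: each random-walk
    proposal is screened by the cheap coarse posterior first; only the
    survivors are scored with the expensive fine posterior."""
    rng = random.Random(seed)
    x = x0
    lf, lc = log_post_fine(x), log_post_coarse(x)
    chain = []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)
        lcy = log_post_coarse(y)
        # Stage 1: accept/reject against the coarse model only.
        if math.log(rng.random()) < lcy - lc:
            lfy = log_post_fine(y)
            # Stage 2: correct for the coarse approximation so the chain
            # still targets the fine posterior exactly.
            if math.log(rng.random()) < (lfy - lf) - (lcy - lc):
                x, lf, lc = y, lfy, lcy
        chain.append(x)
    return chain
```

The second-stage ratio divides out the coarse screening, so the chain still targets the fine posterior; this mirrors the convergence conditions the paper discusses.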
Multimodal multispeaker probabilistic tracking in meetings
 in Proc. Int. Conf. on Multimodal Interfaces (ICMI), 2005
Abstract

Cited by 15 (7 self)
Tracking speakers in multiparty conversations constitutes a fundamental task for automatic meeting analysis. In this paper, we present a probabilistic approach to jointly track the location and speaking activity of multiple speakers in a multi-sensor meeting room, equipped with a small microphone array and multiple uncalibrated cameras. Our framework is based on a mixed-state dynamic graphical model defined on a multi-person state space, which includes the explicit definition of a proximity-based interaction model. The model integrates audio-visual (AV) data through a novel observation model. Audio observations are derived from a source localization algorithm. Visual observations are based on models of the shape and spatial structure of human heads. Approximate inference in our model, needed given its complexity, is performed with a Markov chain Monte Carlo particle filter (MCMC-PF), which results in high sampling efficiency. We present results, based on an objective evaluation procedure, that show that our framework (1) is capable of locating and tracking the position and speaking activity of multiple meeting participants engaged in real conversations with good accuracy; (2) can deal with cases of visual clutter and partial occlusion; and (3) significantly outperforms a traditional sampling-based approach.
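As a toy illustration of MCMC-based particle filtering (a one-dimensional stand-in for the paper's multi-person tracker; the names, motion model, and noise levels are all our own assumptions), the update below replaces importance resampling with a short Metropolis chain over pairs (parent particle, predicted state):

```python
import math, random

def mcmc_pf_step(particles, obs, rng, burn=50, q_std=0.5, obs_std=1.0):
    """One MCMC particle-filter update in 1D. The chain state is a pair
    (parent, predicted state); proposing a fresh parent uniformly and a
    state from the motion model N(parent, q_std) makes the prior terms
    cancel, so the Metropolis ratio reduces to a likelihood ratio."""
    x = rng.choice(particles) + rng.gauss(0.0, q_std)
    out = []
    for t in range(burn + len(particles)):
        x_new = rng.choice(particles) + rng.gauss(0.0, q_std)
        log_ratio = ((obs - x) ** 2 - (obs - x_new) ** 2) / (2 * obs_std ** 2)
        if math.log(rng.random()) < log_ratio:
            x = x_new
        if t >= burn:
            out.append(x)
    return out
```

The sampling-efficiency argument in the abstract is about exactly this kind of move-based update scaling better than plain importance resampling as the joint state space grows.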
Sampling the posterior: An approach to non-Gaussian data assimilation
 2006
Abstract

Cited by 15 (7 self)
The viewpoint taken in this paper is that data assimilation is fundamentally a statistical problem and that this problem should be cast in a Bayesian framework. In the absence of model error, the correct solution to the data assimilation problem is to find the posterior distribution implied by this Bayesian setting. Methods for dealing with data assimilation should then be judged by their ability to probe this distribution. In this paper we propose a range of techniques for probing the posterior distribution, based on the Langevin equation, and we compare these new techniques with existing methods. When the underlying dynamics is deterministic, the posterior distribution is on the space of initial conditions, leading to a sampling problem over this space. When the underlying dynamics is stochastic, the posterior distribution is on the space of continuous time paths. By writing down a density, and conditioning on observations, it is possible to define
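One member of this family of Langevin-based techniques, the Metropolis-adjusted Langevin algorithm (MALA), can be sketched on a scalar target. This is a generic illustration under our own naming and parameter choices, not the paper's data-assimilation implementation:

```python
import math, random

def mala(grad_log_pi, log_pi, x0, eps=0.1, n_steps=5000, seed=0):
    """Metropolis-adjusted Langevin algorithm: propose a discretized
    Langevin step x' = x + eps*grad_log_pi(x) + sqrt(2*eps)*xi, then remove
    the discretization error with a Metropolis-Hastings accept/reject."""
    rng = random.Random(seed)
    x = x0
    chain = []
    for _ in range(n_steps):
        mean_x = x + eps * grad_log_pi(x)
        y = mean_x + math.sqrt(2.0 * eps) * rng.gauss(0.0, 1.0)
        mean_y = y + eps * grad_log_pi(y)
        # Gaussian proposal log-densities; shared normalizing constants cancel.
        log_q_xy = -((y - mean_x) ** 2) / (4.0 * eps)
        log_q_yx = -((x - mean_y) ** 2) / (4.0 * eps)
        if math.log(rng.random()) < log_pi(y) - log_pi(x) + log_q_yx - log_q_xy:
            x = y
        chain.append(x)
    return chain
```

In the paper's setting, the state is an initial condition or a discretized path rather than a scalar, but the update has the same shape.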
A Bayesian Sampling Approach to Indoor Localization of Wireless Devices Using Received Signal Strength Indication
 In Proceedings of the Third IEEE International Conference on Pervasive Computing and Communications (PerCom 2005), Kauai Island, 2005
Abstract

Cited by 15 (0 self)
This paper describes a probabilistic approach to global localization within an indoor environment with minimum infrastructure requirements. Global localization is a flavor of localization in which the device is unaware of its initial position and has to determine it from scratch. Localization is performed based on the Received Signal Strength Indication (RSSI) as the only sensor reading, which is provided by most off-the-shelf wireless network interface cards. Location and orientation estimates are computed using Bayesian filtering on a sample set derived using Monte Carlo sampling. Research leading to the proposed method is outlined, along with results and conclusions from simulations and real-life experiments.
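A minimal Monte Carlo version of this kind of RSSI-based global localization might look as follows. The path-loss constants, access-point layout, and function names are all assumptions for illustration, not values from the paper:

```python
import math, random

def expected_rssi(pos, ap, p0=-40.0, n=2.0):
    """Hypothetical log-distance path-loss model:
    RSSI = p0 - 10*n*log10(distance to access point)."""
    d = max(math.dist(pos, ap), 0.1)  # clamp to avoid log10(0)
    return p0 - 10.0 * n * math.log10(d)

def locate(aps, observed, n_samples=20000, sigma=4.0, area=10.0, seed=0):
    """Global localization by Monte Carlo sampling: draw candidate positions
    uniformly over the floor area and weight each by the Gaussian likelihood
    of the observed RSSI vector; return the weighted-mean position."""
    rng = random.Random(seed)
    wx = wy = wtot = 0.0
    for _ in range(n_samples):
        pos = (rng.uniform(0.0, area), rng.uniform(0.0, area))
        log_w = sum(-(o - expected_rssi(pos, ap)) ** 2 / (2 * sigma ** 2)
                    for o, ap in zip(observed, aps))
        w = math.exp(log_w)
        wx += w * pos[0]
        wy += w * pos[1]
        wtot += w
    return wx / wtot, wy / wtot
```

Resampling this weighted set as the device moves, instead of drawing fresh uniform samples, turns the snapshot estimate into the Bayesian filter the abstract refers to.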
Sequential Monte Carlo for Bayesian Computation
Abstract

Cited by 15 (2 self)
Sequential Monte Carlo (SMC) methods are a class of importance sampling and resampling techniques designed to simulate from a sequence of probability distributions. These approaches have become very popular over the last few years for solving sequential Bayesian inference problems (e.g. Doucet et al. 2001). However, in comparison to Markov chain Monte Carlo (MCMC), the application of SMC remains limited when, in fact, such methods are also appropriate in such contexts (e.g. Chopin (2002); Del Moral et al. (2006)). In this paper, we present a simple unifying framework which allows us to extend both the SMC methodology and its range of applications. Additionally, reinterpreting SMC algorithms as an approximation of nonlinear MCMC kernels, we present alternative SMC and iterative self-interacting approximation (Del Moral & Miclo 2004; 2006) schemes. We demonstrate the performance of the SMC methodology on static and sequential Bayesian inference problems.
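The basic SMC-sampler loop (reweight, resample, move) can be sketched for a tempered sequence of distributions. This is a generic textbook-style sketch under our own choices of bridge, kernel, and parameters, not the framework of the paper:

```python
import math, random

def smc_sampler(log_target, n_particles=500, n_stages=10, seed=0):
    """SMC sampler bridging from a wide Gaussian N(0, 3^2) to the target
    through tempered distributions pi_b ∝ prior^(1-b) * target^b, with
    multinomial resampling and one random-walk Metropolis move per stage."""
    rng = random.Random(seed)
    s0 = 3.0
    log_prior = lambda z: -z * z / (2.0 * s0 * s0)
    xs = [rng.gauss(0.0, s0) for _ in range(n_particles)]
    b_prev = 0.0
    for k in range(1, n_stages + 1):
        b = k / n_stages
        # Reweight by the tempering increment (normalization cancels).
        lws = [(b - b_prev) * (log_target(x) - log_prior(x)) for x in xs]
        m = max(lws)
        ws = [math.exp(lw - m) for lw in lws]
        # Multinomial resampling.
        xs = rng.choices(xs, weights=ws, k=n_particles)
        # One MH rejuvenation move targeting pi_b.
        lp = lambda z, b=b: (1.0 - b) * log_prior(z) + b * log_target(z)
        moved = []
        for x in xs:
            y = x + rng.gauss(0.0, 1.0)
            if math.log(rng.random()) < lp(y) - lp(x):
                x = y
            moved.append(x)
        xs = moved
        b_prev = b
    return xs
```

The rejuvenation move after resampling is the MCMC-kernel ingredient whose reinterpretation the paper builds on.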
Theoretical and numerical comparison of some sampling methods for molecular dynamics
 2005
Abstract

Cited by 14 (2 self)
The purpose of the present article is to compare different phase-space sampling methods, such as purely stochastic methods (Rejection method, Metropolized independence sampler, Importance Sampling), stochastically perturbed Molecular Dynamics (Hybrid Monte Carlo, Langevin Dynamics, Biased Random Walk), and purely deterministic methods (Nosé-Hoover chains, Nosé-Poincaré and Recursive Multiple Thermostats (RMT) methods). After recalling some theoretical convergence properties for the various methods, we provide some new convergence results for the Hybrid Monte Carlo scheme, requiring weaker (and easier to check) conditions than previously known conditions. We then turn to the numerical efficiency of the sampling schemes for a benchmark model of linear alkane molecules. In particular, the numerical distributions that are generated are compared in a systematic way, on the basis of some quantitative convergence indicators.
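Of the stochastically perturbed dynamics compared here, Hybrid Monte Carlo is the easiest to sketch. Below is a generic scalar implementation with a leapfrog integrator; the parameters and names are our own, and the paper of course works with molecular potentials rather than toy densities:

```python
import math, random

def hmc(log_pi, grad_log_pi, x0, eps=0.15, n_leap=10, n_iter=2000, seed=0):
    """Hybrid (Hamiltonian) Monte Carlo: draw a Gaussian momentum, integrate
    Hamilton's equations with a leapfrog scheme, and accept or reject on the
    total-energy change so the discretization error is removed exactly."""
    rng = random.Random(seed)
    x = x0
    chain = []
    for _ in range(n_iter):
        p = rng.gauss(0.0, 1.0)
        x_new, p_new = x, p
        p_new += 0.5 * eps * grad_log_pi(x_new)      # initial half kick
        for _ in range(n_leap - 1):
            x_new += eps * p_new                     # drift
            p_new += eps * grad_log_pi(x_new)        # full kick
        x_new += eps * p_new
        p_new += 0.5 * eps * grad_log_pi(x_new)      # final half kick
        # Metropolis test on H = -log_pi(x) + p^2 / 2.
        d_h = (log_pi(x_new) - 0.5 * p_new ** 2) - (log_pi(x) - 0.5 * p ** 2)
        if math.log(rng.random()) < d_h:
            x = x_new
        chain.append(x)
    return chain
```

The Metropolis test on the total energy is what makes the scheme sample the exact target despite the approximate integration; it is this kind of scheme the paper's new convergence results cover.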
Parsing Images into Regions, Curves, and Curve Groups
Abstract

Cited by 12 (4 self)
In this paper, we present an algorithm for parsing natural images into middle-level vision representations: regions, curves, and curve groups (parallel curves and trees). This algorithm is targeted at an integrated solution to image segmentation and curve grouping through Bayesian inference. The paper makes the following contributions. (1) It adopts a layered (or 2.1D-sketch) representation integrating both region and curve models, which compete to explain an input image. The curve layer occludes the region layer, and curves observe a partial-order occlusion relation. (2) A Markov chain search scheme, Metropolized Gibbs Samplers (MGS), is studied. It consists of several pairs of reversible jumps to traverse the complex solution space. An MGS proposes the next state within the jump scope of the current state according to a conditional probability, like a Gibbs sampler, and then accepts the proposal with a Metropolis-Hastings step. This paper discusses systematic design strategies for devising reversible jumps for a complex inference task. (3) The proposal probability ratios in jumps are factorized into ratios of discriminative probabilities. The latter are computed in a bottom-up process, and they drive the Markov chain dynamics in a data-driven Markov chain Monte Carlo framework. We demonstrate the performance of the algorithm in experiments with a number of natural images.
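The Metropolized Gibbs move in contribution (2) has a simple discrete prototype, in the form popularized by Liu (this sketch is our illustration, not the paper's image-parsing jumps): propose a new value for one coordinate from its conditional distribution restricted to exclude the current value, then accept with a Metropolis-Hastings step.

```python
import random

def metropolized_gibbs_step(i, probs, rng):
    """Metropolized Gibbs move for one discrete coordinate: propose j != i
    with probability proportional to probs[j], then accept with probability
    min(1, (1 - probs[i]) / (1 - probs[j])). Detailed balance with respect
    to probs follows because the product is symmetric in i and j."""
    others = [j for j in range(len(probs)) if j != i]
    j = rng.choices(others, weights=[probs[k] for k in others], k=1)[0]
    if rng.random() < min(1.0, (1.0 - probs[i]) / (1.0 - probs[j])):
        return j
    return i
```

Forcing the proposal away from the current value is what gives the Metropolized step its edge over a plain Gibbs draw, which can waste moves re-sampling the current state.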
Bayesian Hierarchical Modeling for Integrating Low-Accuracy and High-Accuracy Experiments
 Technometrics, 2008
Abstract

Cited by 12 (2 self)
Standard practice in analyzing data from different types of experiments is to treat data from each type separately. By borrowing strength across multiple sources, an integrated analysis can produce better results. Careful adjustments need to be made to incorporate the systematic differences among various experiments. To this end, some Bayesian hierarchical Gaussian process (BHGP) models are proposed. The heterogeneity among different sources is accounted for by performing flexible location and scale adjustments. The approach tends to produce predictions closer to those from the high-accuracy experiment. The Bayesian computations are aided by the use of Markov chain Monte Carlo and Sample Average Approximation algorithms. The proposed method is illustrated with two examples: one with detailed and approximate finite-element simulations for mechanical material design, and the other with physical and computer experiments for modeling a food processor.
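The location-scale adjustment that links the sources can be illustrated with a deliberately simplified, non-Bayesian stand-in: a least-squares fit of y_hi ≈ rho * y_lo + delta at common inputs. In the paper, the scale and location terms instead carry Gaussian-process priors and are sampled by MCMC; everything below is our own naming.

```python
def fit_adjustment(y_lo, y_hi):
    """Least-squares location-scale link between paired low-accuracy and
    high-accuracy responses: returns (rho, delta) with y_hi ≈ rho*y_lo + delta."""
    n = len(y_lo)
    mx = sum(y_lo) / n
    my = sum(y_hi) / n
    sxx = sum((x - mx) ** 2 for x in y_lo)
    sxy = sum((x - mx) * (y - my) for x, y in zip(y_lo, y_hi))
    rho = sxy / sxx
    delta = my - rho * mx
    return rho, delta

def adjust(y_lo, rho, delta):
    """Map a low-accuracy prediction onto the high-accuracy scale."""
    return rho * y_lo + delta
```

Replacing the constant delta with an input-dependent discrepancy term is the step that leads from this sketch toward the hierarchical models of the paper.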