Results 1–10 of 242
An Introduction to MCMC for Machine Learning
, 2003
Abstract

Cited by 222 (2 self)
The purpose of this introductory paper is threefold. First, it introduces the Monte Carlo method with emphasis on probabilistic machine learning. Second, it reviews the main building blocks of modern Markov chain Monte Carlo simulation, thereby providing an introduction to the remaining papers of this special issue. Lastly, it discusses interesting new research horizons.
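As a concrete illustration of the Markov chain Monte Carlo machinery this paper introduces, here is a minimal random-walk Metropolis sampler (a toy sketch of our own, not code from the paper; the target density and all names are illustrative):

```python
import math
import random

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a 1-D target density.
    log_target returns the log of the (unnormalised) target; the result
    is a list of n_samples correlated draws from that target."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n_samples):
        x_prop = x + rng.gauss(0.0, step)        # symmetric proposal
        lp_prop = log_target(x_prop)
        # Accept with probability min(1, target(x') / target(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = x_prop, lp_prop
        samples.append(x)
    return samples

# Sample a standard normal and inspect the empirical moments.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
burned = draws[5000:]                            # discard burn-in
mean = sum(burned) / len(burned)
var = sum((d - mean) ** 2 for d in burned) / len(burned)
```

With a step size comparable to the target's scale, the acceptance rate lands in a healthy range; in practice one monitors acceptance and autocorrelation rather than trusting raw draws.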
Modelling and interpretation of architecture from several images
Abstract

Cited by 83 (6 self)
The modelling of 3-dimensional (3D) environments has become a requirement for many applications in engineering design, virtual reality, visualisation and entertainment. However, the scale and complexity demanded from such models has risen to the point where the acquisition of 3D models can require a vast amount of specialist time and equipment. Because of this, much research has been undertaken in the computer vision community into automating all or part of the process of acquiring a 3D model from a sequence of images. This thesis focuses specifically on the automatic acquisition of architectural models from short image sequences. An architectural model is defined as a set of planes corresponding to walls which contain a variety of labelled primitives such as doors and windows. As well as a label defining its type, each primitive contains parameters defining its shape and texture. The key advantage of this representation is that the model defines not only geometry and texture, but also an interpretation of the scene. This is crucial as it enables reasoning about the scene; for instance, structure and texture can be inferred in areas of the model which are unseen in any
Monte Carlo Methods for Tempo Tracking and Rhythm Quantization
 JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH
, 2003
Abstract

Cited by 54 (9 self)
We present a probabilistic generative model for timing deviations in expressive music performance. The structure of the proposed model is equivalent to a switching state space model. The switch variables correspond to discrete note locations as in a musical score. The continuous hidden variables denote the tempo. We formulate two well-known music recognition problems, namely tempo tracking and automatic transcription (rhythm quantization), as filtering and maximum a posteriori (MAP) state estimation tasks. Exact computation of posterior features such as the MAP state is intractable in this model class, so we introduce Monte Carlo methods for integration and optimization. We compare Markov chain Monte Carlo (MCMC) methods (such as Gibbs sampling, simulated annealing and iterative improvement) with sequential Monte Carlo methods (particle filters). Our simulation results favour the sequential methods. The methods can be applied in both online and batch scenarios, such as tempo tracking and transcription, and are thus potentially useful in a number of music applications such as adaptive automatic accompaniment, score typesetting and music information retrieval.
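The sequential Monte Carlo (particle filter) approach the abstract favours can be sketched on a toy linear-Gaussian tracking model (our own illustration; the paper's switching state space model for tempo is more elaborate, and all parameters here are made up):

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles=500, a=0.95,
                              q=0.5, r=0.5, seed=0):
    """Bootstrap particle filter for the toy model
        x_t = a * x_{t-1} + N(0, q^2),   y_t = x_t + N(0, r^2).
    Returns the filtered mean E[x_t | y_1..t] at each step."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in observations:
        # Propagate every particle through the transition model.
        particles = [a * x + rng.gauss(0.0, q) for x in particles]
        # Weight each particle by the observation likelihood.
        weights = [math.exp(-0.5 * ((y - x) / r) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        means.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling to combat weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return means

# Simulate a trajectory, observe it noisily, and filter.
rng = random.Random(1)
xs, x = [], 0.0
for _ in range(100):
    x = 0.95 * x + rng.gauss(0.0, 0.5)
    xs.append(x)
ys = [x + rng.gauss(0.0, 0.5) for x in xs]
est = bootstrap_particle_filter(ys)
rmse = math.sqrt(sum((e - t) ** 2 for e, t in zip(est, xs)) / len(xs))
```

The filtered estimate should beat the raw observations (RMSE below the observation noise level), which is the basic promise of sequential Monte Carlo before any model-specific structure is added.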
FINDING STRUCTURE WITH RANDOMNESS: PROBABILISTIC ALGORITHMS FOR CONSTRUCTING APPROXIMATE MATRIX DECOMPOSITIONS
Abstract

Cited by 40 (0 self)
Low-rank matrix approximations, such as the truncated singular value decomposition and the rank-revealing QR decomposition, play a central role in data analysis and scientific computing. This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation. These techniques exploit modern computational architectures more fully than classical methods and open the possibility of dealing with truly massive data sets. This paper presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions. These methods use random sampling to identify a subspace that captures most of the action of a matrix. The input matrix is then compressed, either explicitly or implicitly, to this subspace, and the reduced matrix is manipulated deterministically to obtain the desired low-rank factorization. In many cases, this approach beats its classical competitors in terms of accuracy, speed, and robustness. These claims are supported by extensive numerical experiments and a detailed error analysis. The specific benefits of randomized techniques depend on the computational environment. Consider the model problem of finding the k dominant components of the singular value decomposition
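The core of the framework, random sampling to identify a subspace capturing most of the action of a matrix, can be sketched in a few lines. This is a pure-Python toy of our own (real implementations use LAPACK-backed QR on an oversampled Gaussian sample matrix; the helper names are ours):

```python
import random

def matmul(A, B):
    """Dense matrix product of two lists-of-rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def gram_schmidt(vectors):
    """Orthonormalise vectors (modified Gram-Schmidt), dropping any
    that are numerically dependent on the ones already kept."""
    Q = []
    for v in vectors:
        orig_norm = sum(vi * vi for vi in v) ** 0.5
        w = v[:]
        for q in Q:
            dot = sum(wi * qi for wi, qi in zip(w, q))
            w = [wi - dot * qi for wi, qi in zip(w, q)]
        norm = sum(wi * wi for wi in w) ** 0.5
        if norm > 1e-8 * orig_norm:
            Q.append([wi / norm for wi in w])
    return Q

def randomized_range_finder(A, k, oversample=5, seed=0):
    """Stage A of the framework: find an orthonormal basis Q with
    A ~ Q Q^T A. Draw a Gaussian test matrix, form the sample
    Y = A @ Omega, and orthonormalise its columns."""
    rng = random.Random(seed)
    n = len(A[0])
    omega = [[rng.gauss(0.0, 1.0) for _ in range(k + oversample)]
             for _ in range(n)]
    Y = matmul(A, omega)               # samples of the range of A
    return gram_schmidt(transpose(Y))  # each entry = one basis vector

# Exactly rank-2 test matrix: columns are combinations of u1 and u2.
u1, u2 = [1.0, 2.0, 3.0, 4.0], [1.0, -1.0, 1.0, -1.0]
A = [[u1[i] + 2.0 * u2[i] * j for j in range(6)] for i in range(4)]
Q = randomized_range_finder(A, k=2)
QT_A = matmul(Q, A)                    # rows of Q are the basis: Q^T A
approx = matmul(transpose(Q), QT_A)    # Q (Q^T A)
err = max(abs(A[i][j] - approx[i][j]) for i in range(4) for j in range(6))
```

For an exactly rank-2 input the reconstruction error comes out at round-off level; the framework's Stage B then works deterministically on the small matrix Q^T A to produce the desired factorization.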
Monte Carlo sampling of solutions to inverse problems
 J. geophys. Res
, 1995
Abstract

Cited by 39 (7 self)
Probabilistic formulation of inverse problems leads to the definition of a probability distribution in the model space. This probability distribution combines a priori information with new information obtained by measuring some observable parameters (data). As, in the general case, the theory linking data with model parameters is nonlinear, the a posteriori probability in the model space may not be easy to describe (it may be multimodal, some moments may not be defined, etc.). When analyzing an inverse problem, obtaining a maximum likelihood model is usually not sufficient, as we normally also wish to have information on the resolution power of the data. In the general case we may have a large number of model parameters, and an inspection of the marginal probability densities of interest may be impractical, or even useless. But it is possible to pseudo-randomly generate a large collection of models according to the posterior probability distribution and to analyze and display the models in such a way that information on the relative likelihoods of model properties is conveyed to the spectator. This can be accomplished by means of an efficient Monte Carlo method, even in cases where no explicit formula for the a priori distribution is available. The best-known importance sampling method, the Metropolis algorithm, can be generalized, giving a method that allows analysis of (possibly highly nonlinear) inverse problems with complex a priori information and data with an arbitrary noise distribution.
Latin Hypercube Sampling and the Propagation of Uncertainty in Analyses of Complex Systems
, 2002
Frailty Correlated Default
, 2008
Abstract

Cited by 33 (2 self)
This paper shows that the probability of extreme default losses on portfolios of U.S. corporate debt is much greater than would be estimated under the standard assumption that default correlation arises only from exposure to observable risk factors. At the high confidence levels at which bank loan portfolio and CDO default losses are typically measured for economic-capital and rating purposes, our empirical results indicate that conventionally based estimates are downward biased by a full order of magnitude on test portfolios. Our estimates are based on U.S. public non-financial firms existing between 1979 and 2004. We find strong evidence for the presence of common latent factors, even when controlling for observable factors that provide the most accurate available model of firm-by-firm default probabilities.
∗ We are grateful for financial support from Moody's Corporation and Morgan Stanley, and for research assistance from Sabri Oncu and Vineet Bhagwat. We are also grateful for remarks from Torben Andersen, André Lucas, Richard Cantor, Stav Gaon, Tyler Shumway, and especially Michael Johannes. This revision is much improved because of suggestions by a referee, an associate editor, and Campbell Harvey. We are thankful to Moody's and to Ed Altman for generous assistance with data. Duffie is at The Graduate School of Business, Stanford University. Eckner and Horel are at Merrill Lynch. Saita is at Lehman
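The effect the abstract describes, a common latent (frailty) factor fattening the tail of the portfolio loss distribution while leaving the average default rate roughly unchanged, can be illustrated with a deliberately crude simulation (our own toy model, not the paper's estimated model; every parameter is made up):

```python
import math
import random

def simulate_losses(n_firms=500, n_trials=2000, base_pd=0.02,
                    frailty_vol=1.0, seed=0):
    """Toy portfolio-loss simulation. Each trial draws one common latent
    'frailty' factor Z; conditional on Z, every firm defaults
    independently with probability
        base_pd * exp(frailty_vol * Z - frailty_vol**2 / 2),
    mean-corrected so the unconditional PD stays near base_pd.
    Returns the sorted default counts across trials."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n_trials):
        z = rng.gauss(0.0, 1.0)
        pd = min(1.0, base_pd * math.exp(frailty_vol * z
                                         - frailty_vol ** 2 / 2))
        # Conditional on Z, defaults are iid: Binomial(n_firms, pd).
        defaults = sum(rng.random() < pd for _ in range(n_firms))
        losses.append(defaults)
    return sorted(losses)

def quantile(sorted_xs, q):
    return sorted_xs[int(q * (len(sorted_xs) - 1))]

with_frailty = simulate_losses(frailty_vol=1.0)
no_frailty = simulate_losses(frailty_vol=0.0)
# Compare the 99th-percentile loss with and without the common factor.
tail_with = quantile(with_frailty, 0.99)
tail_without = quantile(no_frailty, 0.99)
```

Even in this crude setup the 99th-percentile loss is several times larger once the common factor is present, which is the qualitative point behind the paper's order-of-magnitude bias finding.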
Diagnosis by a waiter and a Mars explorer
 Invited paper, Proceedings of the IEEE, special issue
, 2004
Abstract

Cited by 26 (3 self)
This paper shows how state-of-the-art state estimation techniques can be used to provide efficient solutions to the difficult problem of real-time diagnosis in mobile robots. The power of the adopted estimation techniques resides in our ability to combine particle filters with classical algorithms, such as Kalman filters. We demonstrate these techniques in two scenarios: a mobile waiter robot and planetary rovers designed by NASA for Mars exploration. Keywords: Diagnosis, Rao–Blackwellized particle filtering, robotics, state estimation.
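The combination of particle filters with Kalman filters named in the keywords (Rao-Blackwellized particle filtering) can be sketched on a toy switching model: particles carry only the discrete mode, while the conditionally linear-Gaussian state is handled exactly by a per-particle Kalman filter (our own illustration, not the paper's robot models; all parameters are invented):

```python
import math
import random

def rbpf(ys, drifts=(-0.5, 0.5), p_stay=0.95, q=0.1, r=0.3,
         n_particles=300, seed=0):
    """Rao-Blackwellized particle filter for the toy switching model
        mode_t: two-state Markov chain (stay probability p_stay)
        x_t = x_{t-1} + drifts[mode_t] + N(0, q^2),  y_t = x_t + N(0, r^2).
    Returns P(mode_t = 1 | y_1..t) at each step."""
    rng = random.Random(seed)
    parts = [{"m": rng.randrange(2), "mu": 0.0, "var": 1.0}
             for _ in range(n_particles)]
    probs = []
    for y in ys:
        weights = []
        for p in parts:
            # Sample the discrete mode from its Markov transition.
            if rng.random() > p_stay:
                p["m"] = 1 - p["m"]
            # Kalman predict under the sampled mode.
            mu_pred = p["mu"] + drifts[p["m"]]
            var_pred = p["var"] + q * q
            # Weight by the exact predictive likelihood of y.
            s = var_pred + r * r
            weights.append(math.exp(-0.5 * (y - mu_pred) ** 2 / s)
                           / math.sqrt(s))
            # Kalman update of the per-particle Gaussian.
            k_gain = var_pred / s
            p["mu"] = mu_pred + k_gain * (y - mu_pred)
            p["var"] = (1 - k_gain) * var_pred
        total = sum(weights)
        probs.append(sum(w for w, p in zip(weights, parts)
                         if p["m"] == 1) / total)
        # Resample (copy dicts so particles evolve independently).
        parts = [dict(p) for p in
                 rng.choices(parts, weights=weights, k=n_particles)]
    return probs

# Observations from a trajectory that switches drift sign at t = 30.
rng = random.Random(1)
x, ys = 0.0, []
for t in range(60):
    x += (-0.5 if t < 30 else 0.5) + rng.gauss(0.0, 0.1)
    ys.append(x + rng.gauss(0.0, 0.3))
probs = rbpf(ys)
```

Because the continuous state is integrated out analytically, the particles only have to cover the small discrete space, which is the source of the efficiency the abstract claims for this family of methods.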
FINDING STRUCTURE WITH RANDOMNESS: STOCHASTIC ALGORITHMS FOR CONSTRUCTING APPROXIMATE MATRIX DECOMPOSITIONS
, 2009
Abstract

Cited by 25 (2 self)
Low-rank matrix approximations, such as the truncated singular value decomposition and the rank-revealing QR decomposition, play a central role in data analysis and scientific computing. This work surveys recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation. These techniques exploit modern computational architectures more fully than classical methods and open the possibility of dealing with truly massive data sets. In particular, these techniques offer a route toward principal component analysis (PCA) for petascale data. This paper presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions. These methods use random sampling to identify a subspace that captures most of the action of a matrix. The input matrix is then compressed, either explicitly or implicitly, to this subspace, and the reduced matrix is manipulated deterministically to obtain the desired low-rank factorization. In many cases, this approach beats its classical competitors in terms of accuracy, speed, and robustness. These claims are supported by extensive numerical experiments and a detailed error analysis. The specific benefits of randomized techniques depend on the computational environment. Consider
A probabilistic particle control approach to optimal, robust predictive control
 In Proceedings of the AIAA Guidance, Navigation and Control Conference
, 2006
Abstract

Cited by 21 (7 self)
Autonomous vehicles need to be able to plan trajectories to a specified goal that avoid obstacles and are robust to the inherent uncertainty in the problem. This uncertainty arises due to uncertain state estimation, disturbances and modeling errors. Previous solutions to the robust path planning problem used a finite horizon optimal stochastic control approach, which finds the optimal path subject to chance constraints ensuring that the probability of collision with obstacles is below a given threshold. That approach is limited to problems where all uncertain distributions are Gaussian, and typically results in highly conservative plans. In many cases, however, the Gaussian assumption is invalid; for example in the case of localization, the belief state about a vehicle's position can consist of highly non-Gaussian, even multimodal, distributions. In this paper we present a novel method for finite horizon stochastic control of dynamic systems subject to chance constraints. The method approximates the distribution of the system state using a finite number of particles. By expressing these particles in terms of the control variables, we are able to approximate the original stochastic control problem as a deterministic one; furthermore, the approximation becomes exact as the number of particles tends to infinity. For a general class of chance-constrained problems with linear system dynamics, we show that the approximate problem can be solved using efficient Mixed-Integer Linear Programming techniques. We apply the new method to aircraft control in turbulence, and show simulation results that demonstrate the efficacy of the approach.
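The key idea, replacing a chance constraint by its empirical counterpart over a finite set of particles, can be shown in a single control step (our own toy; a coarse grid search stands in for the paper's Mixed-Integer Linear Programming formulation, and the disturbance model is invented):

```python
import random

def chance_constrained_control(n_particles=2000, p_max=0.1, seed=0):
    """Particle approximation of a chance-constrained control step:
        x = u + w,  w drawn from a bimodal (non-Gaussian) disturbance,
    requiring P(x < 1) <= p_max while minimising the control effort u.
    The chance constraint is replaced by its empirical version: at most
    p_max * N of the N sampled disturbance particles may give x < 1."""
    rng = random.Random(seed)
    # Bimodal disturbance: a two-component Gaussian mixture, for which
    # a Gaussian chance-constraint formulation would be wrong.
    ws = [rng.gauss(-1.0, 0.2) if rng.random() < 0.3 else
          rng.gauss(0.2, 0.2) for _ in range(n_particles)]
    for step in range(400):            # search upward, smallest u first
        u = step * 0.01
        violations = sum(u + w < 1.0 for w in ws)
        if violations <= p_max * n_particles:
            return u, violations / n_particles
    raise ValueError("no feasible control in the search range")

u, p_viol = chance_constrained_control()
```

In the paper the particles are expressed as affine functions of the control variables, so the same empirical constraint becomes a set of mixed-integer linear constraints; as the number of particles grows, the empirical constraint converges to the true chance constraint.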