Results 1–10 of 321
An integrated Bayesian approach to layer extraction from image sequences
 IEEE Transactions on Pattern Analysis and Machine Intelligence
, 2001
Abstract

Cited by 123 (20 self)
This paper describes a Bayesian approach for modeling 3D scenes as a collection of approximately planar layers that are arbitrarily positioned and oriented in the scene. In contrast to much of the previous work on layer-based motion modeling, which computes layered descriptions of 2D image motion, our work leads to a 3D description of the scene. There are two contributions within the paper. The first is to formulate the prior assumptions about the layers and scene within a Bayesian decision-making framework, which is used to automatically determine the number of layers and the assignment of individual pixels to layers. The second is algorithmic. In order to achieve the optimization, a Bayesian version of RANSAC is developed with which to initialize the segmentation. Then, a generalized expectation maximization method is used to find the MAP solution. Index Terms: Layer extraction, segmentation, stereo matching, motion estimation.
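The robust initialization step this abstract mentions can be illustrated with a plain (non-Bayesian) RANSAC plane fit; this is a generic sketch, not the paper's Bayesian RANSAC, and all names are illustrative:

```python
import numpy as np

def ransac_plane(points, n_iters=200, inlier_tol=0.05, rng=None):
    """Robustly fit a plane n.x = d to 3-D points by random sampling.

    Returns (normal, d, inlier_mask) for the candidate with the most inliers.
    """
    rng = np.random.default_rng(rng)
    best = (None, None, np.zeros(len(points), dtype=bool))
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:            # degenerate (collinear) sample, skip
            continue
        n /= norm
        d = n @ p0
        inliers = np.abs(points @ n - d) < inlier_tol
        if inliers.sum() > best[2].sum():
            best = (n, d, inliers)
    return best
```

In a layer-extraction setting this candidate plane (and its inlier mask) would seed the pixel-to-layer assignment that EM then refines.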
Modelling and interpretation of architecture from several images
Abstract

Cited by 116 (6 self)
The modelling of 3-dimensional (3D) environments has become a requirement for many applications in engineering design, virtual reality, visualisation and entertainment. However, the scale and complexity demanded from such models have risen to the point where the acquisition of 3D models can require a vast amount of specialist time and equipment. Because of this, much research has been undertaken in the computer vision community into automating all or part of the process of acquiring a 3D model from a sequence of images. This thesis focuses specifically on the automatic acquisition of architectural models from short image sequences. An architectural model is defined as a set of planes corresponding to walls which contain a variety of labelled primitives such as doors and windows. As well as a label defining its type, each primitive contains parameters defining its shape and texture. The key advantage of this representation is that the model defines not only geometry and texture, but also an interpretation of the scene. This is crucial as it enables reasoning about the scene; for instance, structure and texture can be inferred in areas of the model which are unseen in any image.
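The model definition in this abstract (wall planes carrying labelled, parameterized primitives) might be represented as follows; the class and field names are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Primitive:
    label: str                # interpretation, e.g. "door" or "window"
    shape_params: dict        # e.g. {"width": 0.9, "height": 2.0}
    texture_id: int = 0       # index into a texture library

@dataclass
class WallPlane:
    normal: tuple             # plane normal (nx, ny, nz)
    offset: float             # plane equation n . x = offset
    primitives: list = field(default_factory=list)

@dataclass
class ArchitecturalModel:
    walls: list = field(default_factory=list)

    def count(self, label):
        # The interpretation enables reasoning, e.g. "how many windows?"
        return sum(1 for w in self.walls
                   for p in w.primitives if p.label == label)
```

Because each primitive carries a semantic label rather than just geometry, queries like counting windows, or hypothesizing unseen structure by symmetry, become straightforward.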
Integrating experiential and distributional data to learn semantic representations
 Psychological Review
, 2009
Abstract

Cited by 64 (4 self)
The authors identify 2 major types of statistical data from which semantic representations can be learned. These are denoted as experiential data and distributional data. Experiential data are derived by way of experience with the physical world and comprise the sensory-motor data obtained through sense receptors. Distributional data, by contrast, describe the statistical distribution of words across spoken and written language. The authors claim that experiential and distributional data represent distinct data types and that each is a nontrivial source of semantic information. Their theoretical proposal is that human semantic representations are derived from an optimal statistical combination of these 2 data types. Using a Bayesian probabilistic model, they demonstrate how word meanings can be learned by treating experiential and distributional data as a single joint distribution and learning the statistical structure that underlies it. The semantic representations that are learned in this manner are measurably more realistic—as verified by comparison to a set of human-based measures of semantic representation—than those available from either data type individually or from both sources independently. This is not a result of merely using quantitatively more data, but rather it is because experiential and distributional data are qualitatively distinct, yet intercorrelated, types of data. The semantic representations that are learned are based on statistical structures that exist both within and between the experiential and distributional data types.
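As a toy sketch of combining the two data types probabilistically (here under a naive conditional-independence assumption, not the authors' joint-distribution model), candidate word meanings can be scored by adding a log prior and the two log-likelihoods:

```python
import math

def combine_evidence(log_prior, log_lik_experiential, log_lik_distributional):
    """Score candidate meanings from two evidence sources (toy example).

    Each argument maps a candidate meaning to a log probability; the result
    is a normalized posterior over candidates.
    """
    scores = {m: log_prior[m] + log_lik_experiential[m] + log_lik_distributional[m]
              for m in log_prior}
    mx = max(scores.values())                       # stabilize the exponentials
    z = sum(math.exp(s - mx) for s in scores.values())
    return {m: math.exp(s - mx) / z for m, s in scores.items()}
```

The candidate meanings and likelihood values below are purely hypothetical; the point is only that neither source alone need dominate the combined posterior.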
The study of correlation structures of DNA sequences: a critical review
 Comput. Chem.
, 1997
Bayes in the sky: Bayesian inference and model selection in cosmology
 Contemp. Phys
Abstract

Cited by 58 (6 self)
The application of Bayesian methods in cosmology and astrophysics has flourished over the past decade, spurred by data sets of increasing size and complexity. In many respects, Bayesian methods have proven to be vastly superior to more traditional statistical tools, offering the advantage of higher efficiency and of a consistent conceptual basis for dealing with the problem of induction in the presence of uncertainty. This trend is likely to continue in the future, when the way we collect, manipulate and analyse observations and compare them with theoretical models will assume an even more central role in cosmology. This review is an introduction to Bayesian methods in cosmology and astrophysics and recent results in the field. I first present Bayesian probability theory and its conceptual underpinnings, Bayes' Theorem and the role of priors. I discuss the problem of parameter inference and its general solution, along with numerical techniques such as Markov Chain Monte Carlo methods. I then review the theory and application of Bayesian model comparison, discussing the notions of Bayesian evidence and effective model complexity, and how to compute and interpret those quantities. Recent developments in cosmological parameter extraction and Bayesian cosmological model building are summarized, highlighting the challenges that lie ahead.
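The Bayesian evidence and model comparison ideas this review covers can be sketched for a one-parameter Gaussian model, where the evidence integral is small enough to evaluate on a grid; the setup below is illustrative, not taken from the review:

```python
import numpy as np

def log_evidence(data, mu_grid, log_prior_density, sigma=1.0):
    """Grid approximation of log p(D|M) = log ∫ p(D|mu) p(mu|M) dmu.

    Gaussian data with unknown mean mu; log_prior_density is the log of a
    (normalized) prior density evaluated on mu_grid, or a constant for a
    uniform prior over the grid's support.
    """
    # log-likelihood of the whole data set for each candidate mean
    ll = np.sum(-0.5 * ((data[None, :] - mu_grid[:, None]) / sigma) ** 2
                - np.log(sigma * np.sqrt(2.0 * np.pi)), axis=1)
    integrand = ll + log_prior_density
    mx = integrand.max()                      # log-sum-exp for stability
    dmu = mu_grid[1] - mu_grid[0]
    return mx + np.log(np.sum(np.exp(integrand - mx)) * dmu)
```

Comparing a narrow prior against a needlessly wide one on the same data exhibits the Occam penalty: the wide-prior model spreads its predictions thin and earns lower evidence even though its best-fit likelihood is identical.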
A Bayesian approach to source separation
 in Proceedings of Independent Component Analysis Workshop
, 1999
Abstract

Cited by 55 (5 self)
The problem of source separation is by its very nature an inductive inference problem. There is not enough information to deduce the solution, so one must use any available information to infer the most probable solution. We demonstrate that source separation problems are well-suited for the Bayesian approach, which provides a natural and logically consistent method by which one can incorporate prior knowledge to estimate the most probable solution given that knowledge. We derive the Bell-Sejnowski ICA algorithm from first principles, i.e., Bayes' Theorem, and demonstrate how the Bayesian methodology makes explicit the underlying assumptions. We then further demonstrate the power of the Bayesian approach by deriving two separation algorithms that
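The Bell-Sejnowski Infomax update that the abstract derives from Bayes' Theorem has a well-known natural-gradient form; a minimal sketch, assuming zero-mean mixtures, a logistic nonlinearity, and super-Gaussian sources:

```python
import numpy as np

def infomax_ica(X, lr=0.01, n_iters=2000):
    """Natural-gradient Infomax (Bell-Sejnowski style) unmixing sketch.

    X: (n_signals, n_samples) zero-mean mixtures. Iterates the update
    W += lr * (I + (1 - 2*g(WX)) (WX)^T / m) W, with g the logistic function.
    """
    n, m = X.shape
    W = np.eye(n)                        # unmixing matrix, start at identity
    for _ in range(n_iters):
        U = W @ X                        # current source estimates
        Y = 1.0 / (1.0 + np.exp(-U))     # logistic squashing g(U)
        W += lr * (np.eye(n) + (1.0 - 2.0 * Y) @ U.T / m) @ W
    return W
```

The choice of nonlinearity encodes the prior over source densities that the Bayesian derivation makes explicit; a logistic g implicitly assumes super-Gaussian sources.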
On Data-Centric Trust Establishment in Ephemeral Ad Hoc Networks
 IEEE Conference on Computer Communications
, 2008
Abstract

Cited by 45 (7 self)
We argue that the traditional notion of trust as a relation among entities, while useful, becomes insufficient for emerging data-centric mobile ad hoc networks. In these systems, setting the data trust level equal to the trust level of the data-providing entity would ignore system salient features, rendering applications ineffective and systems inflexible. This would be even more so if their operation is ephemeral, i.e., characterized by short-lived associations in volatile environments. In this paper, we address this challenge by extending the traditional notion of trust to data-centric trust: trustworthiness attributed to node-reported data per se. We propose a framework for data-centric trust establishment: First, trust in each individual piece of data is computed; then multiple, related but possibly contradictory, data are combined; finally, their validity is inferred by a decision component based on one of several evidence evaluation techniques. We consider and evaluate an instantiation of our framework in vehicular networks as a case study. Our simulation results show that our scheme is highly resilient to attackers and converges stably to the correct decision.
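The combination stage of such a framework can be sketched, for a binary event and per-node trust levels, as a simple Bayesian evidence combination; the weighting scheme here is an illustrative assumption, not the paper's exact instantiation:

```python
def event_posterior(prior, reports):
    """Combine possibly contradictory binary reports weighted by node trust.

    prior: P(event) before any reports.
    reports: list of (says_event_true, trust) pairs, where trust is the
    probability that the node reports correctly. Returns P(event | reports),
    assuming reports are independent given the true event state.
    """
    p_true, p_false = prior, 1.0 - prior
    for says_true, t in reports:
        # likelihood of this report under each hypothesis
        p_true *= t if says_true else (1.0 - t)
        p_false *= (1.0 - t) if says_true else t
    return p_true / (p_true + p_false)
```

Note how a dissenting low-trust report only mildly dilutes the conclusion of several high-trust reports, which is the data-centric behaviour the entity-centric view cannot express.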
Studies in astronomical time series analysis. V. Bayesian blocks, a new method to analyze structure in photon counting data
 Astrophys. J.
, 1998
Abstract

Cited by 34 (9 self)
Subject headings: numerical methods – data analysis – models – X-ray astronomy – γ-ray astronomy. I describe a new time-domain algorithm for detecting localized structures (bursts), revealing pulse shapes, and generally characterizing intensity variations. The input is raw counting data, in any of three forms: time-tagged photon events (TTE), binned counts, or time-to-spill (TTS) data. The output is the most likely segmentation of the observation into time intervals during which the photon arrival rate is perceptibly constant, i.e. has a fixed intensity without statistically significant variations. Since the analysis is based on Bayesian statistics, I call the resulting structures Bayesian Blocks. Unlike most methods, this one does not stipulate time bins – instead the data themselves
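The segmentation this abstract describes can be sketched as a dynamic-programming search over change points for binned counts, using the maximized Poisson block log-likelihood N·log(N/T) as the fitness; the constant `penalty`, standing in for the prior on the number of blocks, is an illustrative choice, not the paper's exact prior:

```python
import numpy as np

def bayesian_blocks_binned(counts, widths, penalty=4.0):
    """Optimal piecewise-constant segmentation of binned counting data.

    counts[i]: events in bin i; widths[i]: bin length. Returns the sorted
    list of change-point bin indices (each block starts at a change point).
    """
    n = len(counts)
    best = np.zeros(n + 1)               # best[j]: best score of first j bins
    last = np.zeros(n + 1, dtype=int)    # start index of the final block
    cum_n = np.concatenate([[0], np.cumsum(counts)])
    cum_t = np.concatenate([[0.0], np.cumsum(widths)])
    for j in range(1, n + 1):
        # try every start i of the final block [i, j)
        N = cum_n[j] - cum_n[:j]
        T = cum_t[j] - cum_t[:j]
        fit = np.where(N > 0, N * np.log(np.maximum(N, 1) / T), 0.0) - penalty
        scores = best[:j] + fit
        last[j] = np.argmax(scores)
        best[j] = scores[last[j]]
    # backtrack the optimal change points
    cps, j = [], n
    while j > 0:
        cps.append(last[j])
        j = last[j]
    return [int(c) for c in sorted(cps)]
```

Because the search is over all partitions, no bin width or block count is stipulated in advance; the data and the per-block penalty determine the number of blocks.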
Bayesian Updating of Structural Models and Reliability using Markov Chain Monte Carlo Simulation
, 2002
Abstract

Cited by 33 (4 self)
In a full Bayesian probabilistic framework for "robust" system identification, structural response predictions and performance reliability are updated using structural test data D by considering the predictions of a whole set of possible structural models that are weighted by their updated probability. This involves integrating h(θ)p(θ|D) over the whole parameter space, where θ is a parameter vector defining each model within the set of possible models of the structure, h(θ) is a model prediction of a response quantity of interest, and p(θ|D) is the updated probability density for θ, which provides a measure of how plausible each model is given the data D. The evaluation of this integral is difficult because the dimension of the parameter space is usually too large for direct numerical integration and p(θ|D) is concentrated in a small region in the parameter space and only known up to a scaling constant. An adaptive Markov chain Monte Carlo simulation approach is proposed to evaluate the desired integral that is based on the Metropolis-Hastings algorithm and a concept similar to simulated annealing. By carrying out a series of Markov chain simulations with limiting stationary distributions equal to a sequence of intermediate probability densities that converge on p(θ|D), the region of concentration of p(θ|D) is gradually portrayed. The Markov chain samples are used to estimate the desired integral by statistical averaging. The method is illustrated using simulated dynamic test data to update the robust response variance and reliability of a moment-resisting frame for two cases: one where the model is only locally identifiable based on the data and the other where it is unidentifiable.
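The integral ∫ h(θ)p(θ|D) dθ can be estimated by averaging h over Metropolis-Hastings samples, which requires p(θ|D) only up to its scaling constant; this sketch uses a plain random-walk sampler rather than the paper's adaptive, annealing-like scheme:

```python
import numpy as np

def posterior_expectation(h, log_post_unnorm, theta0, n_samples=20000,
                          step=0.5, burn=2000, rng=None):
    """Estimate E[h(theta)|D] by averaging h over Metropolis-Hastings samples.

    log_post_unnorm: log p(theta|D) up to an additive constant, as in the
    abstract. theta0: starting point; step: random-walk proposal scale.
    """
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post_unnorm(theta)
    total, kept = 0.0, 0
    for i in range(n_samples):
        prop = theta + step * rng.normal(size=theta.shape)
        lp_prop = log_post_unnorm(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            theta, lp = prop, lp_prop
        if i >= burn:                              # discard burn-in samples
            total += h(theta)
            kept += 1
    return total / kept
```

With a standard normal posterior and h(θ) = θ², the running average converges to the known second moment, illustrating the "statistical averaging" step; the intermediate-density sequence of the adaptive scheme would be layered on top when p(θ|D) is sharply concentrated.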