Results 1–5 of 5
Robust Monte Carlo Localization for Mobile Robots, 2001
Abstract

Cited by 826 (88 self)
Mobile robot localization is the problem of determining a robot's pose from sensor data. This article presents a family of probabilistic localization algorithms known as Monte Carlo Localization (MCL). MCL algorithms represent a robot's belief by a set of weighted hypotheses (samples), which approximate the posterior under a common Bayesian formulation of the localization problem. Building on the basic MCL algorithm, this article develops a more robust algorithm called Mixture-MCL, which integrates two complementary ways of generating samples in the estimation. To apply this algorithm to mobile robots equipped with range finders, a kernel density tree is learned that permits fast sampling. Systematic empirical results illustrate the robustness and computational efficiency of the approach.
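The sample-based belief representation described above can be sketched in a few lines. The following is a minimal 1-D particle-filter localization loop (plain MCL, not the paper's Mixture-MCL); the motion model, sensor model, noise levels, and particle count are all assumptions chosen for illustration:

```python
import math
import random

# Minimal 1-D Monte Carlo Localization sketch: the belief over the
# robot's pose is a set of samples that is (1) propagated through a
# noisy motion model and (2) reweighted by the sensor likelihood,
# then resampled. All model parameters here are illustrative.

N = 1000          # number of particles (samples)
MOTION_STD = 0.2  # assumed process noise
SENSOR_STD = 0.5  # assumed position-sensor noise

def mcl_step(particles, control, measurement):
    # Motion update: shift each sample by the control, adding noise.
    moved = [p + control + random.gauss(0.0, MOTION_STD) for p in particles]
    # Measurement update: weight each sample by the Gaussian
    # likelihood of the observed position.
    weights = [math.exp(-(measurement - p) ** 2 / (2 * SENSOR_STD ** 2))
               for p in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resampling: draw an equally weighted sample set for the next step.
    return random.choices(moved, weights=weights, k=N)

random.seed(0)
particles = [random.uniform(0.0, 20.0) for _ in range(N)]  # global uncertainty
true_pose = 2.0
for _ in range(20):
    true_pose += 0.5
    z = true_pose + random.gauss(0.0, SENSOR_STD)          # simulated sensor
    particles = mcl_step(particles, 0.5, z)
estimate = sum(particles) / N                              # posterior mean
```

After a few updates the initially uniform sample set concentrates around the true pose; the posterior mean then serves as the point estimate.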
Ensemble Kalman filters: Sequential importance resampling and beyond, 2002
Proc. ECMWF Workshop on the Role of the Upper Ocean in Medium and Extended Range Forecasting, Reading, United Kingdom, ECMWF
Abstract

Cited by 2 (1 self)
Data assimilation in high-resolution atmosphere or ocean models is complicated because of the nonlinearity of the problem. Several methods to solve the problem have been presented, all having their own advantages and disadvantages. In this paper so-called particle methods are discussed, with emphasis on Sequential Importance Resampling (SIR) and a new variant of that method. Reference is made to related methods, in particular to the Ensemble Kalman filter (EnKF). A detailed comparison between the EnKF and the SIR is made using the nonlinear KdV equation. It is shown that SIR produces good results in a highly nonlinear multilayer quasi-geostrophic ocean model. Since the method needs at least 500 ensemble members or particles with the present-day observing system, new variants have to be studied to reduce that number in order to make the method feasible for real applications, like (seasonal) weather forecasting. In the new variant discussed here the number of members can be reduced by at least a factor of 10 by guiding the ensemble toward future observations. In this way the method starts to resemble a smoother. It is shown that the new method gives promising results, and the potential and drawbacks of the new method are discussed.
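The SIR mechanics referred to above can be illustrated in isolation: importance weights are normalized, weight degeneracy is diagnosed with the effective sample size, and systematic resampling replaces the weighted ensemble with an equally weighted one. This is a generic sketch of the resampling step, not the paper's ocean-model setup:

```python
import random

# Sketch of the resampling step at the heart of SIR: the effective
# sample size diagnoses weight degeneracy, and systematic resampling
# duplicates heavy particles and drops light ones, yielding an
# equally weighted ensemble. Illustrative only.

def effective_sample_size(weights):
    # ESS = 1 / sum(w_i^2) for normalized weights; equals n when the
    # weights are uniform and approaches 1 when one weight dominates.
    return 1.0 / sum(w * w for w in weights)

def systematic_resample(particles, weights):
    n = len(particles)
    u0 = random.random() / n            # single uniform offset
    cumulative, cumsum = 0.0, []
    for w in weights:
        cumulative += w
        cumsum.append(cumulative)
    out, j = [], 0
    for i in range(n):
        u = u0 + i / n                  # evenly spaced pointers
        while j < n - 1 and cumsum[j] < u:
            j += 1
        out.append(particles[j])
    return out

random.seed(0)
particles = ["a", "b", "c", "d"]
weights = [0.85, 0.05, 0.05, 0.05]      # one particle dominates
ess = effective_sample_size(weights)    # well below n = 4
resampled = systematic_resample(particles, weights)
```

Systematic resampling is one of several standard schemes (multinomial and stratified resampling are alternatives); it is popular because it introduces low additional variance and runs in linear time.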
Abstract
In this paper, we propose an original approach to the solution of Fredholm equations of the second kind. We interpret the standard von Neumann expansion of the solution as an expectation with respect to a probability distribution defined on a union of subspaces of variable dimension. Based on this representation, it is possible to use trans-dimensional Markov Chain Monte Carlo (MCMC) methods such as Reversible Jump MCMC to approximate the solution numerically. This can be an attractive alternative to standard Sequential Importance Sampling (SIS) methods routinely used in this context. To motivate our approach, we sketch an application to value function estimation for a Markov decision process. Two computational examples are also provided.
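The series-as-expectation idea has a simple discrete analogue. For f = g + K f with a contractive kernel K, the von Neumann series f = Σₙ Kⁿ g can be estimated by random walks with absorption (the classical Ulam-von Neumann estimator). This sketch illustrates that representation only; it is not the paper's Reversible Jump MCMC construction, and the kernel and source term below are made up for the example:

```python
import random

# Discrete Fredholm equation of the second kind: f = g + K f, with
# spectral radius of K below 1, so f = sum_n K^n g. Each component
# is estimated by random walks that move uniformly and are absorbed
# with fixed probability; the importance weight corrects for the
# difference between K and the walk's transition probabilities.

def solve_component(K, g, i, n_walks=20000, absorb=0.3):
    m = len(g)
    move_p = (1.0 - absorb) / m          # uniform transition probability
    total = 0.0
    for _ in range(n_walks):
        state, weight, score = i, 1.0, 0.0
        while True:
            score += weight * g[state]   # collision-estimator term W_n g_n
            if random.random() < absorb: # walk absorbed, series truncated
                break
            nxt = random.randrange(m)
            weight *= K[state][nxt] / move_p
            state = nxt
        total += score
    return total / n_walks

random.seed(1)
K = [[0.2, 0.1],
     [0.1, 0.2]]                         # contractive kernel (illustrative)
g = [1.0, 2.0]
estimate = solve_component(K, g, 0)      # approximates ((I - K)^-1 g)[0]
```

Each surviving path of length n contributes (Kⁿg)ᵢ in expectation, so the averaged score is an unbiased estimate of fᵢ = ((I − K)⁻¹ g)ᵢ; for this K and g the exact value of the first component is 1/0.63 ≈ 1.587.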
Optimal SIR algorithm vs. fully adapted auxiliary particle filter: a non asymptotical analysis
Statistics and Computing (manuscript)
DESIGNING ROBUST COLLABORATIVE SERVICES IN DISTRIBUTED WIRELESS
Abstract
Wireless Sensor Networks (WSNs) are a popular class of distributed collaborative networks, finding applications from medical to military settings. However, their susceptibility to capture, their “open” wireless interfaces, and their limited battery life all create potential vulnerabilities, and WSN-based services inherit them. We focus on tactical environments where sensor nodes play complex roles in data sensing, aggregation, and decision making. Services in such environments demand a high level of reliability and robustness. The first problem we studied is robust target localization. Location information is important for surveillance, monitoring, secure routing, intrusion detection, on-demand services, etc. Target localization means tracing the path of moving entities through some known surveillance area. In a tactical environment, an adversary can often capture nodes and supply incorrect surveillance data to the system. In this thesis we create a target localization protocol that is robust against large amounts of such falsified data. Location estimates are generated by a Bayesian maximum-likelihood estimator. In order to achieve improved results with respect to
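One way to picture localization that tolerates falsified reports is a grid-based maximum-likelihood search with a trimmed likelihood that discards the least plausible reading. This is a hypothetical sketch only: the sensor layout, Gaussian range-noise model, and trimming rule are all assumptions for illustration, not the thesis's actual Bayesian estimator.

```python
import math

# Hypothetical sketch: maximum-likelihood target localization from
# sensor range reports over a grid, with a trimmed likelihood that
# discards the least plausible report (a suspected falsified reading).
# Sensor layout, noise model, and trimming rule are all assumptions.

SENSORS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
NOISE_STD = 0.5   # assumed range-measurement noise

def trimmed_loglik(candidate, reports, drop=1):
    cx, cy = candidate
    terms = []
    for (sx, sy), r in zip(SENSORS, reports):
        d = math.hypot(cx - sx, cy - sy)         # predicted range
        terms.append(-(r - d) ** 2 / (2 * NOISE_STD ** 2))
    # Keep only the most plausible terms; drop suspected falsified data.
    return sum(sorted(terms, reverse=True)[:len(terms) - drop])

def localize(reports, step=0.25):
    best, best_ll = None, -math.inf
    n = int(10 / step) + 1
    for i in range(n):                           # exhaustive grid search
        for j in range(n):
            c = (i * step, j * step)
            ll = trimmed_loglik(c, reports)
            if ll > best_ll:
                best, best_ll = c, ll
    return best

# Target at (3, 4); the last sensor reports a grossly falsified range.
reports = [math.hypot(3.0 - sx, 4.0 - sy) for sx, sy in SENSORS]
reports[3] += 20.0
estimate = localize(reports)                     # recovers (3.0, 4.0)
```

Because three honest ranges pin down the target uniquely, dropping the single worst-fitting report lets the grid search recover the true position despite the falsified reading; with more compromised sensors a larger trim (or a full Bayesian treatment, as in the thesis) would be needed.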