Results 1–10 of 13
On a role of predictor in the filtering stability
 Electron. Comm. Probab.
Cited by 8 (1 self)
Abstract. When is a nonlinear filter stable with respect to its initial condition? In spite of recent progress, this question still lacks a complete answer in general. Currently available results indicate that stability of the filter depends on the ergodic properties of the signal and the regularity of the observation process, and may fail if either of these ingredients is ignored. In this note we address the question of stability in a particular weak sense and show that the estimates of certain functions are always stable. This is verified without dealing directly with the filtering equation and turns out to be inherited from certain one-step predictor estimates.
Stability of the nonlinear filter for slowly switching Markov chains
 Stochastic Process. Appl.
Cited by 3 (1 self)
Dedicated to Robert Liptser on the occasion of his 70th birthday. Abstract. Exponential stability of the nonlinear filtering equation is revisited when the signal is a finite-state Markov chain. An asymptotic upper bound for the filtering error due to an incorrect initial condition is derived in the case of a slowly switching signal.
INTRINSIC METHODS IN FILTER STABILITY
Cited by 3 (1 self)
Abstract. The purpose of this article is to survey some intrinsic methods for studying the stability of the nonlinear filter. By ‘intrinsic’ we mean methods which directly exploit the fundamental representation of the filter as a conditional expectation, through classical probabilistic techniques such as change of measure, martingale convergence, coupling, etc. Beside their conceptual appeal and the additional insight gained into the filter stability problem, these methods allow one to establish stability of the filter under weaker conditions compared to other methods, e.g., to go beyond strongly mixing signals, to reveal connections between filter stability and classical notions of observability, and to discover links to martingale convergence and information theory. 1. Introduction. Consider a pair of random sequences (X, Y) = (X_n, Y_n)_{n ∈ Z_+}, where the signal component X_n takes values in a Polish space S and the observation component Y_n takes values in R^p for some p ≥ 1. The classical filtering problem is to compute the conditional distribution π_n(·) = P(X_n ∈ · | F^Y_{0,n}), (1.1) where F^Y_{k,n} stands for the σ-algebra of events generated by Y_m, k ≤ m ≤ n (similarly, we will use below the σ-algebra F^X_{k,n} generated by X_m, k ≤ m ≤ n). Once π_n is found, the optimal mean square estimate of f(X_n) can be calculated as E(f(X_n) | F^Y_{0,n}) = ∫ f(x) π_n(dx) for any function f with E|f(X_n)|² < ∞. If both X and (X, Y) are Markov processes, π_n satisfies a recursive filtering equation. Specifically, let Λ and ν denote the transition probability and the initial distribution of X, i.e., for A ∈ B(S), ν(A) = P(X_0 ∈ A) and Λ(X_{n−1}, A) = P(X_n ∈ A | F^X_{0,n−1}).
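The recursion described in this introduction (predict with the transition kernel Λ, then reweight by the observation likelihood) can be sketched for a finite state space. The two-state transition matrix, prior, and Gaussian observation density below are illustrative assumptions, not taken from the article:

```python
import numpy as np

# Illustrative model only (not from the article): a two-state signal
# with transition matrix Lam and initial distribution nu.
Lam = np.array([[0.9, 0.1],
                [0.2, 0.8]])   # Lambda(x, .): rows sum to 1
nu = np.array([0.5, 0.5])      # nu: law of X_0

def gauss_lik(y, x, sd=0.5):
    """Assumed observation density: Y_n = X_n + Gaussian noise."""
    return np.exp(-0.5 * ((y - x) / sd) ** 2)

def filter_step(pi, y):
    """One step of the filtering recursion: predict with Lambda,
    then reweight by the likelihood of y and normalize."""
    pred = pi @ Lam                                        # one-step predictor
    post = pred * np.array([gauss_lik(y, x) for x in (0.0, 1.0)])
    return post / post.sum()                               # pi_n

pi = nu
for y in [0.1, 0.9, 1.1, 0.0]:   # an arbitrary observation path
    pi = filter_step(pi, y)
```

The prediction step applies the a priori dynamics; the multiplication step fuses in the new observation, exactly the two ingredients of equation (1.1).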
Discrete time nonlinear filters with informative observations are stable
 Electron. Commun. Probab.
Cited by 3 (2 self)
Abstract. The nonlinear filter associated with the discrete-time signal-observation model (X_k, Y_k) is known to forget its initial condition as k → ∞, regardless of the observation structure, when the signal possesses sufficiently strong ergodic properties. Conversely, it stands to reason that if the observations are sufficiently informative, then the nonlinear filter should forget its initial condition regardless of any properties of the signal. We show that for observations of additive type Y_k = h(X_k) + ξ_k with an invertible observation function h (under mild regularity assumptions on h and on the distribution of the noise ξ_k), the filter is indeed stable in a weak sense without any assumptions at all on the signal process. If the signal satisfies a uniform continuity assumption, weak stability can be strengthened to stability in total variation.
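The "informative observations" regime can be illustrated with a toy grid model: an invertible h (here the identity) and low noise make the likelihood dominate the prior, so two filters started from very different initial distributions quickly agree. The state space, transition matrix, noise level, and observation path below are assumed examples, not the paper's construction:

```python
import numpy as np

# Assumed toy model: signal on the grid {0,...,4}, Y_k = h(X_k) + xi_k
# with h = identity (invertible) and small Gaussian noise xi_k.
states = np.arange(5)
P = np.zeros((5, 5))
for i in states:
    P[i, i] = 0.8
    P[i, max(i - 1, 0)] += 0.1   # reflect at the boundary
    P[i, min(i + 1, 4)] += 0.1

def step(pi, y, sd=0.2):
    pred = pi @ P                                    # predict
    lik = np.exp(-0.5 * ((y - states) / sd) ** 2)    # sharp likelihood
    post = pred * lik
    return post / post.sum()

pi_a = np.array([1.0, 0, 0, 0, 0])   # badly misinformed prior: delta at 0
pi_b = np.full(5, 0.2)               # uniform prior
for y in [2.0, 2.0, 3.0]:            # a fixed, sharp observation path
    pi_a, pi_b = step(pi_a, y), step(pi_b, y)

tv = 0.5 * np.abs(pi_a - pi_b).sum()  # total variation distance
```

After only a few informative observations, both posteriors concentrate near h⁻¹(Y_k) and the total variation distance between them is negligible, with no ergodicity assumed on the signal.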
STABILITY OF NONLINEAR FILTERS: A SURVEY
Abstract. Filtering deals with the optimal estimation of signals from their noisy observations. The standard setting consists of a pair of random processes (X, Y) = (X_t, Y_t)_{t≥0}, where the signal component X is to be estimated at a current time t > 0 on the basis of the trajectory of Y observed up to t. Under the minimal mean square error criterion, the optimal estimate of X_t is the conditional expectation E(X_t | Y_{[0,t]}). If both X and (X, Y) are Markov processes, then the conditional distribution π_t(A) = P(X_t ∈ A | Y_{[0,t]}), A ⊆ R, satisfies a recursive equation, called the filter, which realizes the optimal fusion of the a priori statistical knowledge about the signal and the a posteriori information borne by the observation path. The filtering equation is to be initialized by the probability distribution ν of the signal at time t = 0. Suppose ν is unknown and another reasonable probability distribution ν̄ is used to start the filter. As the corresponding solution π̄_t(·) differs from the optimal π_t(·), the natural question of stability arises: what are the conditions, in terms of the signal/observation parameters, that guarantee lim_{t→∞} ‖π_t − π̄_t‖ = 0 in an appropriate sense? The article discusses the recent progress in solving this stability problem, which turns ...
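As a rough illustration of this stability question, the following sketch runs the same discrete-time filter from two initial distributions ν and ν̄ for a two-state mixing chain and computes ‖π_t − π̄_t‖ in total variation. The model and all parameters are hypothetical, not from the survey; because the transition matrix is strictly positive, the filter forgets its initialization geometrically fast here:

```python
import numpy as np

# Assumed example: a two-state strictly positive (mixing) transition
# matrix and Gaussian observations around per-state levels h(x).
Lam = np.array([[0.7, 0.3],
                [0.4, 0.6]])
levels = np.array([0.0, 1.0])   # observation levels for states 0 and 1

def step(pi, y, sd=1.0):
    pred = pi @ Lam                                          # predictor step
    post = pred * np.exp(-0.5 * ((y - levels) / sd) ** 2)    # Bayes update
    return post / post.sum()

# Filters started from nu = (0.9, 0.1) and nu_bar = (0.1, 0.9).
pi, pi_bar = np.array([0.9, 0.1]), np.array([0.1, 0.9])
for y in [0.3, 0.8] * 10:        # any fixed observation path
    pi, pi_bar = step(pi, y), step(pi_bar, y)

tv = 0.5 * np.abs(pi - pi_bar).sum()   # ||pi_t - pi_bar_t|| in total variation
```

For this mixing chain the contraction happens along every observation path, which is the signal-ergodicity side of the dichotomy discussed in the survey.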
WHAT IS ALWAYS STABLE IN NONLINEAR FILTERING?
, 2005
Abstract. This note addresses certain stability properties of the nonlinear filtering equation in discrete time. The available positive and negative results indicate that much depends on the structure of the signal state space, its ergodic properties, and the regularity of the observations. We show that certain predicting estimates are stable under surprisingly general assumptions.
Forgetting of the initial distribution for Hidden Markov Models
, 2007
The forgetting of the initial distribution for discrete Hidden Markov Models (HMMs) is addressed: a new set of conditions is proposed to establish the forgetting property of the filter at polynomial and geometric rates. Both a pathwise-type convergence of the total variation distance of the filter started from two different initial distributions and a convergence in expectation are considered. The results are illustrated using different HMMs of interest: the dynamic tobit model, the nonlinear state space model, and the stochastic volatility model.
Submitted to the Annals of Applied Probability FORGETTING OF THE INITIAL DISTRIBUTION FOR NONERGODIC HIDDEN MARKOV CHAINS
, 810
In this paper, the forgetting of the initial distribution for nonergodic Hidden Markov Models (HMMs) is studied. A new set of conditions is proposed to establish the forgetting property of the filter, which significantly extends the existing results. Both a pathwise-type convergence of the total variation distance of the filter started from two different initial distributions and a convergence in expectation are considered. The results are illustrated using generic models of nonergodic HMMs and extend all the results known so far. 1. Introduction and notations. A Hidden Markov Model (HMM) is a doubly stochastic process with an underlying Markov chain that is not directly observable. More specifically, let X and Y be two spaces equipped with countably generated σ-fields X and Y; denote by Q and G, respectively, a Markov transition kernel on (X, X) and a transition kernel from (X, X) to ...
Interacting and Annealing Particle Filters: Mathematics and a Recipe for Applications
 J Math Imaging Vis (2007) 28: 1–18, DOI 10.1007/s10851-007-0007-8
, 2007
Abstract. Interacting and annealing are two powerful strategies that are applied in different areas of stochastic modelling and data analysis. Interacting particle systems approximate a distribution of interest by a finite number of particles, where the particles interact between the time steps. In computer vision, they are commonly known as particle filters. Simulated annealing, on the other hand, is a global optimization method derived from statistical mechanics. A recent heuristic approach that fuses these two techniques for motion capture has become known as the annealed particle filter. In order to analyze these techniques, we rigorously derive in this paper two algorithms with annealing properties based on the mathematical theory of interacting particle systems. Convergence results and sufficient parameter restrictions enable us to point out limitations of the annealed particle filter. Moreover, we evaluate the impact of the parameters on the performance in various experiments, including the tracking of articulated bodies from noisy measurements. Our results provide general guidance on suitable parameter choices for different applications.
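A minimal bootstrap particle filter, i.e., the plain (non-annealed) interacting particle system the abstract builds on, can be sketched as follows. The scalar AR(1) signal model, noise levels, and particle count are illustrative assumptions, not the paper's tracking setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model: X_k = a X_{k-1} + noise, Y_k = X_k + noise.
def particle_filter(ys, n=500, a=0.9, sig_x=1.0, sig_y=0.5):
    parts = rng.normal(0.0, 1.0, n)   # initial particle cloud
    means = []
    for y in ys:
        parts = a * parts + rng.normal(0.0, sig_x, n)    # mutate: signal step
        w = np.exp(-0.5 * ((y - parts) / sig_y) ** 2)    # reweight by likelihood
        w /= w.sum()
        idx = rng.choice(n, size=n, p=w)                 # resample: interaction
        parts = parts[idx]
        means.append(parts.mean())                       # filter mean estimate
    return np.array(means)

est = particle_filter([0.5, 1.0, 0.8])
```

The resampling step is the "interaction between time steps": particles with high likelihood are duplicated while unlikely ones die out. The annealed variant analyzed in the paper inserts several reweight/resample sweeps with flattened likelihoods within each time step.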