Results 1–10 of 15
Exponential Stability for Nonlinear Filtering
, 1996
"... We study the a.s. exponential stability of the optimal filter w.r.t. its initial conditions. A bound is provided on the exponential rate (equivalently, on the memory length of the filter) for a general setting both in discrete and in continuous time, in terms of Birkhoff's contraction coefficient. C ..."
Abstract

Cited by 53 (2 self)
 Add to MetaCart
We study the a.s. exponential stability of the optimal filter with respect to its initial conditions. A bound is provided on the exponential rate (equivalently, on the memory length of the filter) in a general setting, both in discrete and in continuous time, in terms of Birkhoff's contraction coefficient. Criteria for exponential stability and explicit bounds on the rate are given in the specific cases of a diffusion process on a compact manifold and of discrete-time Markov chains on both continuous and discrete (countable) state spaces.
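As a rough illustration of the stability being quantified here, the following sketch (not from the paper; the two-state transition matrix A, emission matrix B, and horizon are arbitrary choices) runs the discrete-time optimal filter from two different initial distributions on one simulated observation path and prints their Hilbert projective distance, the metric in which Birkhoff's contraction coefficient yields geometric decay.

```python
import numpy as np

def hilbert_metric(p, q):
    # Hilbert projective distance between two positive probability vectors
    r = p / q
    return np.log(r.max() / r.min())

def filter_step(pi, A, B, y):
    # one step of the HMM forward filter: predict with A, correct with likelihood B[:, y]
    post = (pi @ A) * B[:, y]
    return post / post.sum()

rng = np.random.default_rng(0)
A = np.array([[0.7, 0.3], [0.2, 0.8]])   # hypothetical transition matrix P(x' | x)
B = np.array([[0.9, 0.1], [0.3, 0.7]])   # hypothetical emission matrix P(y | x)

x, ys = 0, []
for _ in range(40):                       # simulate one observation path
    x = rng.choice(2, p=A[x])
    ys.append(rng.choice(2, p=B[x]))

pi1, pi2 = np.array([0.99, 0.01]), np.array([0.01, 0.99])   # two initial conditions
for t, y in enumerate(ys, 1):
    pi1, pi2 = filter_step(pi1, A, B, y), filter_step(pi2, A, B, y)
    if t % 10 == 0:
        print(t, hilbert_metric(pi1, pi2))   # decays roughly geometrically in t
```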
Analyticity of entropy rate of a hidden Markov chain
 In Proc. of IEEE International Symposium on Information Theory, Adelaide, Australia, September 4–9, 2005
, 2005
"... We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for t ..."
Abstract

Cited by 19 (8 self)
 Add to MetaCart
We prove that under mild positivity assumptions the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. A general principle to determine the domain of analyticity is stated. An example is given to estimate the radius of convergence for the entropy rate. We then show that the positivity assumptions can be relaxed, and examples are given for the relaxed conditions. We study a special class of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, and we give necessary and sufficient conditions for analyticity of the entropy rate for this case. Finally, we show that under the positivity assumptions the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
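For concreteness, here is a minimal numerical sketch (mine, not the paper's) of the standard finite-block upper approximation to the entropy rate, H(Y_n | Y_1, ..., Y_{n-1}), for a two-state Markov chain observed through a binary symmetric channel; the transition matrix, block length, and noise values are arbitrary choices.

```python
import itertools
import numpy as np

def seq_prob(y, A, mu, eps):
    # P(Y_1..Y_n = y) by the forward recursion, for a Markov chain (initial law mu,
    # transition matrix A) observed through a BSC with crossover probability eps
    emit = np.array([[1 - eps, eps], [eps, 1 - eps]])   # P(y | x)
    alpha = mu * emit[:, y[0]]
    for s in y[1:]:
        alpha = (alpha @ A) * emit[:, s]
    return alpha.sum()

def block_entropy(n, A, mu, eps):
    probs = np.array([seq_prob(y, A, mu, eps)
                      for y in itertools.product((0, 1), repeat=n)])
    return -(probs * np.log2(probs)).sum()

A = np.array([[0.8, 0.2], [0.4, 0.6]])   # hypothetical transition matrix
mu = np.array([2 / 3, 1 / 3])            # its stationary distribution
n = 10
for eps in (0.01, 0.05, 0.1):
    # H(Y_n | Y_1..Y_{n-1}) = H(Y_1..Y_n) - H(Y_1..Y_{n-1}): an upper bound that
    # decreases to the entropy rate as n grows, and varies smoothly in eps
    print(eps, block_entropy(n, A, mu, eps) - block_entropy(n - 1, A, mu, eps))
```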
Unsolved problems concerning random walks on trees
 Classical and Modern Branching Processes, K. Athreya and P. Jagers (editors)
, 1997
"... Abstract. We state some unsolved problems and describe relevant examples concerning random walks on trees. Most of the problems involve the behavior of random walks with drift: e.g., is the speed on GaltonWatson trees monotonic in the drift parameter? These random walks have been used in MonteCarl ..."
Abstract

Cited by 9 (1 self)
 Add to MetaCart
Abstract. We state some unsolved problems and describe relevant examples concerning random walks on trees. Most of the problems involve the behavior of random walks with drift: e.g., is the speed on Galton-Watson trees monotonic in the drift parameter? These random walks have been used in Monte Carlo algorithms for sampling from the vertices of a tree; in general, their behavior reflects the size and regularity of the underlying tree. Random walks are related to conductance. The distribution function for the conductance of Galton-Watson trees satisfies an interesting functional equation; is this distribution function absolutely continuous? §1. Introduction. To explore the structure of irregular trees, we consider nearest-neighbor random walks on them. The behavior of simple random walk gives some information about the structure, but more can be gleaned by considering the one-parameter family of random walks RW_λ described below. That is, the behavior of such random walks on spherically symmetric …
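A hedged Monte Carlo sketch of the walk RW_λ mentioned above (my illustration, not from the paper): the Galton-Watson tree is grown lazily along the trajectory, the offspring law (uniform on {1, 2, 3}) is an arbitrary choice that avoids extinction, and the printed value estimates the speed, i.e. the limiting distance from the root per step.

```python
import numpy as np

def rw_lambda_speed(lam, steps=200_000, seed=0):
    # RW_lambda on a Galton-Watson tree: from a non-root vertex with k children the walk
    # moves to its parent with probability lam / (lam + k) and to each child with
    # probability 1 / (lam + k); from the root it moves to a uniformly chosen child.
    rng = np.random.default_rng(seed)
    children = {(): None}                 # vertex (path from the root) -> number of children
    def degree(v):
        if children[v] is None:           # grow the tree lazily when first visited
            k = int(rng.integers(1, 4))   # offspring uniform on {1, 2, 3} (assumption)
            children[v] = k
            for i in range(k):
                children[v + (i,)] = None
        return children[v]

    v = ()
    for _ in range(steps):
        k = degree(v)
        if v and rng.random() < lam / (lam + k):
            v = v[:-1]                    # step toward the root
        else:
            v = v + (int(rng.integers(k)),)
    return len(v) / steps                 # distance from the root per step

for lam in (0.5, 1.0, 1.5):
    print(lam, rw_lambda_speed(lam))
```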
Series Expansions of Lyapunov Exponents and Forgetful Monoids
, 2000
"... We consider Lyapunov exponents of random iterates of monotone homogeneous maps. We assume that the images of some iterates are lines, with positive probability. Using this memoryloss property which holds generically for random products of matrices over the maxplus semiring, and in particular, for ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
We consider Lyapunov exponents of random iterates of monotone homogeneous maps. We assume that the images of some iterates are lines, with positive probability. Using this memory-loss property, which holds generically for random products of matrices over the max-plus semiring and, in particular, for Tetris-like heaps-of-pieces models, we give a series expansion formula for the Lyapunov exponent as a function of the probability law. In the case of rational probability laws, we show that the Lyapunov exponent is an analytic function of the parameters of the law, in a domain that contains the absolute convergence domain of a partition function associated to a special "forgetful" monoid, defined by generators and relations.
Derivatives of Entropy Rate in Special Families of Hidden Markov Chains
 IEEE Trans. Inf. Theory, Issue 7, July 2007, p. 2642
"... Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk, et. al. [13, 14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate of at extreme values of the noise. Namely, they showed that the derivative ..."
Abstract

Cited by 6 (2 self)
 Add to MetaCart
Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk et al. [13, 14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate of such a chain at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate actually stabilize at an explicit finite time. We generalize this result to a natural class of hidden Markov chains called "Black Holes." We also discuss in depth special cases of binary Markov chains observed in binary symmetric noise, and give an abstract formula for the first derivative in terms of a measure on the simplex due to Blackwell.
Analytic Expansions of (max,+) Lyapunov Exponents
, 1998
"... We give an explicit analytic series expansion of the (max; +)Lyapunov exponent fl(p) of a sequence of independent and identically distributed random matrices in this algebra, generated via a Bernoulli scheme depending on a small parameter p. A key assumption is that one of the matrices has a unique ..."
Abstract

Cited by 6 (1 self)
 Add to MetaCart
We give an explicit analytic series expansion of the (max,+) Lyapunov exponent γ(p) of a sequence of independent and identically distributed random matrices in this algebra, generated via a Bernoulli scheme depending on a small parameter p. A key assumption is that one of the matrices has a unique eigenvector. This allows us to use a representation of this exponent as the mean value of a certain random variable, and then a discrete analogue of the so-called light-traffic perturbation formulas to derive the expansion. We show that it is analytic under a simple condition on p. This also provides a closed-form expression for all derivatives of γ(p) at p = 0 and approximations of γ(p) of any order, together with an error estimate for finite-order Taylor approximations. Several extensions of this are discussed, including expansions of multinomial schemes depending on small parameters (p_1, ..., p_m) and expansions for exponents associated with iterates of a class of random operators...
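The sketch below (mine, with made-up 2×2 matrices) estimates such a (max,+) Lyapunov exponent γ(p) by Monte Carlo for a Bernoulli scheme that picks A1 with probability p and A0 otherwise; A0 is chosen so that it has a unique (max,+) eigenvector, in the spirit of the paper's key assumption, and the paper's series expansion would approximate the same curve near p = 0.

```python
import numpy as np

def gamma_estimate(p, n=5000, seed=0):
    # Monte Carlo estimate of the (max,+) Lyapunov exponent gamma(p) for an i.i.d.
    # Bernoulli(p) mixture of two matrices.  The (max,+) action on a vector is
    # (A (x) v)[i] = max_j (A[i, j] + v[j]); additive constants factor out, so we
    # renormalise at every step and accumulate the growth.
    rng = np.random.default_rng(seed)
    A0 = np.array([[2.0, 0.0], [1.0, 3.0]])   # hypothetical; has a unique (max,+) eigenvector
    A1 = np.array([[0.0, 5.0], [4.0, 1.0]])   # hypothetical perturbing matrix
    x, total = np.zeros(2), 0.0
    for _ in range(n):
        A = A1 if rng.random() < p else A0
        x = (A + x[None, :]).max(axis=1)      # x <- A (x) x
        m = x.max()
        total += m
        x -= m
    return total / n                          # -> gamma(p) as n grows

for p in (0.0, 0.1, 0.3, 0.5):
    print(p, gamma_estimate(p))               # gamma(0) is the max cycle mean of A0, here 3
```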
Asymptotics of the input-constrained binary symmetric channel capacity
 Annals of Applied Probability
, 2009
"... We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop ..."
Abstract

Cited by 6 (2 self)
 Add to MetaCart
We study the classical problem of noisy constrained capacity in the case of the binary symmetric channel (BSC), namely, the capacity of a BSC whose inputs are sequences chosen from a constrained set. Motivated by a result of Ordentlich and Weissman [In Proceedings of IEEE Information Theory Workshop (2004) 117–122], we derive an asymptotic formula (when the noise parameter is small) for the entropy rate of a hidden Markov chain, observed when a Markov chain passes through a BSC. Using this result, we establish an asymptotic formula for the capacity of a BSC with input process supported on an irreducible finite-type constraint, as the noise parameter tends to zero. 1. Introduction and background. Let X, Y be discrete random variables with alphabets 𝒳, 𝒴 and joint probability mass function p_{X,Y}(x, y) ≜ P(X = x, Y = y), x ∈ 𝒳, y ∈ 𝒴 (for notational simplicity, we write p(x, y) rather than p_{X,Y}(x, y), and similarly p(x), p(y) rather than p_X(x), p_Y(y), respectively, when it is clear from the context) …
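For orientation, a small numeric sketch (my own, not from the paper): the unconstrained BSC capacity 1 − h(ε), and the noiseless capacity of the (1, ∞)-RLL constraint (no two consecutive ones), a standard example of an irreducible finite-type constraint; the latter is the ε → 0 limit of the noisy constrained capacity, and the paper's asymptotic formula describes how that limit is approached.

```python
import numpy as np

def h2(eps):
    # binary entropy function, in bits
    return -eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps)

for eps in (0.01, 0.05, 0.1):
    print(eps, 1 - h2(eps))          # unconstrained BSC capacity

# Noiseless capacity of the (1, inf)-RLL constraint: log2 of the spectral radius of the
# constraint graph's adjacency matrix (the golden ratio).  The noisy constrained capacity
# converges to this value as eps -> 0.
adj = np.array([[1.0, 1.0], [1.0, 0.0]])
print(np.log2(np.max(np.abs(np.linalg.eigvals(adj)))))
```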
Analyticity of Entropy Rate in Families of Hidden Markov Chains
 Submitted to IEEE Trans. Inf. Theory
, 2005
"... We prove that under a mild positivity assumption the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. We give examples to show how this can fail in some cases. And we study two natural special classes of hidden Markov chains in more d ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
We prove that under a mild positivity assumption the entropy rate of a hidden Markov chain varies analytically as a function of the underlying Markov chain parameters. We give examples to show how this can fail in some cases, and we study two natural special classes of hidden Markov chains in more detail: binary hidden Markov chains with an unambiguous symbol, and binary Markov chains corrupted by binary symmetric noise. Finally, we show that under the positivity assumption the hidden Markov chain itself varies analytically, in a strong sense, as a function of the underlying Markov chain parameters.
One-dimensional finite range random walk in random medium and invariant measure equation
, 2007
Analyticity, Convergence and Convergence Rate of Recursive Maximum Likelihood Estimation in Hidden Markov Models
, 2009
"... This paper considers the asymptotic behavior of the recursive maximum likelihood estimation in hidden Markov models. The paper is focused on the analytic properties of the asymptotic loglikelihood and on the pointconvergence and convergence rate of the recursive maximum likelihood estimator. Using ..."
Abstract

Cited by 3 (1 self)
 Add to MetaCart
This paper considers the asymptotic behavior of recursive maximum likelihood estimation in hidden Markov models. The paper is focused on the analytic properties of the asymptotic log-likelihood and on the point convergence and convergence rate of the recursive maximum likelihood estimator. Using the principle of analytic continuation, the analyticity of the asymptotic log-likelihood is shown for analytically parameterized hidden Markov models. Relying on this fact and on some results from differential geometry (the Łojasiewicz inequality), the almost sure point convergence of the recursive maximum likelihood algorithm is demonstrated, and relatively tight bounds on the convergence rate are derived. As opposed to existing results on the asymptotic behavior of maximum likelihood estimation in hidden Markov models, the results of this paper are obtained without assuming that the log-likelihood function has an isolated maximum at which the Hessian is strictly negative definite.
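To make the recursion concrete, here is a crude sketch (not the paper's algorithm or assumptions): recursive maximum likelihood for a single parameter of a two-state chain observed through a binary symmetric channel, taking a stochastic gradient step on each incremental log-likelihood log p_θ(y_t | y_1..y_{t-1}); the gradient is replaced by a finite difference of two coupled filters, and the model, noise level, step sizes, and clipping range are all arbitrary choices.

```python
import numpy as np

def filter_step(pi, y, a, eps):
    # forward filter for a symmetric two-state chain (flip probability a) observed
    # through a BSC with crossover probability eps; returns the posterior and the
    # one-step predictive likelihood p(y_t | y_1..y_{t-1})
    A = np.array([[1 - a, a], [a, 1 - a]])
    B = np.array([[1 - eps, eps], [eps, 1 - eps]])
    joint = (pi @ A) * B[:, y]
    lik = joint.sum()
    return joint / lik, lik

rng = np.random.default_rng(1)
a_true, eps = 0.2, 0.1

x, ys = 0, []                                  # simulate data from the true model
for _ in range(20_000):
    if rng.random() < a_true:
        x = 1 - x
    ys.append(x if rng.random() > eps else 1 - x)

# recursive ML estimate of the flip probability a: finite-difference stochastic gradient
# ascent on the incremental log-likelihood (a stand-in for an exact tangent-filter recursion)
a_hat, delta = 0.4, 1e-4
pi1 = pi2 = np.array([0.5, 0.5])
for t, y in enumerate(ys, 1):
    pi1, l1 = filter_step(pi1, y, a_hat, eps)
    pi2, l2 = filter_step(pi2, y, a_hat + delta, eps)
    grad = (np.log(l2) - np.log(l1)) / delta
    a_hat = float(np.clip(a_hat + grad / t, 0.01, 0.49))   # decreasing step size 1/t
print(a_hat)                                   # should move from 0.4 toward a_true = 0.2
```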