Results 11–20 of 89
Increasing Propagation of Chaos for Mean Field Models
, 1999
"... Let ¯ (N) denote a meanfield measure with potential F . Asymptotic independence properties of the measure ¯ (N) are investigated. In particular, with H(\Deltaj¯) denoting relative entropy, if there exists a unique nondegenerate minimum of H(\Deltaj¯) \Gamma F (\Delta), then propagation of cha ..."
Abstract

Cited by 10 (0 self)
Let μ^(N) denote a mean-field measure with potential F. Asymptotic independence properties of the measure μ^(N) are investigated. In particular, with H(·|μ) denoting relative entropy, if there exists a unique nondegenerate minimum of H(·|μ) − F(·), then propagation of chaos holds for blocks of size o(N). Certain degenerate situations are also studied. The results are applied to the Langevin dynamics of a system of interacting particles, leading to a McKean-Vlasov limit.
Genealogies and Increasing Propagation of Chaos for Feynman-Kac and Genetic Models
 Annals of Applied Probability, no. 4, pp. 1166–1198
"... A pathvalued interacting particle systems model for the genealogical structure of genetic algorithms is presented. We connect the historical process and the distribution of the whole ancestral tree with a class of FeynmanKac formulae on path space. We also prove increasing and uniform versions o ..."
Abstract

Cited by 10 (4 self)
A path-valued interacting particle systems model for the genealogical structure of genetic algorithms is presented. We connect the historical process and the distribution of the whole ancestral tree with a class of Feynman-Kac formulae on path space. We also prove increasing and uniform versions of propagation of chaos for appropriate particle block sizes and time horizons, yielding what seems to be the first result of this type for this class of particle systems.
Relative Entropy and the multivariable multidimensional Moment Problem
 IEEE Trans. on Information Theory
"... Entropylike functionals on operator algebras have been studied since the pioneering work of von Neumann, Umegaki, Lindblad, and Lieb. The most wellknown are the von Neumann entropy I(ρ): = −trace(ρ log ρ) and a generalization of the KullbackLeibler distance S(ρσ): = trace(ρ log ρ − ρ log σ), re ..."
Abstract

Cited by 10 (4 self)
Entropy-like functionals on operator algebras have been studied since the pioneering work of von Neumann, Umegaki, Lindblad, and Lieb. The most well known are the von Neumann entropy I(ρ) := −trace(ρ log ρ) and a generalization of the Kullback-Leibler distance S(ρ‖σ) := trace(ρ log ρ − ρ log σ), referred to as quantum relative entropy and used to quantify the distance between states of a quantum system. The purpose of this paper is to explore I and S as regularizing functionals in seeking solutions to multivariable and multidimensional moment problems. It will be shown that extrema can be effectively constructed via a suitable homotopy. The homotopy approach leads naturally to a further generalization and a description of all the solutions to such moment problems. This is accomplished by a renormalization of a Riemannian metric induced by entropy functionals. As an application we discuss the inverse problem of describing power spectra which are consistent with second-order statistics, which has been the main motivation behind the present work.
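Both functionals above are straightforward to evaluate numerically via an eigendecomposition. A minimal sketch in plain NumPy (function names are mine, not from the paper) computes I(ρ) and S(ρ‖σ) for small full-rank density matrices; for commuting states they reduce to the classical Shannon entropy and Kullback-Leibler divergence:

```python
import numpy as np

def von_neumann_entropy(rho):
    """I(rho) = -trace(rho log rho), via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # use the convention 0 log 0 = 0
    return -np.sum(evals * np.log(evals))

def quantum_relative_entropy(rho, sigma):
    """S(rho||sigma) = trace(rho log rho - rho log sigma)."""
    def logm_psd(a):                      # matrix log of a positive matrix
        w, v = np.linalg.eigh(a)
        return v @ np.diag(np.log(w)) @ v.conj().T
    return float(np.real(np.trace(rho @ logm_psd(rho) - rho @ logm_psd(sigma))))

# For commuting (here: diagonal) states these reduce to the classical
# Shannon entropy and Kullback-Leibler divergence.
rho = np.diag([0.5, 0.5])
sigma = np.diag([0.9, 0.1])
print(von_neumann_entropy(rho))               # log 2 ≈ 0.6931
print(quantum_relative_entropy(rho, sigma))   # 0.5 log(25/9) ≈ 0.5108
```

The eigendecomposition route avoids general matrix-logarithm routines and is exact for Hermitian positive matrices.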
Outage behavior of discrete memoryless channels under channel estimation errors
 in Proc. of International Symposium on Information Theory and its Applications, ISITA 2006
, 2006
"... Classically, communication systems are designed assuming perfect channel state information at the receiver and/or transmitter. However, in many practical situations, only an estimate of the channel is available that differs from the true channel. We address this channel mismatch scenario by introduc ..."
Abstract

Cited by 8 (5 self)
Classically, communication systems are designed assuming perfect channel state information at the receiver and/or transmitter. However, in many practical situations only an estimate of the channel is available, and it differs from the true channel. We address this channel mismatch scenario by introducing the notion of estimation-induced outage capacity, for which we provide an associated coding theorem and its strong converse, assuming a discrete memoryless channel. The transmitter and receiver strive to construct codes that ensure reliable communication with a quality of service (QoS), in terms of achieving a target rate with small error probability, regardless of the accuracy of the channel estimate obtained during transmission. We illustrate our ideas via numerical simulations of transmissions over Ricean fading channels using a rate-limited feedback channel and maximum likelihood (ML) channel estimation. Our results provide intuitive insights on the impact of the channel estimate and the channel characteristics (SNR, Ricean K-factor, training sequence length, feedback rate, etc.) on the mean outage capacity.
Convergence of Markov chains in information divergence
 J. Theoret. Probab
, 2009
"... Abstract. Information theoretic methods are used to prove convergence in information divergence of reversible Markov chains. Also some ergodic theorems for information divergence are proved. 1. Introduction and ..."
Abstract

Cited by 6 (3 self)
Information-theoretic methods are used to prove convergence in information divergence of reversible Markov chains. Some ergodic theorems for information divergence are also proved.
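The phenomenon is easy to observe numerically (this toy example is my own, not the paper's proof technique): for a reversible chain, the information divergence of the marginal law to the stationary distribution is non-increasing step by step, by the data-processing inequality, and it tends to zero for an ergodic chain.

```python
import numpy as np

# Toy reversible chain: a birth-death chain, which satisfies detailed balance.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])        # stationary distribution of P

def info_divergence(p, q):
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

mu = np.array([1.0, 0.0, 0.0])           # start from a point mass
divs = []
for _ in range(20):
    divs.append(info_divergence(mu, pi))
    mu = mu @ P

# D(mu P^k || pi) is non-increasing in k and converges to 0 here.
assert all(divs[k + 1] <= divs[k] + 1e-12 for k in range(len(divs) - 1))
print(divs[0], divs[-1])                 # starts at log 4, ends near 0
```

Detailed balance (π_i P_ij = π_j P_ji) is easy to check by hand for this matrix, so the chain is reversible as the theorem requires.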
Refinements of the Gibbs conditioning principle
 Prob. Theory and Related Fields 104
, 1996
"... Refinements of Sanov's large deviations theorem lead via Csisz'ar's information theoretic identity to refinements of the Gibbs conditioning principle which are valid for blocks whose length increase with the length of the conditioning sequence. Sharp bounds on the growth of the block ..."
Abstract

Cited by 6 (0 self)
Refinements of Sanov's large deviations theorem lead, via Csiszár's information-theoretic identity, to refinements of the Gibbs conditioning principle which are valid for blocks whose length increases with the length of the conditioning sequence. Sharp bounds on the growth of the block length with the length of the conditioning sequence are derived. Extensions of Csiszár's triangle inequality and information-theoretic identity to the Markov chain setup lead to similar refinements in the Markov case. 1 Introduction. Throughout this paper, X_1, X_2, ... denotes a sequence of independent, identically distributed random variables, distributed over a Polish space (Σ, B_Σ) with common distribution P_X. Here, B_Σ denotes the Borel σ-field of Σ. Let L_n = (1/n) Σ_{i=1}^n δ_{X_i} denote the empirical measure of the sequence. Partially supported by NSF grant DMS-9209712 and by a US-Israel BSF grant. Partially supported by a US-Israel BSF grant and by the fund for ...
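On a finite alphabet, the empirical measure L_n and the relative entropy H(L_n | P_X) from the setup above are simple to simulate. A small sketch (the distribution P_X below is my own toy choice) showing H(L_n | P_X) → 0 as n grows:

```python
import numpy as np

rng = np.random.default_rng(0)
p_x = np.array([0.5, 0.3, 0.2])          # common distribution P_X (toy choice)

def empirical_measure(sample, k):
    """L_n = (1/n) * sum_i delta_{X_i} on the finite alphabet {0, ..., k-1}."""
    return np.bincount(sample, minlength=k) / len(sample)

def relative_entropy(q, p):
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

for n in (100, 10_000, 1_000_000):
    x = rng.choice(3, size=n, p=p_x)
    l_n = empirical_measure(x, 3)
    print(n, relative_entropy(l_n, p_x))  # H(L_n | P_X) -> 0 as n grows
```

By Sanov's theorem, the probability that L_n stays far from P_X in relative entropy decays exponentially in n, which is why the printed values shrink roughly like 1/n.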
Identification via Compressed Data
, 1998
"... We introduce and analyze a new coding problem for a correlated source (X n ; Y n ) 1 n=1 . The observer of X n can transmit data depending on X n at a prescribed rate R. Based on these data the observer of Y n tries to identify whether for some distortion measure ae (like the Hamming dis ..."
Abstract

Cited by 6 (3 self)
We introduce and analyze a new coding problem for a correlated source (X^n, Y^n)_{n=1}^∞. The observer of X^n can transmit data depending on X^n at a prescribed rate R. Based on these data, the observer of Y^n tries to identify whether, for some distortion measure ρ (like the Hamming distance), (1/n) ρ(X^n, Y^n) ≤ d, a prescribed fidelity criterion. We investigate, as functions of R and d, the exponents of two error probabilities: the probability of misacceptance and the probability of misrejection. Our analysis has led to a new method for proving converses. Its basis is "The Inherently Typical Subset Lemma". It goes considerably beyond the "Entropy Characterisation" of [2], the "Image Size Characterisation" of [3], and its extensions in [5]. It is conceivable that it has a strong impact on Multiuser Information Theory. Key words: Correlated source, identification with fidelity, misacceptance and misrejection error probabilities.
Asymptotic Normality of the Posterior in Relative Entropy
 IEEE Trans. Inform. Theory
, 1999
"... We show that the relative entropy between a posterior density formed from a smooth likelihood and prior and a limiting normal form tends to zero in the independent and identically distributed case. The mode of convergence is in probability and in mean. Applications to codelengths in stochastic compl ..."
Abstract

Cited by 6 (0 self)
We show that the relative entropy between a posterior density, formed from a smooth likelihood and prior, and a limiting normal form tends to zero in the independent and identically distributed case. The mode of convergence is in probability and in mean. Applications to codelengths in stochastic complexity and to sample size selection are briefly discussed. Index Terms: Posterior density, asymptotic normality, relative entropy. Revision submitted to Trans. Inform. Theory, 22 May 1998. This research was partially supported by NSERC Operating Grant 554891. The author is with the Department of Statistics, University of British Columbia, Room 333, 6356 Agricultural Road, Vancouver, BC, Canada V6T 1Z2.
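The convergence can be previewed in a conjugate toy model (my own choice, not the paper's general setting): under a uniform prior, the posterior of a Bernoulli parameter is a Beta density, and its relative entropy to a moment-matched normal, computed by grid integration, shrinks as the sample size grows.

```python
import numpy as np
from math import lgamma

def beta_logpdf(theta, a, b):
    log_norm = lgamma(a) + lgamma(b) - lgamma(a + b)
    return (a - 1) * np.log(theta) + (b - 1) * np.log1p(-theta) - log_norm

def kl_posterior_to_normal(n, s, a=1.0, b=1.0):
    """KL( Beta(a+s, b+n-s) || moment-matched normal ), by grid integration."""
    ap, bp = a + s, b + n - s
    mean = ap / (ap + bp)
    var = ap * bp / ((ap + bp) ** 2 * (ap + bp + 1.0))
    theta = np.linspace(1e-6, 1.0 - 1e-6, 200_001)
    dx = theta[1] - theta[0]
    log_p = beta_logpdf(theta, ap, bp)
    log_q = -0.5 * np.log(2 * np.pi * var) - (theta - mean) ** 2 / (2 * var)
    p = np.exp(log_p)
    return float(np.sum(p * (log_p - log_q)) * dx)

# With s = n/2 observed successes, the posterior is Beta(n/2+1, n/2+1), and
# its divergence to the normal approximation shrinks as n grows.
for n in (10, 100, 1000):
    print(n, kl_posterior_to_normal(n, n // 2))
```

The moment-matched normal here stands in for the limiting normal form of the theorem; the decay of the printed values illustrates the convergence-in-mean statement in this special case.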
FROM A LARGE-DEVIATIONS PRINCIPLE TO THE WASSERSTEIN GRADIENT FLOW: A NEW MICRO-MACRO PASSAGE
, 2010
"... We study the connection between a system of many independent Brownian particles on one hand and the deterministic diffusion equation on the other. For a fixed time step h> 0, a largedeviations rate functional Jh characterizes the behaviour of the particle system at t = h in terms of the initial ..."
Abstract

Cited by 5 (5 self)
We study the connection between a system of many independent Brownian particles on the one hand and the deterministic diffusion equation on the other. For a fixed time step h > 0, a large-deviations rate functional J_h characterizes the behaviour of the particle system at t = h in terms of the initial distribution at t = 0. For the diffusion equation, a single step in the time-discretized entropy-Wasserstein gradient flow is characterized by the minimization of a functional K_h. We establish a new connection between these systems by proving that J_h and K_h are equal up to second order in h as h → 0. This result gives a microscopic explanation of the origin of the entropy-Wasserstein gradient flow formulation of the diffusion equation. Simultaneously, the limit passage presented here gives a physically natural description of the underlying particle system by describing it as an entropic gradient flow.
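The micro-macro connection can be seen directly in simulation (an illustration only, not the rate-functional argument of the paper): the empirical law of N independent Brownian particles at time h tracks the heat-kernel solution of the diffusion equation.

```python
import numpy as np

# N independent Brownian particles, all started at the origin.  At time h the
# positions are i.i.d. N(0, h), and their empirical density should track the
# heat-kernel solution of the diffusion equation rho_t = (1/2) rho_xx.
rng = np.random.default_rng(1)
N, h = 200_000, 0.5
positions = rng.normal(0.0, np.sqrt(h), size=N)

edges = np.linspace(-4.0, 4.0, 81)
hist, _ = np.histogram(positions, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
rho = np.exp(-centers**2 / (2 * h)) / np.sqrt(2 * np.pi * h)

print(np.max(np.abs(hist - rho)))        # small: empirical law ~ heat kernel
```

The paper's contribution is the finer statement that the large-deviations cost of a deviation from this limit agrees, to second order in h, with the entropy-Wasserstein functional K_h; the sketch only shows the zeroth-order limit passage.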