Results 1-10 of 18,490
PERFORMANCE-DRIVEN ENTROPIC INFORMATION FUSION
"... Advances in technology have resulted in acquisition and subsequent fusion of data from multiple sensors of possibly different modalities. Fusing data acquired from different sensors occurs near the front end of sensing systems and therefore can become a critical bottleneck. It is therefore crucial t ..."
Abstract
 Add to MetaCart
to quantify the performance of sensor fusion. Information fusion involves estimating and optimizing an information criterion over a transformation that maps data from from one sensor data to another. It is crucial to the task of fusion to estimate divergence to a high degree of accuracy and to quantify error
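The snippet above turns on estimating a divergence between sensor data distributions to a high degree of accuracy. As a minimal illustration of the idea (not the paper's estimator), the following sketch computes a plug-in Kullback-Leibler divergence from histograms of two hypothetical symbol-valued sensor streams; all names and data here are invented for the example.

```python
import math
import random

def kl_divergence(p, q, eps=1e-12):
    """Plug-in Kullback-Leibler divergence D(p || q) in nats between two
    discrete distributions given as dicts {symbol: probability}."""
    return sum(pi * math.log((pi + eps) / (q.get(s, 0.0) + eps))
               for s, pi in p.items() if pi > 0)

def empirical_dist(samples):
    """Histogram (maximum-likelihood) estimate of a discrete distribution."""
    counts = {}
    for s in samples:
        counts[s] = counts.get(s, 0) + 1
    n = len(samples)
    return {s: c / n for s, c in counts.items()}

random.seed(0)
# Two hypothetical "sensors" emitting the same symbols with different biases.
sensor_a = random.choices("abc", weights=[5, 3, 2], k=10_000)
sensor_b = random.choices("abc", weights=[2, 3, 5], k=10_000)

d = kl_divergence(empirical_dist(sensor_a), empirical_dist(sensor_b))
print(f"estimated D(A || B) = {d:.3f} nats")
```

The plug-in estimator is the simplest choice; it is biased for small samples, which is exactly why quantifying estimation error matters in this setting.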
Information Theory and Statistics, 1968
"... Entropy and relative entropy are proposed as features extracted from symbol sequences. Firstly, a proper Iterated Function System is driven by the sequence, producing a fractaMike representation (CSR) with a low computational cost. Then, two entropic measures are applied to the CSR histogram of th ..."
Abstract

Cited by 1803 (2 self)
 Add to MetaCart
Entropy and relative entropy are proposed as features extracted from symbol sequences. Firstly, a proper Iterated Function System is driven by the sequence, producing a fractaMike representation (CSR) with a low computational cost. Then, two entropic measures are applied to the CSR histogram
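The abstract describes extracting entropy and relative entropy as features of a symbol sequence. A minimal sketch of those two measures, assuming plain symbol histograms rather than the paper's CSR representation:

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy (bits) of the symbol histogram of a sequence."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def relative_entropy(seq, ref_seq, eps=1e-12):
    """Relative entropy (bits) of seq's histogram w.r.t. ref_seq's."""
    p, q = Counter(seq), Counter(ref_seq)
    n, m = len(seq), len(ref_seq)
    return sum((c / n) * math.log2((c / n) / (q.get(s, 0) / m + eps))
               for s, c in p.items())

h = shannon_entropy("ABABABAB")  # two equiprobable symbols -> 1 bit
print(h)
```

Used as features, both quantities reduce a sequence of arbitrary length to a few numbers characterizing its symbol statistics.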
Entropic bounds and continual measurements
Quantum Probability and Infinite Dimensional Analysis, Quantum Probability Series QP-PQ Vol. 20 (World Scientific), 2007
"... Some bounds on the entropic informational quantities related to a quantum continual measurement are obtained and the time dependencies of these quantities are studied. 1 ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
Some bounds on the entropic informational quantities related to a quantum continual measurement are obtained and the time dependencies of these quantities are studied. 1
Entropic Priors, 1991
"... : Entropic priors assign probabilities by combining in an inseparable way the information theoretic concept of entropy with the underlying Riemannian geometry of the hypothesis space. These priors form the cornerstone of a developing new and more objective Bayesian theory of inference. Contents 1 I ..."
Abstract

Cited by 7 (0 self)
 Add to MetaCart
: Entropic priors assign probabilities by combining in an inseparable way the information theoretic concept of entropy with the underlying Riemannian geometry of the hypothesis space. These priors form the cornerstone of a developing new and more objective Bayesian theory of inference. Contents 1
Entropic Priors, 2003
"... The method of Maximum (relative) Entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiment ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
. For experiments that cannot be repeated the resulting “entropic prior” is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. As an example
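The entropic-prior construction builds on the method of Maximum Entropy. As a generic illustration of ME (not this paper's entropic prior itself), the sketch below finds the maximum-entropy distribution on a finite alphabet subject to a mean constraint, solving for the Lagrange multiplier by bisection; the die-with-mean-4.5 setup is the textbook Jaynes example, not taken from the paper.

```python
import math

def max_entropy_dist(values, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution on a finite set `values` subject to a
    mean constraint: p_i proportional to exp(lam * x_i), with the Lagrange
    multiplier lam found by bisection (the mean is monotone in lam)."""
    def mean_for(lam):
        w = [math.exp(lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# Die faces constrained to an average of 4.5 (higher faces get more weight).
p = max_entropy_dist([1, 2, 3, 4, 5, 6], 4.5)
print([round(pi, 3) for pi in p])
```

Among all distributions matching the constraint, this one adds no information beyond it, which is the intuition the abstract carries over from statistical mechanics to prior selection.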
Entropic Dynamics
In: Bayesian Inference and Maximum Entropy Methods in Science and Engineering, 2002
"... I explore the possibility that the laws of physics might be laws of inference rather than laws of nature. What sort of dynamics can one derive from wellestablished rules of inference? Specifically, I ask: Given relevant information codified in the initial and the final states, what trajectory is th ..."
Abstract

Cited by 3 (1 self)
 Add to MetaCart
I explore the possibility that the laws of physics might be laws of inference rather than laws of nature. What sort of dynamics can one derive from wellestablished rules of inference? Specifically, I ask: Given relevant information codified in the initial and the final states, what trajectory
Conditional Information Inequalities for Entropic and Almost Entropic Points, 2013
"... We study conditional linear information inequalities, i.e., linear inequalities for Shannon entropy that hold for distributions whose joint entropies meet some linear constraints. We prove that some conditional information inequalities cannot be extended to any unconditional linear inequalities. Som ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
. Some of these conditional inequalities hold for almost entropic points, while others do not. We also discuss some counterparts of conditional information inequalities for Kolmogorov complexity.
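An unconditional linear information inequality of the kind the abstract contrasts with conditional ones can be checked numerically. The sketch below verifies the basic inequality H(X) + H(Y) >= H(X,Y) (equivalently, non-negativity of mutual information) on random joint distributions; it illustrates the objects involved, not the paper's results.

```python
import math
import random

def H(dist):
    """Shannon entropy (bits) of a distribution given as a list of probs."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def marginals(joint):
    """Row and column marginals of a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return px, py

random.seed(1)
for _ in range(1000):
    # Random 3x3 joint distribution.
    w = [[random.random() for _ in range(3)] for _ in range(3)]
    total = sum(sum(row) for row in w)
    joint = [[v / total for v in row] for row in w]
    px, py = marginals(joint)
    hxy = H([v for row in joint for v in row])
    # Unconditional inequality: H(X) + H(Y) >= H(X,Y), i.e. I(X;Y) >= 0.
    assert H(px) + H(py) >= hxy - 1e-9
print("inequality held on 1000 random joints")
```

Conditional inequalities, by contrast, only need to hold on the subset of entropy vectors satisfying the stated linear constraints, which is what makes their extendability to unconditional form a nontrivial question.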