Results 1–10 of 4,790
Shannon Information and Kolmogorov Complexity
, 2010
Abstract

Cited by 1 (1 self)
The elementary theories of Shannon information and Kolmogorov complexity are compared: the extent to which they have a common purpose, and where they are fundamentally different. The focus is on: Shannon entropy versus Kolmogorov complexity, the relation of both to universal coding, and Shannon mutual information ...
Kolmogorov Complexity and Information Theory - With An Interpretation ...
Abstract

Cited by 13 (2 self)
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, and Shannon mutual information ...
The information bottleneck method
 University of Illinois
, 1999
Abstract

Cited by 545 (38 self)
We define the relevant information in a signal x ∈ X as being the information that this signal provides about another signal y ∈ Y. Examples include the information that face images provide about the names of the people portrayed, or the information that speech sounds provide about the words spoken.
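The "relevant information" described in this abstract is the Shannon mutual information I(X; Y). A minimal sketch of computing it from a discrete joint distribution (the function name and the toy table are ours, not the paper's):

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint probability table p_xy[i, j] = P(X=i, Y=j)."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal P(X), column vector
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal P(Y), row vector
    mask = p_xy > 0                         # skip zero cells: 0 * log 0 := 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# X fully determines Y, so I(X;Y) = H(X) = 1 bit
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(mutual_information(joint))  # → 1.0
```

For independent X and Y the same function returns 0, matching the intuition that an independent signal carries no relevant information.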
Graphical models, exponential families, and variational inference
, 2008
Abstract

Cited by 800 (26 self)
The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields ...
Planning Algorithms
, 2004
Abstract

Cited by 1108 (51 self)
This book presents a unified treatment of many different kinds of planning algorithms. The subject lies at the crossroads between robotics, control theory, artificial intelligence, algorithms, and computer graphics. The particular subjects covered include motion planning, discrete planning, planning under uncertainty, sensor-based planning, visibility, decision-theoretic planning, game theory, information spaces, reinforcement learning, nonlinear systems, trajectory planning, nonholonomic planning, and kinodynamic planning.
Cognitive Radio: Brain-Empowered Wireless Communications
, 2005
Abstract

Cited by 1479 (4 self)
Cognitive radio is viewed as a novel approach for improving the utilization of a precious natural resource: the radio electromagnetic spectrum. The cognitive radio, built on a software-defined radio, is defined as an intelligent wireless communication system that is aware of its environment and uses the methodology of understanding-by-building to learn from the environment and adapt to statistical variations in the input stimuli, with two primary objectives in mind:
• highly reliable communication whenever and wherever needed;
• efficient utilization of the radio spectrum.
Following the discussion of interference temperature as a new metric for the quantification and management of interference, the paper addresses three fundamental cognitive tasks: 1) radio-scene analysis; 2) channel-state estimation and predictive modeling; 3) transmit-power control and dynamic spectrum management. This paper also discusses the emergent behavior of cognitive radio.
Near Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
, 2004
Abstract

Cited by 1513 (20 self)
Suppose we are given a vector f in R^N. How many linear measurements do we need to make about f to be able to recover f to within precision ε in the Euclidean (ℓ2) metric? Or more exactly, suppose we are interested in a class F of such objects (discrete digital signals, images, etc.); how many linear measurements do we need to recover objects from this class to within accuracy ε? This paper shows that if the objects of interest are sparse or compressible in the sense that the reordered entries of a signal f ∈ F decay like a power-law (or if the coefficient sequence of f in a fixed basis decays like a power-law), then it is possible to reconstruct f to within very high accuracy from a small number of random measurements. A typical result is as follows: we rearrange the entries of f (or its coefficients in a fixed basis) in decreasing order of magnitude |f|(1) ≥ |f|(2) ≥ ... ≥ |f|(N), and define the weak-ℓp ball as the class F of those elements whose entries obey the power decay law |f|(n) ≤ C · n^(−1/p). We take measurements ⟨f, X_k⟩, k = 1, ..., K, where the X_k are N-dimensional Gaussian ...
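The phenomenon the abstract describes, recovering a sparse f from far fewer than N random Gaussian measurements, can be tried numerically. The sketch below is ours, not the paper's: it substitutes orthogonal matching pursuit (a simple greedy decoder) for the ℓ1-type decoders the paper analyzes, and the dimensions N, K, s are arbitrary illustrative choices.

```python
import numpy as np

def omp(X, b, s):
    """Greedy sparse recovery: pick the column of X best correlated with the
    residual, refit by least squares on the selected support, repeat s times."""
    residual, support = b.copy(), []
    for _ in range(s):
        support.append(int(np.argmax(np.abs(X.T @ residual))))
        coef, *_ = np.linalg.lstsq(X[:, support], b, rcond=None)
        residual = b - X[:, support] @ coef
    f_rec = np.zeros(X.shape[1])
    f_rec[support] = coef
    return f_rec

rng = np.random.default_rng(0)
N, K, s = 100, 50, 4                       # ambient dim, measurements, sparsity
f = np.zeros(N)                            # an exactly s-sparse signal
f[rng.choice(N, size=s, replace=False)] = rng.standard_normal(s)
X = rng.standard_normal((K, N))            # the Gaussian X_k, stacked as rows
f_rec = omp(X, X @ f, s)                   # recover f from K << N measurements
print(np.max(np.abs(f_rec - f)))           # near-zero reconstruction error
```

With K = 50 Gaussian measurements of a 4-sparse vector in R^100, recovery is exact up to numerical precision; shrinking K toward s shows where recovery starts to fail.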
Some equivalences between Shannon entropy and Kolmogorov complexity
 IEEE Transactions on Information Theory
, 1978
Abstract

Cited by 35 (0 self)
... that the average codeword length L1:1 for the best one-to-one (not necessarily uniquely decodable) code for X is shorter than the average codeword length L_UD for the best uniquely decodable code by no more than (log2 log2 n) + 3. Let Y be a random variable taking on a finite or countable number of values and having entropy H. Then it is proved that L1:1 > H − log2(H+1) − log2 log2(H+1) − ... − 6. Some relations are established among the Kolmogorov, Chaitin, and extension complexities. Finally it is shown that, for all computable probability distributions, the universal prefix codes associated ...
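The gap between H and L1:1 is easy to see numerically. The sketch below is our illustration, under the standard convention that the best one-to-one code gives the i-th most probable symbol the i-th shortest non-empty binary string, of length floor(log2(i + 1)):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def best_one_to_one_length(p):
    """Average length L_{1:1} of the best one-to-one code: sort probabilities
    in decreasing order; the i-th symbol gets a string of length
    floor(log2(i + 1))  ("0", "1", "00", "01", ...)."""
    return sum(pi * math.floor(math.log2(i + 1))
               for i, pi in enumerate(sorted(p, reverse=True), start=1))

p = [2.0 ** -i for i in range(1, 16)]      # near-geometric distribution
p[-1] *= 2                                 # normalize so probabilities sum to 1
H, L = entropy(p), best_one_to_one_length(p)
print(H, L)                                # L falls below H, as the bound allows
```

Because one-to-one codes need not be uniquely decodable, L1:1 can drop below the entropy H; the abstract's bound says it cannot drop by much more than log2(H+1) plus iterated-log terms.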
Minimum Message Length and Kolmogorov Complexity
 Computer Journal
, 1999
Abstract

Cited by 125 (29 self)
... this paper is to describe some of the relationships among the different streams and to try to clarify some of the important differences in their assumptions and development. Other studies mentioning the relationships appear in [1, Section IV, pp. 1038–1039], [2, sections 5.2, 5.5] and [3, p. 465].