Uniform test of algorithmic randomness over a general space
Theoretical Computer Science, 2005
Cited by 36 (4 self)
The algorithmic theory of randomness is well developed when the underlying space is the set of finite or infinite sequences and the underlying probability distribution is the uniform distribution or a computable distribution. These restrictions seem artificial. Some progress has been made to extend the theory to arbitrary Bernoulli distributions (by Martin-Löf) and to arbitrary distributions (by Levin). We recall the main ideas and problems of Levin's theory and report further progress in the same framework. The issues are the following:
– Allow noncompact spaces (like the space of continuous functions, underlying Brownian motion).
– The uniform test (deficiency of randomness) d_P(x) (depending both on the outcome x and the measure P) should be defined in a general and natural way.
– See which of the old results survive: existence of universal tests, conservation of randomness, expression of tests in terms of description complexity, existence of a universal measure, expression of mutual information as "deficiency of independence".
– The negative of the new randomness test is shown to be a generalization of complexity in continuous spaces; we show that the addition theorem survives.
The paper's main contribution is introducing an appropriate framework for studying these questions and related ones (like statistics for a general family of distributions).
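As a sketch of the central object, a uniform test in the sense described above can be written as follows (the notation here is illustrative, following common algorithmic-randomness conventions, and need not match the paper's exactly):

```latex
% A lower semicomputable function t(x, P) \ge 0 is a uniform test if,
% for every measure P in the class considered,
\int t(x, P)\, dP(x) \;\le\; 1 .
% The deficiency of randomness is then the logarithm of the test value,
d_P(x) \;=\; \log t(x, P),
% so that P-typical outcomes have small deficiency, while atypical
% outcomes are flagged by a large value of d_P(x).
```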
Information Distance
1997
Cited by 36 (4 self)
While Kolmogorov complexity is the accepted absolute measure of information content in an individual finite object, a similarly absolute notion is needed for the information distance between two individual objects, for example, two pictures. We give several natural definitions of a universal information metric, based on the length of shortest programs for either ordinary computations or reversible (dissipationless) computations. It turns out that these definitions are equivalent up to an additive logarithmic term. We show that the information distance is a universal cognitive similarity distance. We investigate the maximal correlation of the shortest programs involved, the maximal uncorrelation of programs (a generalization of the Slepian-Wolf theorem of classical information theory), and the density properties of the discrete metric spaces induced by the information distances. A related distance measures the amount of nonreversibility of a computation. Using the physical theo...
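A minimal sketch of the flavor of these definitions, assuming the usual conditional Kolmogorov complexity K(· | ·); the paper's exact formulations may differ by additive logarithmic terms:

```latex
% "Max distance": up to an additive logarithmic term, the length of the
% shortest program that transforms x into y and y into x is
E(x, y) \;=\; \max\{\, K(x \mid y),\; K(y \mid x) \,\} .
% Up to such terms this behaves as a metric: it is symmetric by
% construction and satisfies the triangle inequality.
```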
Algorithmic Theories Of Everything
2000
Cited by 32 (15 self)
The probability distribution P from which the history of our universe is sampled represents a theory of everything, or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lacking a short description, and study the spectrum of TOEs spanned by two Ps: one reflecting the most compact constructive descriptions, the other the fastest way of computing everything. The former derives from generalizations of traditional computability, Solomonoff's algorithmic probability, Kolmogorov complexity, and objects more random than Chaitin's Omega; the latter from Levin's universal search and a natural resource-oriented postulate: the cumulative prior probability of all x incomputable within time t by this optimal algorithm should be 1/t. Between both Ps we find a universal cumulatively enumerable measure (CEM) that dominates traditional enumerable measures; any such CEM must assign low probability to any universe lacking a short enumerating program. We derive P-specific consequences for evolving observers, inductive reasoning, quantum physics, philosophy, and the expected duration of our universe.
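The resource-oriented postulate mentioned above can be sketched in symbols (illustrative notation; t(x) here stands for the time the optimal algorithm needs before x is computed):

```latex
% The cumulative prior probability of all universes x not yet computed
% within time t should fall off like 1/t:
\sum_{x \,:\, t(x) > t} P(x) \;\le\; \frac{1}{t} .
% Fast-computable universes therefore carry almost all of the prior mass.
```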
What is Complexity? The philosophy of complexity per se, with application to some examples in evolution.
Cited by 32 (6 self)
It is argued that complexity has only a limited use as a paradigm against reductionist approaches and that it has a much richer potential as a comparable property. What complexity can usefully be said to be a property of is discussed. It is argued that it is unlikely to have any useful value as applied to "real" objects or systems, and further that even relativising it to an observer has problems. It is proposed that complexity can usefully be applied only to constructions within a given language. It is argued that complexity is usefully differentiated from the concepts of size, ignorance, variety, minimum description length and order. A definition of complexity is proposed which can be summarised as "that property of a language expression which makes it difficult to formulate its overall behaviour even when given almost complete information about its atomic components and their interrelations". Some of the consequences of this definition are discussed. It is shown that...
On causally asymmetric versions of Occam’s Razor and their relation to thermodynamics
2007
Information and Entropy in Quantum Theory
2004
Cited by 7 (5 self)
Recent developments in quantum computing have revived interest in the notion of information as a foundational principle in physics. It has been suggested that information provides a means of interpreting quantum theory and a means of understanding the role of entropy in thermodynamics. The thesis presents a critical examination of these ideas, and contrasts the use of Shannon information with the concept of 'active information' introduced by Bohm and Hiley. We look at certain thought experiments based upon the 'delayed choice' and 'quantum eraser' interference experiments, which present a complementarity between information gathered from a quantum measurement and interference effects. It has been argued that these experiments show the Bohm interpretation of quantum theory is untenable. We demonstrate that these experiments depend critically upon the assumption that a quantum optics device can operate as a measuring device, and show that, in the context of these experiments, it cannot be consistently understood in this way. By contrast, we then show how the notion of 'active information' in the Bohm interpretation provides a coherent explanation of the phenomena shown in these experiments. We then examine the relationship between information and entropy. The thought experiment...
The Boltzmann Entropy and Randomness Tests
1994
Cited by 5 (0 self)
In the context of the dynamical systems of classical mechanics, we introduce two new notions called "algorithmic fine-grain and coarse-grain entropy". The fine-grain algorithmic entropy is, on the one hand, a simple variant of the randomness tests of Martin-Löf (and others) and is, on the other hand, a connecting link between description (Kolmogorov) complexity, Gibbs entropy and Boltzmann entropy.
Entropy of Scene Visibility
1999
Cited by 4 (3 self)
We propose a new approach, based on information theory, to study the visibility of a scene. We define the concepts of entropy and mutual information applied to 3D scene visibility. Mainly, we analyze the concept of entropy (or randomness) of scene visibility and we examine the relationship between the entropy of scene visibility and the expected value of the mean square error for all form factors. Next, these concepts are applied to diverse sample scenes and the accuracy of the values presented is analyzed.
Key Words: Rendering, Radiosity, Monte Carlo, Information Theory, Entropy
1 Introduction. In this paper, the visibility of a scene, which is directly related to form factors [1], is analyzed from the viewpoint of information theory. In many different fields, the concept of entropy has been studied at length and has been used as a starting point in order to study complexity [6, 10, 18, 19]. In our case, we study the entropy of scene visibility and leave the study of scene com...
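As an illustrative sketch (not the paper's actual estimator), visibility entropy can be computed by treating a patch's normalized form factors as a probability distribution over where radiated light lands; `visibility_entropy` and the sample values below are hypothetical:

```python
import math

def visibility_entropy(form_factors):
    """Shannon entropy (in bits) of a normalized form-factor distribution.

    `form_factors` is a hypothetical list of form factors F_ij from one
    patch i to every other patch j; in a closed scene they sum to 1 and
    can be read as probabilities.
    """
    total = sum(form_factors)
    probs = [f / total for f in form_factors if f > 0]  # normalize, drop zeros
    return -sum(p * math.log2(p) for p in probs)

# A patch that "sees" four patches equally is maximally unpredictable:
print(visibility_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
# A patch dominated by one large occluder is far more predictable:
print(visibility_entropy([0.97, 0.01, 0.01, 0.01]))  # well under 1 bit
```

High entropy corresponds to visibility that is hard to predict (many equally likely destinations), which is the sense of "randomness of scene visibility" in the abstract.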
Philosophical Issues in Kolmogorov Complexity
In Proceedings on Automata, Languages and Programming (ICALP '92)
1992
Cited by 3 (0 self)
this article at a conceptual level, it is sufficient to know that the Kolmogorov complexity of a finite string x ...
Unpredictability, Information, and Chaos
1997
Cited by 3 (0 self)
A source of unpredictability is equivalent to a source of information: unpredictability means not knowing which of a set of alternatives is the actual one; determining the actual alternative yields information. The degree of unpredictability is neatly quantified by the information measure introduced by Shannon. This perspective is applied to three kinds of unpredictability in physics: the absolute unpredictability of quantum mechanics, the unpredictability of the coarse-grained future due to classical chaos, and the unpredictability of open systems. The incompatibility of the first two of these is the root of the difficulty in defining quantum chaos, whereas the unpredictability of open systems, it is suggested, can provide a unified characterization of chaos in classical and quantum dynamics.
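The Shannon measure referred to here is the standard entropy of the distribution over alternatives:

```latex
% p_i is the probability of the i-th alternative; H is measured in bits.
H \;=\; -\sum_i p_i \log_2 p_i .
% H is maximal when all alternatives are equally likely (maximal
% unpredictability) and zero when one alternative is certain.
```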