Results 1–10 of 18
On the Algorithmic Nature of the World
Abstract

Cited by 15 (12 self)
We propose a test based on the theory of algorithmic complexity and an experimental evaluation of Levin’s universal distribution to identify evidence in support of or in contravention of the claim that the world is algorithmic in nature. To this end we have undertaken a statistical comparison of the frequency distributions of data from physical sources on the one hand – repositories of information such as images, data stored on a hard drive, computer programs and DNA sequences – and, on the other, the frequency distributions produced by computing devices such as Turing machines, cellular automata and Post tag systems. Statistical correlations were found and their significance measured.
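The comparison described in this abstract can be sketched in miniature: tabulate the frequency distribution of short binary substrings in two data streams and measure their rank correlation. This is a toy illustration on synthetic data, not the authors' actual pipeline; the choice of substring length, the two example streams, and the use of Spearman correlation are all assumptions made here for demonstration.

```python
import random
from collections import Counter

def substring_distribution(bits: str, k: int = 4) -> Counter:
    """Frequency of every length-k substring in a bit string."""
    return Counter(bits[i:i+k] for i in range(len(bits) - k + 1))

def spearman(xs, ys):
    """Spearman rank correlation of two equal-length lists.
    Ties are broken arbitrarily -- adequate for a rough sketch."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Two toy "sources": a highly structured stream and a pseudo-random one.
structured = "01" * 500
random.seed(0)
noise = "".join(random.choice("01") for _ in range(1000))

d1 = substring_distribution(structured)
d2 = substring_distribution(noise)
keys = sorted(set(d1) | set(d2))
rho = spearman([d1.get(s, 0) for s in keys], [d2.get(s, 0) for s in keys])
print(f"Spearman rho between the two distributions: {rho:.3f}")
```

The paper's actual comparison uses real repositories (images, DNA, programs) against the output distributions of exhaustively enumerated abstract machines; the sketch only shows the shape of the statistic involved.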
Complexity and information: Measuring emergence, self-organization, and homeostasis at multiple scales
 Complexity
Towards a stable definition of Kolmogorov-Chaitin complexity
, 2008
Abstract

Cited by 8 (6 self)
Although information content is invariant up to an additive constant, the range of possible additive constants applicable to programming languages is so large that in practice it plays a major role in the actual evaluation of K(s), the Kolmogorov-Chaitin complexity of a string s. Some attempts have been made to arrive at a framework stable enough for a concrete definition of K, independent of any constant under a programming language, by appealing to the naturalness of the language in question. The aim of this paper is to present an approach to overcoming the problem by looking at a set of models of computation converging in output probability distribution, such that naturalness can be inferred, thereby providing a framework for a stable definition of K under the set of convergent models of computation.
Numerical Evaluation of Algorithmic Complexity for Short Strings: A Glance into the Innermost Structure of Randomness
, 2011
Abstract

Cited by 8 (4 self)
We describe a method that combines several theoretical and experimental results to numerically approximate the algorithmic (Kolmogorov-Chaitin) complexity of all ∑_{n=1}^{8} 2^n = 510 bit strings of up to 8 bits, and some bit strings between 9 and 16 bits. This is done by an exhaustive execution of all deterministic 2-symbol Turing machines with up to 4 states for which the halting times are known thanks to the busy beaver problem – 11,019,960,576 machines in all. An output frequency distribution is then computed, from which the algorithmic probability is calculated and the algorithmic complexity evaluated by way of the (Levin-Chaitin) coding theorem.
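The coding-theorem step at the end of this abstract is simple to sketch: given output-frequency counts over an ensemble of halting machines, a string's algorithmic probability m(s) is its share of all outputs, and its complexity is estimated as K(s) ≈ −log₂ m(s). The frequency counts below are invented for illustration; the paper's real counts come from exhaustively running all 11,019,960,576 machines.

```python
from math import log2

# Hypothetical output-frequency counts over some small machine ensemble.
# (Invented numbers -- the paper derives real counts from exhaustive runs.)
freq = {"0": 400, "1": 400, "01": 60, "10": 60, "00": 30, "11": 30,
        "0101": 12, "0110": 8}

total = sum(freq.values())

def K_estimate(s: str) -> float:
    """Coding-theorem estimate: K(s) ~= -log2 m(s), with m(s) = freq[s]/total."""
    return -log2(freq[s] / total)

# Frequently produced strings get low complexity, rare ones high complexity.
for s in sorted(freq, key=K_estimate):
    print(f"{s:>6}  m = {freq[s]/total:.4f}  K ~= {K_estimate(s):.2f} bits")
```

Note how the ranking by estimated K is exactly the reverse ranking by output frequency, which is the relation the coding theorem formalizes.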
Correspondence and Independence of Numerical Evaluations of Algorithmic Information Measures, Computability
Abstract

Cited by 6 (6 self)
We show that real-value approximations of Kolmogorov-Chaitin complexity (Km), calculated via the algorithmic coding theorem from the output frequency of a large set of small deterministic Turing machines with up to 5 states (and 2 symbols), are in agreement with the number of instructions used by the Turing machines producing a string s, which is consistent with strict integer-value program-size complexity. Nevertheless, Km proves to be a finer-grained measure and a potential alternative to lossless compression algorithms for small entities, where compression fails. We also show that neither Km nor the number of instructions used shows any correlation with Bennett’s logical depth LD(s) other than what is predicted by the theory. The agreement between theory and numerical calculations shows that despite the undecidability of these theoretical measures, approximations are stable and meaningful, even for small programs and for short strings. We also announce a first beta version of an Online Algorithmic Complexity Calculator (OACC), based on a combination of theoretical concepts, as a numerical implementation of the Coding Theorem Method.
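The remark about compression failing for small entities is easy to demonstrate: general-purpose lossless compressors carry fixed header overhead, so for strings a few bytes long the "compressed" form is larger than the input and cannot rank strings by complexity. A quick check with Python's zlib (the choice of compressor here is an illustration, not the paper's method):

```python
import zlib

# Three 8-byte strings of visibly different regularity.
for s in [b"00000000", b"01010101", b"01101001"]:
    comp = zlib.compress(s, 9)
    print(f"{s!r}: original {len(s)} bytes, compressed {len(comp)} bytes")
# For inputs this short, header and checksum overhead dominate: every
# "compressed" result is longer than the original, so compressed length
# tells us nothing about which string is more complex.
```

This is precisely the regime where a coding-theorem-based measure such as Km can still discriminate.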
An algorithmic information-theoretic approach to the behaviour of financial markets
, 2010
"... ..."
The World is Either Algorithmic or Mostly Random
, 2011
Abstract

Cited by 1 (1 self)
I will propose the notion that the universe is digital, not as a claim about what the universe is made of but rather about the way it unfolds. Central to the argument will be the concepts of symmetry breaking and algorithmic probability, which will be used as tools to compare the way patterns are distributed in our world to the way patterns are distributed in a simulated digital one. These concepts will provide a framework for a discussion of the informational nature of reality. I will argue that if the universe were analog, then the world would likely be random, making it largely incomprehensible. The digital model has, however, an inherent beauty in its imposition of an upper limit and in the convergence in computational power to a maximal level of sophistication. Even if deterministic, that it is digital doesn’t mean that the world is trivial or predictable, but rather that it is built up from operations that at the lowest scale are very simple but that at a higher scale look complex and even random, though only in …
Do Mathematical and Social Factors Explain the Distribution of Numbers in the OEIS?
Abstract
©2013 by the authors. This work is licensed under a Creative Commons License. JHM is an open access biannual journal sponsored by the Claremont Center for the Mathematical …
Sloane’s Gap: Do Mathematical and Social Factors Explain the Distribution of Numbers in the OEIS?
Abstract
The Online Encyclopedia of Integer Sequences (OEIS) is a catalog of integer sequences. We are particularly interested in the number of occurrences N(n) of an integer n in the database. This number N(n) marks the importance of n, and it varies noticeably from one number to another and from one number to the next in a series. “Importance” can be mathematically objective (2^10 is an example of an “important” number in this sense) or the result of a shared mathematical culture (10^9 is more important than 9^10 because we use decimal notation). The concept of algorithmic complexity [6, 2, 7] (also known as Kolmogorov or Kolmogorov-Chaitin complexity) will be used to explain the curve shape as an “objective” measure. However, the observed curve does not conform to the curve predicted by an analysis based on algorithmic complexity, because of a clear gap separating the distribution into two clouds of points. We shall call this phenomenon “Sloane’s gap”.
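The quantity N(n) can be imitated on a toy corpus: generate a handful of classic integer sequences and count how often each small integer occurs across them. The sequences chosen below are my own stand-in for the OEIS database, not the paper's data, but even this tiny corpus shows some integers being far more "important" than their neighbours.

```python
from collections import Counter

LIMIT = 100  # count occurrences of the integers 1..LIMIT

def primes(limit):
    """Sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p*p :: p] = [False] * len(sieve[p*p :: p])
    return [i for i, ok in enumerate(sieve) if ok]

# A miniature stand-in corpus for the OEIS: a few familiar sequences.
corpus = [
    list(range(1, LIMIT + 1)),                  # natural numbers
    [n * n for n in range(1, 11)],              # squares
    [2 ** n for n in range(7)],                 # powers of two
    [n * (n + 1) // 2 for n in range(1, 14)],   # triangular numbers
    primes(LIMIT),
]

# N[n]: in how many corpus entries (with multiplicity) n occurs.
N = Counter(v for seq in corpus for v in seq if 1 <= v <= LIMIT)
print("most frequent small integers:", N.most_common(5))
```

On the real database, plotting log N(n) against n produces the two separated clouds of points that the paper names "Sloane's gap"; the toy corpus is far too small to show that, but it illustrates what is being counted.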