Results 1–9 of 9
"Binary Lambda Calculus and Combinatory Logic." Sep 14, 2004. http://homepages.cwi.nl/~tromp/cl/LC.pdf
Abstract
Cited by 14 (0 self)
In the first part, we introduce binary representations of both lambda calculus and combinatory logic terms, and demonstrate their simplicity by providing very compact parser-interpreters for these binary languages. Along the way we also present new results on list representations, bracket abstraction, and fixpoint combinators. In the second part we review Algorithmic Information Theory, for which these interpreters provide a convenient vehicle. We demonstrate this with several concrete upper bounds on program-size complexity, including an elegant self-delimiting code for binary strings.
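A "self-delimiting" code as mentioned in this abstract is prefix-free: a decoder knows where each codeword ends without any external separator. The sketch below shows one minimal such scheme (length in unary, then payload); it is an illustrative stand-in, not the specific code constructed in the paper.

```python
def encode(bits: str) -> str:
    # Prefix-free encoding: length n in unary (n ones, then a zero),
    # followed by the n payload bits. Illustrative scheme only; the
    # paper constructs a more compact code.
    return "1" * len(bits) + "0" + bits

def decode(stream: str) -> tuple[str, str]:
    # Read one self-delimited string; return (payload, remaining stream).
    n = stream.index("0")              # leading ones give the length
    return stream[n + 1 : 2 * n + 1], stream[2 * n + 1 :]
```

Because no codeword is a prefix of another, encoded strings can be concatenated and still decoded unambiguously.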
On the Algorithmic Nature of the World
Abstract
Cited by 14 (11 self)
We propose a test based on the theory of algorithmic complexity and an experimental evaluation of Levin's universal distribution to identify evidence in support of or in contravention of the claim that the world is algorithmic in nature. To this end we have undertaken a statistical comparison of the frequency distributions of data from physical sources on the one hand (repositories of information such as images, data stored in a hard drive, computer programs and DNA sequences) and, on the other, the frequency distributions of the output of computing devices such as Turing machines, cellular automata and Post tag systems. Statistical correlations were found and their significance measured.
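The comparison described above can be sketched with toy data: rank the same set of short strings by frequency in each source, then correlate the two rankings. The counts below are invented for illustration; only the method (rank correlation of frequency distributions) follows the abstract.

```python
from collections import Counter

# Invented toy counts standing in for the two sources the paper compares:
# frequencies of short bit strings in "physical" data vs. machine outputs.
physical = Counter({"00": 40, "01": 25, "10": 22, "11": 13})
machines = Counter({"00": 55, "10": 20, "01": 15, "11": 10})

keys = sorted(set(physical) | set(machines))

def ranks(counter):
    # Rank each string by descending frequency (0 = most frequent).
    ordered = sorted(keys, key=lambda k: -counter[k])
    return {k: i for i, k in enumerate(ordered)}

r1, r2 = ranks(physical), ranks(machines)

# Spearman rank correlation between the two frequency orderings
# (no ties in this toy data, so the closed form applies).
n = len(keys)
d2 = sum((r1[k] - r2[k]) ** 2 for k in keys)
rho = 1 - 6 * d2 / (n * (n * n - 1))
```

A rho near 1 would indicate the two sources order strings by frequency in nearly the same way, which is the kind of evidence the test looks for.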
Numerical Evaluation of Algorithmic Complexity for Short Strings: A Glance into the Innermost Structure of Randomness
, 2011
Abstract
Cited by 8 (4 self)
We describe a method that combines several theoretical and experimental results to numerically approximate the algorithmic (Kolmogorov-Chaitin) complexity of all ∑_{n=1}^{8} 2^n bit strings of up to 8 bits, and some bit strings between 9 and 16 bits. This is done by an exhaustive execution of all deterministic 2-symbol Turing machines with up to 4 states for which the halting times are known thanks to the busy beaver problem, that is, 11,019,960,576 machines. An output frequency distribution is then computed, from which the algorithmic probability is calculated and the algorithmic complexity evaluated by way of the (Levin-Chaitin) coding theorem.
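The final step described here, turning an output frequency distribution into complexity estimates via the coding theorem K(s) ≈ -log2 m(s), can be sketched with made-up counts; the real experiment derives the distribution from the 11,019,960,576 machine runs.

```python
import math
from collections import Counter

# Invented output counts standing in for the distribution obtained from
# the exhaustive Turing machine enumeration; only the method is real.
freq = Counter({"0": 500, "1": 500, "01": 120, "10": 120, "0101": 2})
total = sum(freq.values())

# Coding theorem: approximate K(s) by -log2 of the output frequency m(s).
K = {s: -math.log2(count / total) for s, count in freq.items()}
```

Strings produced by many machines get low estimated complexity; rare outputs get high estimates, matching the intuition that random-looking strings are hard to produce.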
Contemporary Approaches to Artificial General Intelligence
 In Artificial General Intelligence, edited by Ben Goertzel and Cassio Pennachin
, 2007
An algorithmic information-theoretic approach to the behaviour of financial markets
, 2010
Kolmogorov Complexity in Combinatory Logic
Abstract
Intuitively, the amount of information in a string is the size of the shortest program that outputs the string. The first billion digits of π, for example, contain very little information, since they can be calculated by a C program of only a few lines. Although information content seems to be highly dependent on the choice of programming language, the notion is actually invariant up to an additive constant.
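This intuition can be made concrete with a computable stand-in: the compressed size of a string (here via zlib) upper-bounds its description length, up to the fixed cost of the decompressor, and a regular string compresses far better than random bytes.

```python
import os
import zlib

periodic = b"01" * 500_000      # highly regular: a short program prints it
noise = os.urandom(1_000_000)   # random bytes: essentially incompressible

# Compressed size is a computable upper bound on description length,
# up to the additive constant of the fixed decompressor.
short = len(zlib.compress(periodic, 9))
long_ = len(zlib.compress(noise, 9))
```

True Kolmogorov complexity is uncomputable; compressors like zlib only ever give upper bounds, but the gap between the two sizes illustrates the idea.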
Numerical evaluation of algorithmic complexity for short strings: A glance into the innermost structure of randomness
(Journal version; journal homepage: www.elsevier.com/locate/amc)