Results 11–20 of 25
Information theory, evolutionary computation, and Dembski’s “complex specified information”, Synthese 128(2): 237–270
, 2011
Abstract

Cited by 5 (0 self)
Intelligent design advocate William Dembski has introduced a measure of information called “complex specified information”, or CSI. He claims that CSI is a reliable marker of design by intelligent agents. He puts forth a “Law of Conservation of Information” which states that chance and natural laws are incapable of generating CSI. In particular, CSI cannot be generated by evolutionary computation. Dembski asserts that CSI is present in intelligent causes and in the flagellum of Escherichia coli, and concludes that neither has a natural explanation. In this paper we examine Dembski’s claims, point out significant errors in his reasoning, and conclude that there is no reason to accept his assertions.
Kolmogorov Complexity, Circuits, and the Strength of Formal Theories of Arithmetic
Abstract

Cited by 2 (2 self)
Can complexity classes be characterized in terms of efficient reducibility to the (undecidable) set of Kolmogorov-random strings? Although this might seem improbable, a series of papers has recently provided evidence that this may be the case. In particular, it is known that there is a class of problems C defined in terms of polynomial-time truth-table reducibility to R_K (the set of Kolmogorov-random strings) that lies between BPP and PSPACE [4, 3]. In this paper, we investigate improving this upper bound from PSPACE to PSPACE ∩ P/poly. More precisely, we present a collection of true statements in the language of arithmetic (each provable in ZF) and show that if these statements can be proved in certain extensions of Peano arithmetic, then BPP ⊆ C ⊆ PSPACE ∩ P/poly. We conjecture that C is equal to P, and discuss the possibility that this might be an avenue for trying to prove the equality of BPP and P.
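The containments this abstract describes can be written compactly. The display below is a restatement of the abstract's own description (with truth-table reducibility abbreviated as ≤_tt^p), not a formula quoted from the paper:

```latex
% C is the class of problems polynomial-time truth-table reducible to the
% set R_K of Kolmogorov-random strings; the known bounds, and the improved
% upper bound investigated in the paper, are:
C = \{\, A : A \le^{p}_{tt} R_K \,\}, \qquad
\mathrm{BPP} \subseteq C \subseteq \mathrm{PSPACE},
\qquad\text{investigated:}\quad
C \subseteq \mathrm{PSPACE} \cap \mathrm{P/poly}.
```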
Lisp Program-Size Complexity II
, 1992
Abstract

Cited by 1 (1 self)
We present the information-theoretic incompleteness theorems that arise in a theory of program-size complexity based on something close to real LISP. The complexity of a formal axiomatic system is defined to be the minimum size in characters of a LISP definition of the proof-checking function associated with the formal system. Using this concrete and easy-to-understand definition, we show (a) that it is difficult to exhibit complex S-expressions, and (b) that it is difficult to determine the bits of the LISP halting probability Ω_LISP. We also construct improved versions Ω′_LISP and Ω″_LISP of the LISP halting probability that asymptotically have maximum possible LISP complexity. Copyright © 1992, Elsevier Science Publishing Co., Inc., reprinted by permission. 1. Introduction: The main incompleteness theorems of my Algorithmic Information Theory monograph [1] are reformulated and proved here using a concrete and easy-to-understand definition ...
A SYNTHESIS AND A PRACTICAL APPROACH TO COMPLEX SYSTEMS
Abstract

Cited by 1 (1 self)
This document is both a synthesis of current notions about complex systems and a description of a practical approach. A disambiguation is proposed, exposing possible reasons for the controversies related to causation and emergence. Theoretical considerations about simulations are presented. A justification is then given for the development of practical tools and techniques for the investigation of complex systems. A methodology for the use of these tools is finally suggested, illustrated by application examples.
A Note On Monte Carlo Primality Tests And Algorithmic Information Theory
, 1978
Abstract
Solovay and Strassen, and Miller and Rabin, have discovered fast algorithms for testing primality which use coin-flipping and whose conclusions are only probably correct. On the other hand, algorithmic information theory provides a precise mathematical definition of the notion of a random or patternless sequence. In this paper we shall describe conditions under which, if the sequence of coin tosses in the Solovay–Strassen and Miller–Rabin algorithms is replaced by a sequence of heads and tails that is of maximal algorithmic information content, i.e., has maximal algorithmic randomness, then one obtains an error-free test for primality. These results are only of theoretical interest, since it is a manifestation of the Gödel incompleteness phenomenon that it is impossible to "certify" a sequence to be random by means of a proof, even though most sequences have this property. Thus by using certified random sequences one can in principle, but not in practice, convert proba...
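For context, here is a minimal Python sketch of the Miller–Rabin test the abstract discusses. The function name and round count are illustrative choices, not the paper's; the random witnesses drawn inside the loop are exactly the coin flips that the paper proposes replacing with an algorithmically random sequence:

```python
import random

def miller_rabin(n, rounds=20, rng=random):
    """Probabilistic primality test: returns False if n is definitely
    composite, True if n is probably prime (error prob. <= 4**-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7):          # dispose of small cases directly
        if n % p == 0:
            return n == p
    # write n - 1 = 2**s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)  # random witness: the "coin flips"
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False             # a proves n composite
    return True
```

A composite answer is always certain; only the "probably prime" answer depends on the quality of the random witnesses, which is why the choice of coin-toss sequence matters.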
References
, 2008
Abstract
Chaitin’s “heuristic principle” (the theorems of a finitely-specified theory cannot be significantly more complex than the theory itself) was proved for an appropriate measure of complexity in [1]. The measure δ is a computable variation of the program-size complexity H: δ(x) = H(x) − |x|. The theorems of a finitely-specified, sound, consistent theory which is strong enough to include arithmetic have bounded δ-complexity; hence every sentence of the theory which is significantly more complex than the theory is unprovable. More precisely, according to Theorem 4.6 in [1], for any finitely-specified, sound, consistent theory strong enough to formalize arithmetic (like Zermelo–Fraenkel set theory with choice, or Peano Arithmetic) and for any Gödel numbering g of its well-formed formulae, we can compute a bound N such that no sentence x with complexity δ_g(x) > N can be proved in the theory; this phenomenon is independent of the choice of the Gödel numbering. Question 1. Find other natural measures of complexity for which Chaitin’s “heuristic principle” holds true.
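Since H is uncomputable, δ cannot be computed exactly, but any general-purpose compressor gives an upper bound on H and hence a rough proxy for δ. The sketch below is purely illustrative and is not from the paper; the function name and the use of zlib are assumptions:

```python
import zlib

def delta_approx(x: bytes) -> int:
    """Toy proxy for delta(x) = H(x) - |x|: program-size complexity H is
    uncomputable, so substitute the compressed size as an upper bound.
    Highly regular strings score far below zero; incompressible
    (random-looking) strings score near or slightly above zero."""
    return len(zlib.compress(x)) - len(x)
```

On this proxy, a highly repetitive kilobyte such as `b"ab" * 500` scores strongly negative, while a kilobyte of random bytes comes out slightly positive due to compressor overhead, loosely mirroring how δ separates simple from complex strings.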
Aesthetic complexity
, 2008
Abstract
Aesthetics, among other criteria, can be statistically examined in terms of the complexity required for creating and decrypting a work of art. We propose three laws of aesthetic complexity. According to the first law of aesthetic complexity, an overly condensed encoding makes decryption of a work of art impossible and is perceived as chaotic by the untrained mind, whereas overly regular structures are perceived as monotonous, too orderly, and not very stimulating. Thus a necessary condition for an artistic form or design to appear appealing is that its complexity lie within a bracket between monotony and chaos. According to the second law of aesthetic complexity, due to human predisposition, this bracket is invariably based on natural forms, with rather limited plasticity. The third law of aesthetic complexity states that aesthetic complexity trends are dominated by the available resources, and thus also by cost and scarcity.
École Normale Supérieure de Lyon, France Fundamental Computer Science Master, First Year Acceptable Complexity Measures of Theorems
, 2008
Abstract
In 1930, Gödel [7] presented in Königsberg his famous Incompleteness Theorem, stating that some true mathematical statements are unprovable. Yet, this result gives us no idea about those independent (that is, true and unprovable) statements: their frequency, the reason they are unprovable, and so on. Calude and Jürgensen [4] proved in 2005 Chaitin's heuristic principle for an appropriate measure: the theorems of a finitely-specified theory cannot be significantly more complex than the theory itself (see [5]). In this work, we investigate the existence of other measures, different from the original one, which satisfy this heuristic principle. To this end, we introduce the definition of an acceptable complexity measure of theorems.
Computational Information Gain and Inference
Abstract
A definition of computational information gain is presented based on Levin descriptional complexity. The measure is applicable to different inference processes, either deductive or inductive, and evaluates the relative value of new inference results.
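For reference, Levin's descriptional complexity (often written Kt), on which the abstract says the measure is based, is standardly defined as follows; this is the textbook definition, not a formula taken from the paper:

```latex
% Levin complexity: program length plus the logarithm of running time,
% minimized over all programs p that make a universal machine U output x.
Kt(x) = \min_{p}\ \bigl\{\, |p| + \log t \;:\; U(p) \text{ outputs } x
\text{ within } t \text{ steps} \,\bigr\}
```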
Decisional states
, 2009
Abstract
The computational-mechanics definition of a process's internal states considers optimal prediction as the criterion for clustering: the process states consist of the histories that lead to the same distribution of future events. Knowledge of the current cluster for a given past, and its associated distribution of possible futures, is the minimal information necessary for making optimal predictions. However, when given a practical problem for which a unique prediction is sought, this view offers no distinction between futures with rather different consequences, incurred costs, or expected utility. When each future is weighted by its would-be effect, then decision theory must be used. This article explores the consequences of grouping histories into states with equivalent optimal decisions in terms of utility, rather than equivalent full distributions of futures. The transitions between these decisional states correspond to events that lead to a change of decision. The utility function encodes the a priori knowledge of a system; the decisional states represent the underlying structure matching both the intrinsic system causal states and the external information. An algorithm is provided to estimate the states and their transitions from data. Application examples are given for discrete-process hidden-state reconstruction, cellular automata filtering, and edge detection in images.
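The contrast between clustering by full future distributions and clustering by optimal decisions can be sketched in a few lines of Python. Everything here is an illustrative assumption rather than the paper's algorithm: the function names, the dictionary encoding of histories and futures, and the simple "pick the future maximizing probability times utility" decision rule:

```python
from collections import defaultdict

def causal_states(history_futures):
    """Group histories that induce the same conditional distribution over
    futures (the computational-mechanics clustering criterion).
    history_futures maps history -> {future: probability}."""
    groups = defaultdict(list)
    for h, dist in history_futures.items():
        groups[tuple(sorted(dist.items()))].append(h)
    return list(groups.values())

def decisional_states(history_futures, utility):
    """Group histories by the optimal decision they induce under a utility
    function, rather than by the full future distribution; histories with
    different futures but the same best decision merge into one state."""
    groups = defaultdict(list)
    for h, dist in history_futures.items():
        decision = max(dist, key=lambda f: dist[f] * utility[f])
        groups[decision].append(h)
    return list(groups.values())
```

Under this toy rule, decisional states are never finer than causal states: histories with identical future distributions always share a decision, but the converse can fail, so the decisional partition can be a strictly coarser, decision-relevant summary of the process.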