Results 1–10 of 10
Hierarchies Of Generalized Kolmogorov Complexities And Nonenumerable Universal Measures Computable In The Limit
 International Journal of Foundations of Computer Science
, 2000
Abstract

Cited by 38 (20 self)
The traditional theory of Kolmogorov complexity and algorithmic probability focuses on monotone Turing machines with one-way write-only output tape. This naturally leads to the universal enumerable Solomonoff-Levin measure. Here we introduce more general, non-enumerable but cumulatively enumerable measures (CEMs) derived from Turing machines with lexicographically non-decreasing output and random input, and even more general approximable measures and distributions computable in the limit. We obtain a natural hierarchy of generalizations of algorithmic probability and Kolmogorov complexity, suggesting that the "true" information content of some (possibly infinite) bitstring x is the size of the shortest non-halting program that converges to x and nothing but x on a Turing machine that can edit its previous outputs. Among other things we show that there are objects computable in the limit yet more random than Chaitin's "number of wisdom" Omega, that any approximable measure of x is small for any x lacking a short description, that there is no universal approximable distribution, that there is a universal CEM, and that any non-enumerable CEM of x is small for any x lacking a short enumerating program. We briefly mention consequences for universes sampled from such priors.
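As a toy illustration of the key mechanism in this abstract (a machine that may edit its previous outputs and converge to x in the limit rather than halt), consider the following sketch; the update rule and all names here are invented for illustration and are not from the paper:

```python
def editing_machine_outputs(n_steps):
    """Yield snapshots of an editable output tape after each step.

    Toy rule (invented for illustration): at step t the machine writes
    bit t % 2 at cell t // 3, so each cell is overwritten twice before
    settling. The tape converges in the limit to 010101...
    """
    tape = {}
    for t in range(n_steps):
        tape[t // 3] = t % 2  # later steps may revise earlier cells
        yield "".join(str(tape[i]) for i in sorted(tape))


# A monotone machine with a one-way write-only tape could never retract
# the '1' written at step 1; this editing machine revises it at step 2.
snaps = list(editing_machine_outputs(30))
print(snaps[0], snaps[1], snaps[2])   # 0 1 0 -- cell 0 is revised twice
print(snaps[-1])                      # 0101010101 -- a prefix of the limit
```

The point of the hierarchy is precisely that such limit-convergent, non-halting computations describe more strings (and measures) than halting or monotone ones.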
Algorithmic Theories Of Everything
, 2000
Abstract

Cited by 32 (15 self)
The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lacking a short description, and study the spectrum of TOEs spanned by two Ps, one reflecting the most compact constructive descriptions, the other the fastest way of computing everything. The former derives from generalizations of traditional computability, Solomonoff’s algorithmic probability, Kolmogorov complexity, and objects more random than Chaitin’s Omega, the latter from Levin’s universal search and a natural resource-oriented postulate: the cumulative prior probability of all x incomputable within time t by this optimal algorithm should be 1/t. Between both Ps we find a universal cumulatively enumerable measure that dominates traditional enumerable measures; any such CEM must assign low probability to any universe lacking a short enumerating program. We derive P-specific consequences for evolving observers, inductive reasoning, quantum physics, philosophy, and the expected duration of our universe.
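One way to write the resource-oriented postulate stated in this abstract (the notation is mine, not the paper's):

```latex
\sum_{x \,:\, \mathrm{time}(x) > t} P(x) \;=\; \frac{1}{t},
```

where time(x) denotes the runtime the optimal algorithm needs before x appears among its outputs. Equivalently, the cumulative prior mass of all universes computable within time t is 1 - 1/t, so slow-to-compute universes are exponentially penalized relative to their required runtime.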
The New AI: General & Sound & Relevant for Physics
, 2003
Abstract

Cited by 15 (9 self)
Most traditional artificial intelligence (AI) systems of the past 50 years are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, inductive inference based on Occam's razor, problem solving, decision making, and reinforcement learning in environments of a very general type. Since inductive inference is at the heart of all inductive sciences, some of the results are relevant not only for AI and computer science but also for physics, provoking non-traditional predictions based on Zuse's thesis of the computer-generated universe.
Proof Nets for Intuitionistic Logic
 Saarbrücken, Germany
, 2006
Abstract

Cited by 1 (0 self)
Until the beginning of the 20th century, there was no way to reason formally about proofs. In particular, the question of proof equivalence had never been explored. When Hilbert asked in 1920 for an answer to this very question in his famous program, people started looking for proof formalizations.
Natural deduction and sequent calculi, which were invented by Gentzen in 1935, quickly became two of the main tools for the study of proofs. Gentzen’s Hauptsatz on normal forms for his sequent calculi, and later on Prawitz’ analog theorem for natural deduction, put forth a first notion of equivalent proofs in intuitionistic and classical logic.
However, natural deduction only works well for intuitionistic logic. This is why Girard invented proof nets in 1986 as an analog to natural deduction for (the multiplicative fragment of) linear logic. Their universal structure made proof nets also interesting for other logics. Proof nets have the great advantage that they eliminate most of the bureaucracy involved in deductive systems and so are probably closer to the essence of a proof. There has recently been an increasing interest in the development of proof nets for various kinds of logics. In 2005 for example, Lamarche and Straßburger were able to express sequent proofs in classical logic as proof nets.
In this thesis, I will, starting from proof nets for classical logic, turn the focus back on intuitionistic logic and propose proof nets that are suited to serve as an extension of natural deduction. I will examine these nets and characterize those corresponding to natural deduction proofs. Additionally, I provide a cut elimination procedure for the new proof nets and prove termination and confluence for this reduction system, thus effectively obtaining a new notion of the equivalence of intuitionistic proofs.
Comments on Constructive Semantics for Natural Language
, 1996
Abstract
In recent years, an important philosophical discussion has arisen [Dum75, Dum76, Dum91, Pra80] regarding the form that a Theory of Meaning should take. The dispute has its roots in the very principles on which the two important schools of formal philosophy, the Realistic (Classical) and the Anti-realistic one, are founded. This paper provides a review of the current research in natural language semantics. The main problems confronted by researchers in the field are stated, and it is shown that all well-known formal approaches to natural language semantics (Frege, Montague, Kamp, Heim, Turner's Property Theory, etc.) have so far been developed along the lines of the realistic or classical school; only in the last few years have developments founded on the anti-realistic approach begun. It is the purpose of this paper to review the state of the art of this new approach and to explore its possibilities for the future.
Epistemic truth and excluded middle
Abstract
Abstract: Can an epistemic conception of truth and an endorsement of the excluded middle (together with other principles of classical logic abandoned by the intuitionists) cohabit in a plausible philosophical view? In Part I, I describe the general problem concerning the relation between the epistemic conception of truth and the principle of excluded middle. In Part II, I give a historical overview of different attitudes regarding the problem. In Part III, I sketch a possible holistic solution. The epistemic conception of truth can be formulated in many ways, but the basic idea is that truth is explained in terms of epistemic notions, like experience, argument, proof, knowledge, etc. One way of formulating this idea is by saying that truth and knowability coincide, i.e. for every statement S
The irreflexivity of Brouwer’s philosophy
, 2000
Abstract
I argue that Brouwer’s general philosophy cannot account for itself and, a fortiori, cannot lend justification to mathematical principles derived from it. Thus it cannot ground intuitionism, the job Brouwer had intended it to do. The strategy is to ask whether that philosophy actually allows for the kind of knowledge that such an account of itself would amount to. Brouwer tried to go ‘from philosophy to mathematics’ and grounded his intuitionistic mathematics in a more general philosophy. This background philosophy can be characterized as a transcendental one. That is, it purports to explain how a non-mundane subject builds up its world in consciousness. It is a radical transcendental philosophy in that this ‘world’ does not contain just physical objects but everything, including abstract objects and the mundane subject (the subject as part of the world). From the empirical point of view, such a non-mundane subject is an idealized one. Like fellow transcendentalists
The New AI is General & Mathematically Rigorous
Abstract
Summary. Most traditional artificial intelligence (AI) systems of the past decades are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, inductive inference based on Occam’s razor, problem solving, decision making, and reinforcement learning in environments of a very general type. Since inductive inference is at the heart of all inductive sciences, some of the results are relevant not only for AI and computer science but also for physics, provoking non-traditional predictions based on Zuse’s thesis of the computer-generated universe. We first briefly review the history of AI since Gödel’s 1931 paper, then discuss recent post-2000 approaches that are currently transforming general AI research into a formal science.
The Fastest Way of Computing All Universes
Abstract
Is there a short and fast program that can compute the precise history of our universe, including all seemingly random but possibly actually deterministic and pseudo-random quantum fluctuations? There is no physical evidence against this possibility. So let us start searching! We already know a short program that computes all constructively computable universes in parallel, each in the asymptotically fastest way. Assuming ours is computed by this optimal method, we can predict that it is among the fastest compatible with our existence. This yields testable predictions. Note: This paper extends an overview of previous work [51–54, 58, 59] presented in a survey for the German edition of Scientific American [61].
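The "asymptotically fastest" parallel method this abstract alludes to is in the spirit of Levin-style search: in phase i, every candidate program p of length l(p) ≤ i receives a budget of 2^(i − l(p)) steps, so shorter programs run earlier and faster. A minimal sketch of that allocation schedule follows; the programs themselves are left abstract, and everything here is my own toy rendering, not the paper's algorithm:

```python
def phase_budgets(program_lengths, max_phase):
    """Return {phase: [(program_index, steps), ...]} for a Levin-style schedule.

    In phase i, program p with length l(p) <= i runs for 2**(i - l(p)) steps;
    for a prefix-free program set, total work per phase is at most 2**i.
    """
    schedule = {}
    for i in range(1, max_phase + 1):
        schedule[i] = [
            (idx, 2 ** (i - length))
            for idx, length in enumerate(program_lengths)
            if length <= i
        ]
    return schedule


# Three hypothetical programs of lengths 1, 2, 3 bits:
sched = phase_budgets([1, 2, 3], max_phase=3)
print(sched[3])  # [(0, 4), (1, 2), (2, 1)] -- shorter programs get more steps
```

Because each program's budget doubles per phase, any output the fastest suitable program produces in time T is found after total work only a constant factor (depending on the program's length) worse than T, which is the sense in which the method is asymptotically optimal.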
Semiotics of identity management
 PrimaVera Working Paper 2006-02, PrimaVera Working Paper Series
, 2006
Abstract
Copyright ©2006 by the Universiteit van Amsterdam. All rights reserved. No part of this article may be reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying, recording, or by any information storage and retrieval system, without permission in writing from the authors.