Results 1-10 of 19
Hierarchies Of Generalized Kolmogorov Complexities And Nonenumerable Universal Measures Computable In The Limit
 INTERNATIONAL JOURNAL OF FOUNDATIONS OF COMPUTER SCIENCE
, 2000
"... The traditional theory of Kolmogorov complexity and algorithmic probability focuses on monotone Turing machines with oneway writeonly output tape. This naturally leads to the universal enumerable SolomonoLevin measure. Here we introduce more general, nonenumerable but cumulatively enumerable m ..."
Abstract

Cited by 40 (21 self)
 Add to MetaCart
(Show Context)
The traditional theory of Kolmogorov complexity and algorithmic probability focuses on monotone Turing machines with one-way write-only output tape. This naturally leads to the universal enumerable Solomonoff-Levin measure. Here we introduce more general, nonenumerable but cumulatively enumerable measures (CEMs) derived from Turing machines with lexicographically nondecreasing output and random input, and even more general approximable measures and distributions computable in the limit. We obtain a natural hierarchy of generalizations of algorithmic probability and Kolmogorov complexity, suggesting that the "true" information content of some (possibly infinite) bitstring x is the size of the shortest nonhalting program that converges to x and nothing but x on a Turing machine that can edit its previous outputs. Among other things we show that there are objects computable in the limit yet more random than Chaitin's "number of wisdom" Omega, that any approximable measure of x is small for any x lacking a short description, that there is no universal approximable distribution, that there is a universal CEM, and that any nonenumerable CEM of x is small for any x lacking a short enumerating program. We briefly mention consequences for universes sampled from such priors.
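The abstract's central notion, the size of the shortest program producing a string, can be illustrated with a toy sketch. This is not the paper's construction (which uses Turing machines that may edit earlier outputs); it is a severely restricted, hypothetical "machine" in which a program is just a seed string that gets repeated, so the shortest reproducing seed plays the role of a description length:

```python
# Toy illustration only: "description length" of a bit string x
# relative to a hypothetical machine that repeats a seed p until
# the output reaches length |x|. The shortest such p stands in for
# a (heavily restricted) Kolmogorov-style complexity; the real
# quantity over all programs is uncomputable.

def toy_complexity(x: str) -> int:
    """Length of the shortest prefix seed whose repetition yields x."""
    for n in range(1, len(x) + 1):
        p = x[:n]
        if (p * len(x))[:len(x)] == x:
            return n
    return len(x)

print(toy_complexity("01010101"))  # -> 2: seed "01" suffices
print(toy_complexity("00000001"))  # -> 8: incompressible for this machine
```

The contrast between the two calls mirrors the abstract's theme: strings with short descriptions get high algorithmic probability, while strings lacking one do not.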
Algorithmic Theories Of Everything
, 2000
"... The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lac ..."
Abstract

Cited by 32 (15 self)
 Add to MetaCart
(Show Context)
The probability distribution P from which the history of our universe is sampled represents a theory of everything or TOE. We assume P is formally describable. Since most (uncountably many) distributions are not, this imposes a strong inductive bias. We show that P(x) is small for any universe x lacking a short description, and study the spectrum of TOEs spanned by two Ps, one reflecting the most compact constructive descriptions, the other the fastest way of computing everything. The former derives from generalizations of traditional computability, Solomonoff’s algorithmic probability, Kolmogorov complexity, and objects more random than Chaitin’s Omega, the latter from Levin’s universal search and a natural resource-oriented postulate: the cumulative prior probability of all x incomputable within time t by this optimal algorithm should be 1/t. Between both Ps we find a universal cumulatively enumerable measure that dominates traditional enumerable measures; any such CEM must assign low probability to any universe lacking a short enumerating program. We derive P-specific consequences for evolving observers, inductive reasoning, quantum physics, philosophy, and the expected duration of our universe.
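The "Levin's universal search" the abstract refers to rests on a standard time-allocation scheme that is easy to sketch (this is the textbook construction, not the paper's specific prior): in phase k, every candidate program p of length at most k is run for 2^(k-|p|) steps, so each program's total share of compute is proportional to 2^(-|p|), matching its prior weight.

```python
# Hedged sketch of Levin-style phase budgets (standard construction,
# not code from the cited paper): in phase k, a program of length L
# receives 2**(k - L) execution steps, so shorter programs get
# exponentially more time, mirroring the 2**(-L) prior.

def phase_budgets(k: int) -> dict[int, int]:
    """Steps allotted in phase k to each program length 1..k."""
    return {length: 2 ** (k - length) for length in range(1, k + 1)}

for k in (1, 2, 3):
    print(k, phase_budgets(k))  # e.g. phase 3 -> {1: 4, 2: 2, 3: 1}
```

Summing over phases, a program of length L that halts within t steps is found after total time on the order of t * 2**L, which is the source of the optimality (up to a constant factor) that the abstract's resource-oriented postulate builds on.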
The New AI: General & Sound & Relevant for Physics
 ARTIFICIAL GENERAL INTELLIGENCE (ACCEPTED 2002)
, 2003
"... Most traditional artificial intelligence (AI) systems of the past 50 years are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, induct ..."
Abstract

Cited by 16 (9 self)
 Add to MetaCart
Most traditional artificial intelligence (AI) systems of the past 50 years are either very limited, or based on heuristics, or both. The new millennium, however, has brought substantial progress in the field of theoretically optimal and practically feasible algorithms for prediction, search, inductive inference based on Occam’s razor, problem solving, decision making, and reinforcement learning in environments of a very general type. Since inductive inference is at the heart of all inductive sciences, some of the results are relevant not only for AI and computer science but also for physics, provoking nontraditional predictions based on Zuse’s thesis of the computer-generated universe.
From a Brouwerian point of view
 Philosophia Mathematica
, 1998
"... In the paper below we will discuss a number of topics that are central in Brouwer’s intuitionism. A complete treatment is beyond the scope of the paper, the reader may find it a useful introduction to Brouwer’s papers. There are a number of loosely related notions and schools of constructivism in m ..."
Abstract

Cited by 2 (1 self)
 Add to MetaCart
In the paper below we will discuss a number of topics that are central in Brouwer’s intuitionism. A complete treatment is beyond the scope of the paper; the reader may nevertheless find it a useful introduction to Brouwer’s papers. There are a number of loosely related notions and schools of constructivism in mathematics; in some cases there is only an attempt to capture certain constructive notions and procedures in the existing body of mathematics, in other, more fundamental, cases the object is to reconstruct mathematics as a whole within the framework of a constructivist philosophy. It is almost a platitude to state that constructivism has always been around in mathematics, and indeed, an eighteenth-century mathematician, say, would have accepted the constructivist claims of his twentieth-century colleague of the mild variety, i.e. the practical, non-dogmatic practitioner, as self-evident and rather commonplace. The issue of constructivism in mathematics only became urgent after the discovery of abstract, noneffective techniques and notions. The watershed is David Hilbert’s famous solution
Proof Nets for Intuitionistic Logic
 SAARBRÜCKEN, GERMANY
, 2006
"... Until the beginning of the 20th century, there was no way to reason formally about proofs. In particular, the question of proof equivalence had never been explored. When Hilbert asked in 1920 for an answer to this very question in his famous program, people started looking for proof formalizations.
..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
Until the beginning of the 20th century, there was no way to reason formally about proofs. In particular, the question of proof equivalence had never been explored. When Hilbert asked in 1920 for an answer to this very question in his famous program, people started looking for proof formalizations.
Natural deduction and sequent calculi, which were invented by Gentzen in 1935, quickly became two of the main tools for the study of proofs. Gentzen’s Hauptsatz on normal forms for his sequent calculi, and later on Prawitz’ analog theorem for natural deduction, put forth a first notion of equivalent proofs in intuitionistic and classical logic.
However, natural deduction only works well for intuitionistic logic. This is why Girard invented proof nets in 1986 as an analog to natural deduction for (the multiplicative fragment of) linear logic. Their universal structure made proof nets also interesting for other logics. Proof nets have the great advantage that they eliminate most of the bureaucracy involved in deductive systems and so are probably closer to the essence of a proof. There has recently been an increasing interest in the development of proof nets for various kinds of logics. In 2005 for example, Lamarche and Straßburger were able to express sequent proofs in classical logic as proof nets.
In this thesis, I will, starting from proof nets for classical logic, turn the focus back to intuitionistic logic and propose proof nets suited to serve as an extension of natural deduction. I will examine these nets and characterize those corresponding to natural deduction proofs. Additionally, I provide a cut elimination procedure for the new proof nets and prove termination and confluence for this reduction system, thus effectively obtaining a new notion of equivalence for intuitionistic proofs.
Kolmogorov and Brouwer on constructive implication and the Ex Falso rule
 Russian Math Surveys
"... Kolmogorov put his stamp on many subjects in mathematics, he was in every sense an example of the universal mathematician. Among the long list of topics, logic figures prominently. Kolmogorov contributed to a new subject, that at his time did not exist: intuitionistic logic. His first paper on ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
Kolmogorov put his stamp on many subjects in mathematics; he was in every sense an example of the universal mathematician. Among the long list of topics, logic figures prominently. Kolmogorov contributed to a subject that did not yet exist at his time: intuitionistic logic. His first paper on
PrimaVera Working Paper Series, Working Paper 2006-02: Semiotics of identity management
, 2006
"... Copyright ©2006 by the Universiteit van Amsterdam All rights reserved. No part of this article may be reproduced or utilized in any form or by any means, electronic of mechanical, including photocopying, recording, or by any information storage and retrieval system, without permission in writing fro ..."
Abstract
 Add to MetaCart
(Show Context)
Copyright ©2006 by the Universiteit van Amsterdam. All rights reserved. No part of this article may be reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying, recording, or by any information storage and retrieval system, without permission in writing from the authors.
The irreflexivity of Brouwer’s philosophy
, 2000
"... I argue that Brouwer’s general philosophy cannot account for itself, and, a fortiori, cannot lend justification to mathematical principles derived from it. Thus it cannot ground intuitionism, the job Brouwer had intended it to do. The strategy is to ask whether that philosophy actually allows for th ..."
Abstract
 Add to MetaCart
(Show Context)
I argue that Brouwer’s general philosophy cannot account for itself, and, a fortiori, cannot lend justification to mathematical principles derived from it. Thus it cannot ground intuitionism, the job Brouwer had intended it to do. The strategy is to ask whether that philosophy actually allows for the kind of knowledge that such an account of itself would amount to. Brouwer tried to go ‘from philosophy to mathematics’ and grounded his intuitionistic mathematics in a more general philosophy. This background philosophy can be characterized as a transcendental one. That is, it purports to explain how a nonmundane subject builds up its world in consciousness. It is a radical transcendental philosophy in that this ‘world’ does not contain just physical objects but everything, including abstract objects and the mundane subject (the subject as part of the world). From the empirical point of view, such a nonmundane subject is an idealized one. Like fellow transcendentalists
Epistemic truth and excluded middle
"... Abstract: Can an epistemic conception of truth and an endorsement of the excluded middle (together with other principles of classical logic abandoned by the intuitionists) cohabit in a plausible philosophical view? In PART I I describe the general problem concerning the relation between the epistemi ..."
Abstract
 Add to MetaCart
Abstract: Can an epistemic conception of truth and an endorsement of the excluded middle (together with other principles of classical logic abandoned by the intuitionists) cohabit in a plausible philosophical view? In Part I, I describe the general problem concerning the relation between the epistemic conception of truth and the principle of excluded middle. In Part II, I give a historical overview of different attitudes regarding the problem. In Part III, I sketch a possible holistic solution. Part I: The Problem. §1. The epistemic conception of truth. The epistemic conception of truth can be formulated in many ways. But the basic idea is that truth is explained in terms of epistemic notions, like experience, argument, proof, knowledge, etc. One way of formulating this idea is by saying that truth and knowability coincide, i.e. for every statement S
De Bruijn's Automath and Pure Type Systems
"... We study the position of the Automath systems within the framework of Pure Type Systems (PTSs). In [2, 22], a rough relationship has been given between Automath and PTSs. That relationship ignores three of the most important features of Automath: definitions, parameters and reduction, because at th ..."
Abstract
 Add to MetaCart
(Show Context)
We study the position of the Automath systems within the framework of Pure Type Systems (PTSs). In [2, 22], a rough relationship between Automath and PTSs has been given. That relationship ignores three of the most important features of Automath: definitions, parameters and reduction, because at the time, formulations of PTSs did not have these features. Since then, PTSs have been extended with these features, and in view of this we revisit the correspondence between Automath and PTSs. This paper gives the most accurate description of Automath as a PTS so far.