Results 1–10 of 10
Information-Theoretic Limitations of Formal Systems
 JOURNAL OF THE ACM
, 1974
Abstract

Cited by 45 (7 self)
An attempt is made to apply information-theoretic computational complexity to metamathematics. The paper studies the number of bits of instructions that must be given to a computer for it to perform finite and infinite tasks, and also the amount of time that it takes the computer to perform these tasks. This is applied to measuring the difficulty of proving a given set of theorems, in terms of the number of bits of axioms that are assumed, and the size of the proofs needed to deduce the theorems from the axioms.
Databases and Higher Types
 Computational Logic—CL 2000
, 2000
Abstract

Cited by 3 (2 self)
Generalized databases will be examined, in which attributes can be sets of attributes, or sets of sets of attributes, and other higher-type constructs. A precise semantics will be developed for such databases, based on a higher-type modal/intensional logic.

1 Introduction

In some ways this is an eccentric paper: there are no theorems. What I want to do, simply stated, is present a semantics for relational databases. But the semantics is rich, powerful, and oddly familiar, and applies to databases that are quite general. It is a topic whose exploration I wish to recommend, rather than a finished product I simply present. Relational databases generally have entities of some kind as values of attributes, though it is a small stretch to allow sets of entities as well. I want to consider databases that stretch things further, allowing attributes to have as values sets of sets of entities, and so on, but further, I also want to allow sets of attributes, sets of sets of attributes, and so...
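The higher-type attribute values described above can be sketched directly in ordinary code. The following is a minimal illustration, not anything from the paper itself: the `committee_db` data and the `rows_with_skill` helper are hypothetical, and `frozenset` is used simply because it makes set values hashable, so that sets of sets of entities can serve as attribute values.

```python
# Hypothetical sketch of a "generalized" relation: attribute values may be
# plain entities, sets of entities, or sets of sets of entities.
# frozenset is hashable, so a frozenset can itself be a member of a set.
committee_db = [
    {"name": "alice", "skills": frozenset({"sql", "logic"})},
    {"name": "bob",   "skills": frozenset({"sql"})},
    # a higher-type attribute value: a set of alternative skill-sets
    {"name": "carol", "accepted_profiles": frozenset({
        frozenset({"sql"}),
        frozenset({"logic", "modal"}),
    })},
]

def rows_with_skill(db, skill):
    """Select rows whose set-valued 'skills' attribute contains the skill."""
    return [row for row in db if skill in row.get("skills", frozenset())]

print([r["name"] for r in rows_with_skill(committee_db, "sql")])
# -> ['alice', 'bob']
```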
Unbounded proof-length speedup in deduction modulo
 CSL 2007, VOLUME 4646 OF LNCS
, 2007
Abstract

Cited by 3 (2 self)
In 1973, Parikh proved a speedup theorem conjectured by Gödel 37 years before: there exist arithmetical formulæ that are provable in first-order arithmetic, but whose shortest proof in second-order arithmetic is arbitrarily smaller than any proof in first order. On the other hand, resolution for higher-order logic can be simulated step by step in a first-order narrowing and resolution method based on deduction modulo, whose paradigm is to separate deduction and computation to make proofs clearer and shorter. We prove that (i+1)th-order arithmetic can be linearly simulated into ith-order arithmetic modulo some confluent and terminating rewrite system. We also show that there exists a speedup between ith-order arithmetic modulo this system and ith-order arithmetic without modulo. All this allows us to prove that the speedup conjectured by Gödel does not come from the deductive part of the proofs, but can be expressed as simple computation, thereby justifying the use of deduction modulo as an efficient first-order setting simulating higher order.
Efficiently Simulating Higher-Order Arithmetic by a First-Order Theory Modulo
Abstract

Cited by 2 (0 self)
Deduction modulo is a paradigm which consists in applying the inference rules of a deductive system (such as, for instance, natural deduction) modulo a rewrite system over terms and propositions. It has been shown that higher-order logic can be simulated in first-order natural deduction modulo. However, a theorem stated by Gödel and proved by Parikh expresses that proofs in second-order arithmetic may be unboundedly shorter than proofs in first-order arithmetic, even when considering only formulæ provable in first-order arithmetic. We investigate how deduction modulo can be used to translate proofs of higher-order arithmetic into first-order proofs without inflating their length. First we show how higher orders can be encoded through a quite simple (finite, terminating, confluent, left-linear) rewrite system. A proof in higher-order arithmetic can then be linearly translated into a proof in first-order arithmetic modulo this system. Second, in the continuation of a work of Dowek and Werner, we show how to express the whole of higher-order arithmetic as a rewrite system. Proofs of higher-order arithmetic can then be linearly translated into proofs in the empty theory modulo this rewrite system. These results show that the speedup between first- and second-order arithmetic, and more generally between ith- and (i+1)st-order arithmetic, can in fact be expressed as computation, and does not lie in the really deductive part of the proofs.
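As a toy illustration of the deduction-versus-computation idea, and emphatically not the encoding used in these papers, consider a small terminating and confluent rewrite system for unary-numeral addition. Under deduction modulo, an equality like 1 + 1 = 2 needs no deductive steps at all: both sides rewrite to the same normal form, so the goal closes "modulo computation".

```python
# Terms as nested tuples: "0" is zero, ("s", t) is successor,
# ("plus", a, b) is addition.  Rewrite rules (terminating, confluent):
#   plus(0, y)    -> y
#   plus(s(x), y) -> s(plus(x, y))
def rewrite(t):
    """Return the normal form of term t under the two rules above."""
    if isinstance(t, tuple):
        # normalize subterms first
        t = (t[0],) + tuple(rewrite(arg) for arg in t[1:])
        if t[0] == "plus":
            a, b = t[1], t[2]
            if a == "0":
                return b
            if isinstance(a, tuple) and a[0] == "s":
                return rewrite(("s", ("plus", a[1], b)))
    return t

one = ("s", "0")
two = ("s", one)
# 1 + 1 and 2 share a normal form: they are equal by computation alone.
print(rewrite(("plus", one, one)) == rewrite(two))  # -> True
```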
Higher-Order Modal Logic: A Sketch
Abstract

Cited by 1 (0 self)
First-order modal logic, in the usual formulations, is not sufficiently expressive, and as a consequence problems like Frege's morning star/evening star puzzle arise. The introduction of predicate abstraction machinery provides a natural extension in which such difficulties can be addressed. But this machinery can also be thought of as part of a move to a full higher-order modal logic. In this paper we present a sketch of just such a higher-order modal logic: its formal semantics, and a proof procedure using tableaus. Naturally the tableau rules are not complete, but they are with respect to a Henkinization of the "true" semantics. We demonstrate the use of the tableau rules by proving one of the theorems involved in Gödel's ontological argument, one of the rare instances in the literature where higher-order modal constructs have appeared. A fuller treatment of the material presented here is in preparation.

1 Introduction

Standard first-order classical logic is so well behaved that co...
Inheritable Properties and Computer Assisted Proofs in Dynamics
Abstract

Cited by 1 (0 self)
0 Introduction. Computer assisted proofs are becoming an important factor of present day mathematics. Since there exist short theorems having arbitrarily long proofs (a consequence of Gödel's incompleteness theorems, see [4, 6, 23]) and the power of computers is growing exponentially, this should be expected. The general scheme for constructing computer assisted proofs is actually very simple and is a direct consequence of the fact that any computer may deal only with finite sets.
A Note On Monte Carlo Primality Tests And Algorithmic Information Theory
, 1978
Abstract
Solovay and Strassen, and Miller and Rabin have discovered fast algorithms for testing primality which use coin-flipping and whose conclusions are only probably correct. On the other hand, algorithmic information theory provides a precise mathematical definition of the notion of random or patternless sequence. In this paper we shall describe conditions under which, if the sequence of coin tosses in the Solovay-Strassen and Miller-Rabin algorithms is replaced by a sequence of heads and tails that is of maximal algorithmic information content, i.e., has maximal algorithmic randomness, then one obtains an error-free test for primality. These results are only of theoretical interest, since it is a manifestation of the Gödel incompleteness phenomenon that it is impossible to "certify" a sequence to be random by means of a proof, even though most sequences have this property. Thus by using certified random sequences one can in principle, but not in practice, convert proba...
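For reference, a standard textbook formulation of the Miller-Rabin test looks as follows. This is the usual randomized version with pseudo-random bases, not the certified-random-sequence variant the paper studies; each random base is one "coin-flip", and a composite n survives a round with probability at most 1/4.

```python
import random

def miller_rabin(n, rounds=20, rng=random):
    """Probabilistic primality test; error probability at most 4**-rounds."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    # write n - 1 = 2**r * d with d odd
    r, d = 0, n - 1
    while d % 2 == 0:
        r += 1
        d //= 2
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)      # the "coin-flip": a random base
        x = pow(a, d, n)                 # modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                 # a witnesses that n is composite
    return True                          # probably prime

print(miller_rabin(2**61 - 1))  # Mersenne prime -> True
print(miller_rabin(561))        # Carmichael number; rejected (w.h.p.)
```

Note that primes always pass, so a `True` answer is only "probably correct" in the sense the abstract describes: the residual risk is a composite slipping through, bounded by 4**-rounds.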
Models, Rules, Deductive Reasoning
, 1999
Abstract
We formulate a simple theory of deductive reasoning based on mental models. One prediction of the theory is experimentally tested and found to be incorrect. The bearing of our results on contemporary theories of mental models is discussed. We then consider a potential objection to current rule theories of deduction. Such theories picture deductive reasoning as the successive application of inference schemata from first-order logic. Relying on a theorem due to George Boolos, we show that under weak hypotheses first-order schemata cannot account for many people's ability to verify the validity of first-order arguments. The hypothesis that deductive reasoning is mediated by the construction of mental models has enjoyed predictive success across several studies. It has also proven to be a fertile source of ideas about other kinds of judgment, for example, temporal, spatial, and probabilistic. At the same time, the theory has suffered from persistent criticism for ambiguity about the details of ...
Ancient Rome: Cicero (50 BC)
Abstract
"Gottesbeweis" (proof of God's existence): this term generally denotes attempts to prove the existence of (a) God, or to find arguments for such an existence. [Wik04][Bec00]
Extending Conceptual Spaces
Abstract
As noted in Section 2.3, Gärdenfors' conceptual spaces theory of concepts is an example of a similarity space theory, which on the face of things puts it in company with prototype and exemplar theories of concepts, in the empirical tradition, and in contrast with e.g. theory theory and informational atomism, in the rationalist tradition. That, I will argue, would at best be an oversimplification; indeed, as is clear from passages in his book, Gärdenfors sees his theory as being compatible with and complementary to these other approaches. Specifically, Gärdenfors sees his approach as a bridging account between different levels of explanation of cognition more generally, and different accounts of concepts more specifically. Indeed, I believe his theory is well placed to support the "toggling effect" thesis I proposed in the last chapter. Sometimes with similarity-space-based theories, concepts that are more similar are grouped closer together (more literally or more metaphorically, depending on the theory), while those that are more dissimilar are grouped further apart. Fodor has argued, quite convincingly I think, that such an approach is doomed to failure, for invariably the measures of similarity that are being assumed depend upon an underlying layer of strict identity (1998, p. 32). For example, if two prototypes are more similar or less similar depending on how many features they share, then those features must,