Results 1–7 of 7
A Theory of Program Size Formally Identical to Information Theory
, 1975
Abstract

Cited by 329 (16 self)
A new definition of program-size complexity is made. H(A,B/C,D) is defined to be the size in bits of the shortest self-delimiting program for calculating strings A and B if one is given a minimal-size self-delimiting program for calculating strings C and D. This differs from previous definitions: (1) programs are required to be self-delimiting, i.e. no program is a prefix of another, and (2) instead of being given C and D directly, one is given a program for calculating them that is minimal in size. Unlike previous definitions, this one has precisely the formal properties of the entropy concept of information theory. For example, H(A,B) = H(A) + H(B/A) + O(1). Also, if a program of length k is assigned measure 2^-k, then H(A) = -log2 (the probability that the standard universal computer will calculate A) + O(1). Key Words and Phrases: computational complexity, entropy, information theory, instantaneous code, Kraft inequality, minimal program, probab...
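The self-delimiting (prefix-free) requirement and the Kraft inequality listed among the key words can be made concrete. The following is a minimal sketch with hypothetical codewords chosen for illustration, not material from the paper:

```python
# A set of binary codewords is prefix-free (self-delimiting) if no
# codeword is a prefix of another.  The Kraft inequality then holds:
# the measures 2^-len(w) sum to at most 1.

def is_prefix_free(codewords):
    """True if no codeword is a proper prefix of another."""
    return not any(
        a != b and b.startswith(a)
        for a in codewords for b in codewords
    )

def kraft_sum(codewords):
    """Sum of 2^-len(w) over all codewords."""
    return sum(2.0 ** -len(w) for w in codewords)

codes = ["0", "10", "110", "111"]   # a complete prefix-free code
assert is_prefix_free(codes)
assert kraft_sum(codes) == 1.0       # equality: the code is complete

bad = ["0", "01", "11"]              # "0" is a prefix of "01"
assert not is_prefix_free(bad)
```

A complete prefix-free code makes the Kraft sum exactly 1, which is what lets program length be treated as a probability measure in the abstract's final formula.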
Information-theoretic Limitations of Formal Systems
 JOURNAL OF THE ACM
, 1974
Abstract

Cited by 45 (7 self)
An attempt is made to apply information-theoretic computational complexity to metamathematics. The paper studies the number of bits of instructions that must be given to a computer for it to perform finite and infinite tasks, and also the amount of time that it takes the computer to perform these tasks. This is applied to measuring the difficulty of proving a given set of theorems, in terms of the number of bits of axioms that are assumed, and the size of the proofs needed to deduce the theorems from the axioms.
Experience with the SETL optimizer
 ACM Transactions on Programming Languages and Systems
, 1983
Abstract

Cited by 24 (0 self)
The structure of an existing optimizer for the very high-level, set-theoretically oriented programming language SETL is described, and its capabilities are illustrated. The use of novel techniques (supported by state-of-the-art interprocedural program analysis methods) enables the optimizer to accomplish various sophisticated optimizations, the most significant of which are the automatic selection of data representations and the systematic elimination of superfluous copying operations. These techniques allow quite sophisticated data-structure choices to be made automatically. Categories and Subject Descriptors: D.3.2 [Programming Languages]: Language Classifications: very high-level languages; SETL; D.3.4 [Programming Languages]: Processors: compilers; optimization; I.2.2 [Artificial Intelligence]: Automatic Programming: automatic analysis of algorithms; program modification; program transformation
Program Derivation With Verified Transformations: A Case Study
, 1995
Abstract

Cited by 13 (3 self)
A program development methodology based on verified program transformations is described and illustrated through derivations of a high-level bisimulation algorithm and an improved minimum-state DFA algorithm. Certain doubts that were raised about the correctness of an initial paper-and-pencil derivation of the DFA minimization algorithm were laid to rest by machine-checked formal proofs of the most difficult derivational steps. Although the protracted labor involved in designing and checking these proofs was almost overwhelming, the expense was somewhat offset by a successful reuse of major portions of these proofs. In particular, the DFA minimization algorithm is obtained by specializing and then extending the last step in the derivation of the high-level bisimulation algorithm. Our experience suggests that a major focus of future research should be aimed towards improving the technology of machine-checkable proofs: their construction, presentation, and reuse. This paper demonstrat...
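The minimum-state DFA computation the abstract refers to can be sketched by classical partition refinement (Moore's algorithm, a standard counterpart of the derived algorithm, not the paper's own derivation). The 4-state DFA below is a hypothetical example:

```python
# Minimum-state DFA computation by partition refinement (Moore's
# algorithm): start from the accepting / non-accepting split and
# repeatedly split blocks whose states disagree on where each input
# symbol sends them, until the partition stabilizes.

def minimize(states, alphabet, delta, accepting):
    """Return the partition of `states` into equivalence classes."""
    partition = [b for b in (accepting, states - accepting) if b]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Group states by the block each symbol maps them into.
            groups = {}
            for s in block:
                key = tuple(
                    next(i for i, b in enumerate(partition)
                         if delta[s, a] in b)
                    for a in alphabet
                )
                groups.setdefault(key, set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return partition

# Hypothetical DFA over {0, 1}: states 2 and 3 are both accepting sinks,
# hence equivalent; states 0 and 1 are distinguished by input '1'.
states = {0, 1, 2, 3}
delta = {(0, '0'): 1, (0, '1'): 0,
         (1, '0'): 1, (1, '1'): 3,
         (2, '0'): 2, (2, '1'): 2,
         (3, '0'): 3, (3, '1'): 3}
classes = minimize(states, ['0', '1'], delta, accepting={2, 3})
# classes merges states 2 and 3, leaving three equivalence classes
```

Each equivalence class becomes one state of the minimum-state DFA; the refinement loop is exactly the fixed-point step that a transformational derivation would verify.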
The Formal Reconstruction and Speedup Of The Linear Time Fragment Of Willard's Relational Calculus Subset
 In Bird and Meertens, editors, Algorithmic Languages and Calculi
, 1997
Abstract

Cited by 8 (4 self)
We demonstrate how several programming language concepts and methods can be used economically to obtain an improved solution to a difficult algorithmic problem. The problem is to compile a subset RCS of Relational Calculus defined by Willard (1978) in a novel way so that efficient runtime query performance is guaranteed. Willard gives an algorithm to compile each query q belonging to RCS so that it executes in O(n log^d n + o) steps and O(n) space, where n and o are respectively the input and output set sizes, and d is a parameter associated with the syntax of query q. Willard's time bounds are based on the assumption that hashing unit-space data takes unit time. In this paper we use a set-theoretic complexity measure and formal transformational techniques to reconstruct the linear-time fragment of RCS in a simplified way. In doing this, we show how complexity can be determined by language abstraction and algebraic reasoning without resorting to low-level counting arguments. This ap...
SETL for Internet Data Processing
, 2000
Abstract

Cited by 3 (1 self)