Results 1–10 of 26
Explaining Type Inference
 Science of Computer Programming
, 1995
Abstract

Cited by 53 (0 self)
Type inference is the compile-time process of reconstructing missing type information in a program based on the usage of its variables. ML and Haskell are two languages where this aspect of compilation has enjoyed some popularity, allowing type information to be omitted while static type checking is still performed. Type inference may be expected to have some application in the prototyping and scripting languages which are becoming increasingly popular. A difficulty with type inference is the confusing and sometimes counter-intuitive diagnostics produced by the type checker as a result of type errors. A modification of the Hindley-Milner type inference algorithm is presented, which allows the specific reasoning that led to a program variable having a particular type to be recorded for type explanation. This approach is close to the intuitive process used in practice for debugging type errors.

1 Introduction

Type inference refers to the compile-time process of reconstructing missing t...
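The Hindley-Milner algorithm that the abstract modifies is built around unification of type terms. A minimal Python sketch of that core step (illustrative representation; the occurs check is omitted for brevity):

```python
# Type variables are strings beginning with "'", compound types are
# tuples like ("->", argument, result). A substitution maps type
# variables to types. This is a sketch, not a full HM implementation.

def is_var(t):
    return isinstance(t, str) and t.startswith("'")

def resolve(t, subst):
    """Follow substitution chains until t is not a bound variable."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2, subst):
    """Return a substitution making t1 and t2 equal, or None on a type error."""
    t1, t2 = resolve(t1, subst), resolve(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return {**subst, t1: t2}
    if is_var(t2):
        return {**subst, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None  # mismatched type constructors: this is the "type error"

# Applying a function of unknown type 'a -> 'b to the shape int -> bool
# forces 'a = int and 'b = bool:
s = unify(("->", "'a", "'b"), ("->", "int", "bool"), {})
print(s)  # {"'a": 'int', "'b": 'bool'}
```

The paper's contribution is, roughly, to record *why* each binding such as `'a = int` was made, so the chain of reasoning can be replayed when a unification failure is reported.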
Colouring Terms to Control Equational Reasoning
 Journal of Automated Reasoning
, 1997
Abstract

Cited by 25 (13 self)
In this paper we present an approach, developed in the field of inductive theorem proving, to prove the equality between terms in a goal-directed way. The two terms to be equated are syntactically split into expressions which are common to both and those which occur only in one term. According to the computed differences we apply appropriate equations to the terms in order to reduce the differences in a goal-directed way. Although this approach was developed for purposes of inductive theorem proving (we use this technique to manipulate the conclusion of an induction step to enable the use of the hypothesis), it is a powerful method for the control of equational reasoning in general.

1. Introduction

The automation of equational reasoning is one of the most important challenges in the field of automated deduction. Even small equational problems result in a huge search space, and finding a proof often fails due to the combinatorial explosion. Proving (conditional) equations by inductio...
PARTHEO: A High Performance Parallel Theorem Prover
 In Proceedings of the Tenth International Conference on Automated Deduction
, 1990
Abstract

Cited by 21 (2 self)
PARTHEO, a sound and complete or-parallel theorem prover for first-order logic, is presented. The proof calculus is model elimination. PARTHEO consists of a uniform network of sequential theorem provers communicating via message passing. Each sequential prover is implemented as an extension of Warren's abstract machine. PARTHEO is written in parallel C and runs on a network of 16 transputers. The paper comprises the system architecture, the theoretical background, details of the implementation, and results of performance measurements.

Keywords. Automated deduction, theorem proving, first-order logic, connection method, model elimination, or-parallelism, message passing, transputers.

1 Introduction

AI systems are in constant need of powerful and efficient inference mechanisms. Within many areas of application, logic-based languages seem to be an appropriate choice. A popular and successful realization is Prolog, which is moderately efficient but lacks expressiveness, mainly bec...
On the complexity of the reflected logic of proofs
 Theoretical Computer Science
Abstract

Cited by 14 (1 self)
Keywords: disjunctive property, complexity.

Artemov's system LP captures all propositional invariant properties of a proof predicate "x proves y" ([1, 3]). Kuznets in [5] showed that the satisfiability problem for LP belongs to the class Π^p_2 of the polynomial hierarchy. No nontrivial lower complexity bound for LP is known. We describe a quite expressive syntactic fragment of LP which belongs to NP: rLP∧,∨, the set of all theorems of LP which are monotone Boolean combinations of quasi-atomic formulas (facts of the sort "t proves F"). A new decision algorithm for this fragment is proposed. It is based on a new, simple, independent formalization of rLP (the reflected fragment of LP) and involves the corresponding proof-search procedure. Essentially, rLP contains all the theorems of LP supplied with additional information about their proofs. We show that in many respects rLP is simpler than LP itself. This gives the complexity bound (NP) for rLP. In addition we prove a suitable variant of the disjunctive property which extends this bound to rLP∧,∨.
A uniform approach to constraint-solving for lists, multisets, compact lists, and sets
 ACM Trans. Comput. Log
, 2008
Abstract

Cited by 9 (5 self)
Lists, multisets, and sets are well-known data structures whose usefulness is widely recognized in various areas of Computer Science. They have been analyzed from an axiomatic point of view with a parametric approach in [Dovier et al. 1998], where the relevant unification algorithms have been developed. In this paper we extend these results by considering more general constraints, namely equality and membership constraints and their negative counterparts.
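The four structures in the title differ only in whether the order and the repetition of elements are observable. A small Python illustration of the resulting equality notions (this merely illustrates the semantics, not the paper's constraint solver):

```python
from collections import Counter

def list_eq(a, b):
    """Lists: both order and multiplicity matter."""
    return a == b

def multiset_eq(a, b):
    """Multisets: multiplicity matters, order does not."""
    return Counter(a) == Counter(b)

def squeeze(xs):
    """Collapse adjacent duplicate elements, e.g. [1, 1, 2] -> [1, 2]."""
    out = []
    for x in xs:
        if not out or out[-1] != x:
            out.append(x)
    return out

def compact_list_eq(a, b):
    """Compact lists: order matters, adjacent repetitions do not."""
    return squeeze(a) == squeeze(b)

def set_eq(a, b):
    """Sets: neither order nor multiplicity matters."""
    return set(a) == set(b)

a, b = [1, 1, 2], [2, 1]
print(list_eq(a, b), multiset_eq(a, b), compact_list_eq(a, b), set_eq(a, b))
# False False False True
```

The point of the parametric approach is that one axiomatization (and one unification algorithm) covers all four cases, with the choice of which permutations and repetitions to ignore supplied as a parameter.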
Logical omniscience via proof complexity
 In Computer Science Logic 2006, Lecture Notes in Computer Science, Vol 4207
, 2006
Abstract

Cited by 9 (6 self)
The Hintikka-style modal logic approach to knowledge has a well-known defect of logical omniscience, i.e., the unrealistic feature that an agent knows all logical consequences of her assumptions. In this paper we suggest the following Logical Omniscience Test (LOT): an epistemic system E is not logically omniscient if for any knowledge assertion A of type 'F is known' valid in E there is a proof of F in E whose complexity is bounded by some polynomial in the length of A. We show that the usual epistemic modal logics are logically omniscient (modulo some common complexity assumptions). We also apply LOT to Justification Logic, which along with the usual knowledge operators Ki(F) ('agent i knows F') contains evidence assertions t:F ('t is a justification for F'). In Justification Logic, the evidence part is an appropriate extension of the Logic of Proofs LP, which guarantees that the collection of evidence terms t is rich enough to match modal logic. We show that justification logic systems are logically omniscient w.r.t. the usual knowledge and are not logically omniscient w.r.t. the evidence-based knowledge.
RuleBased Constraint Programming
 Fundamenta Informaticae
, 1998
Abstract

Cited by 9 (1 self)
In this paper we present a view of constraint programming based on the notion of rewriting controlled by strategies. We argue that this concept allows us to describe in a unified way the constraint-solving mechanism as well as the meta-language needed to manipulate the constraints. This has the advantage of providing descriptions that are very close to the proof-theoretical setting now used to describe constraint manipulations such as unification or numerical constraint solving. We exemplify the approach by presenting examples of constraint solver descriptions and combinations written in the ELAN language.
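The idea of rewriting controlled by strategies can be sketched in a few lines: rules are partial functions on terms, and a strategy decides where and how often to apply them. A toy Python sketch of the concept (this illustrates the general mechanism only, not the ELAN language itself):

```python
# Terms are atoms (strings, numbers) or tuples (operator, arg1, arg2).
# A rule returns a rewritten term, or None when it does not apply.

def rule_add_zero(t):
    """x + 0  ->  x"""
    if isinstance(t, tuple) and t[0] == "+" and t[2] == 0:
        return t[1]

def rule_mul_one(t):
    """x * 1  ->  x"""
    if isinstance(t, tuple) and t[0] == "*" and t[2] == 1:
        return t[1]

def rewrite_once(rules, t):
    """Apply the first applicable rule at the topmost possible position."""
    for r in rules:
        u = r(t)
        if u is not None:
            return u, True
    if isinstance(t, tuple):
        head, *args = t
        for i, a in enumerate(args):
            u, did = rewrite_once(rules, a)
            if did:
                args[i] = u
                return (head, *args), True
    return t, False

def repeat(rules, t):
    """Strategy: rewrite until no rule applies anywhere (normalization)."""
    changed = True
    while changed:
        t, changed = rewrite_once(rules, t)
    return t

term = ("+", ("*", "x", 1), 0)   # (x * 1) + 0
print(repeat([rule_add_zero, rule_mul_one], term))  # x
```

In a strategy language such as ELAN, combinators like "repeat" or "first applicable" are first-class, so the control above is written declaratively rather than hard-coded as here.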
Efficient execution in an automated reasoning environment
 Journal of Functional Programming
, 2006
Abstract

Cited by 8 (4 self)
We describe a method to permit the user of a mathematical logic to write elegant logical definitions while allowing sound and efficient execution. We focus on the ACL2 logic and automated reasoning environment. ACL2 is used by industrial researchers to describe microprocessor designs and other complicated digital systems. Properties of the designs can be formally established with the theorem prover. But because ACL2 is also a functional programming language, the formal models can be executed as simulation engines. We implement features that afford these dual applications, namely formal proof and execution on industrial test suites. In particular, the features allow the user to install, in a logically sound way, alternative executable counterparts for logically-defined functions. These alternatives are often much more efficient than the logically equivalent terms they replace. We discuss several applications of these features.

1 Introduction

This paper is about a way to permit the functional programmer to prove efficient programs correct. The idea is to allow the provision of two definitions of the program: an elegant definition that supports effective reasoning by a mechanized theorem prover, and an efficient definition for evaluation. A bridge of this sort, ...
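The pairing of an elegant logical definition with an efficient executable counterpart can be illustrated outside ACL2 as well. A Python sketch of the idea (in ACL2 the two definitions are connected by a machine-checked equivalence proof; here the agreement is merely spot-checked on test inputs):

```python
def fib_logic(n):
    """Elegant definition: mirrors the mathematical recurrence,
    easy to reason about, exponentially slow to run."""
    return n if n < 2 else fib_logic(n - 1) + fib_logic(n - 2)

def fib_exec(n):
    """Efficient counterpart: linear iteration instead of tree recursion."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# ACL2 would only "install" fib_exec in place of fib_logic after a
# proof that they are equal; here we spot-check the agreement instead.
assert all(fib_logic(n) == fib_exec(n) for n in range(20))

print(fib_exec(80))  # fast; fib_logic(80) would take far too long
```

The payoff is exactly the dual use the abstract describes: one function name serves both as a clean object of formal proof and as a simulation engine fast enough for industrial test suites.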
Referential logic of proofs
 Theoretical Computer Science
Abstract

Cited by 7 (0 self)
We introduce an extension of the propositional logic of single-conclusion proofs with second-order variables denoting reference constructors of the type "the formula which is proved by x." The resulting Logic of Proofs with References, FLPref, is shown to be decidable and to enjoy soundness and completeness with respect to the intended provability semantics. We show that FLPref provides a complete test of admissibility of inference rules in a sound extension of arithmetic.

Key words: proof theory, explicit modal logic, single-conclusion logic of proofs, proof term, reference, unification, admissible inference rule.
Style: A Practical Type Checker for Scheme
, 1993
Abstract

Cited by 3 (0 self)
This paper describes a new tool for finding errors in R4RS-compliant Scheme programs. A polymorphic type system in the style of Damas & Milner (1982), with an additional maximum type, is used to type Scheme code. Although Scheme is dynamically typed, most parts of programs are statically typeable; type inconsistencies are regarded as hints to possible programming errors. The paper first introduces a type system which is a careful balance between rigorous type safety and pragmatic type softness. An efficient and portable implementation based on order-sorted unification in Scheme is then described. We obtained very satisfactory results on realistic programs, including the programs in Abelson, Sussman & Sussman (1985).

1 Introduction

Finding errors in Scheme programs is painful. One major reason is that Scheme is a dynamically typed language: all names in Scheme programs can hold objects of arbitrary, undeclared type which may change during runtime. Unlike most modern languages like H...