Results 11 – 18 of 18
Higher-Order Constraint Simplification in Dependent Type Theory
"... Abstract. Higherorder unification is undecidable, but has fragments which admit practical algorithms, which are used extensively in logical frameworks. For example, it is decidable whether unification problems in the pattern fragment identified by Dale Miller are solvable, and they enjoy unique mos ..."
Abstract

Cited by 3 (0 self)
Abstract. Higher-order unification is undecidable, but it has fragments which admit practical algorithms, which are used extensively in logical frameworks. For example, it is decidable whether unification problems in the pattern fragment identified by Dale Miller are solvable, and they enjoy unique most general unifiers when they are. However, the restrictions that the pattern fragment imposes exclude many useful applications and encodings. One way to proceed is to use instead a more general constraint simplification algorithm that works on the parts of a unification problem that are in the pattern fragment, postponing problematic parts in the hope that later substitutions will bring them back into the pattern fragment. Such an algorithm either finds a most general solution, determines that the problem does not have a solution, or else reports a set of remaining constraints on which no further work can be done. While some constraint simplification algorithms have been proposed, their theory turns out to be surprisingly subtle (especially in the presence of dependent types, which complicate the otherwise simple invariant that all equations in a unification problem are well-typed) and has, to our knowledge, not been investigated, leading to some problems with termination and completeness of implementations. This paper describes and proves correct a new, terminating constraint simplification algorithm for the dynamic pattern fragment of higher-order unification in a dependent type system.
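The pattern condition the abstract refers to requires that a metavariable be applied only to distinct bound variables. As an illustration only (the term representation and names below are hypothetical, not taken from the paper), the check can be sketched in a few lines:

```python
# Illustrative sketch of Miller's pattern condition: a term M x1 ... xn is a
# pattern iff its head M is a metavariable and the arguments x1 ... xn are
# pairwise distinct bound variables. Representation is hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:            # an object-level (bound) variable
    name: str

@dataclass(frozen=True)
class Meta:           # a metavariable (unification variable)
    name: str

@dataclass(frozen=True)
class App:            # left-nested application: App(App(f, x), y) means f x y
    fun: object
    arg: object

def spine(term):
    """Unwind nested applications into (head, [args])."""
    args = []
    while isinstance(term, App):
        args.append(term.arg)
        term = term.fun
    args.reverse()
    return term, args

def is_pattern(term):
    """True iff term has the shape M x1 ... xn with M a metavariable and
    the xi distinct bound variables."""
    head, args = spine(term)
    if not isinstance(head, Meta):
        return False
    if not all(isinstance(a, Var) for a in args):
        return False
    return len(set(args)) == len(args)   # distinctness of the variables

M, x, y = Meta("M"), Var("x"), Var("y")
print(is_pattern(App(App(M, x), y)))       # M x y   -> True
print(is_pattern(App(App(M, x), x)))       # M x x   -> False (repeated variable)
print(is_pattern(App(M, App(x, y))))       # M (x y) -> False (non-variable argument)
```

Equations whose flexible side fails this check are exactly the ones a constraint simplification algorithm would postpone rather than solve immediately.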
Proof-search in type-theoretic languages: an introduction
 Theoretical Computer Science
, 2000
"... We introduce the main concepts and problems in the theory of proofsearch in typetheoretic languages and survey some specific, connected topics. We do not claim to cover all of the theoretical and implementation issues in the study of proofsearch in typetheoretic languages; rather, we present som ..."
Abstract

Cited by 2 (1 self)
We introduce the main concepts and problems in the theory of proof-search in type-theoretic languages and survey some specific, connected topics. We do not claim to cover all of the theoretical and implementation issues in the study of proof-search in type-theoretic languages; rather, we present some key ideas and problems, starting from well-motivated points of departure such as a definition of a type-theoretic language or the relationship between languages and proof-objects. The strong connections between different proof-search methods in logics, type theories and logical frameworks, together with their impact on programming and implementation issues, are central in this context.
λΠ-Calculus with Type Similarity
, 1995
"... Motivated by the problems of the undecidablity of higherorder unification and hence the undecidability of \Piunification, Pym and Elliott give a weaker notion of typing for \Piobjects : type similarity. In this paper we present a new calculus giving a formal theory of type similarity that capture ..."
Abstract

Cited by 1 (1 self)
Motivated by the undecidability of higher-order unification, and hence the undecidability of λΠ-unification, Pym and Elliott give a weaker notion of typing for λΠ-objects: type similarity. In this paper we present a new calculus giving a formal theory of type similarity that captures this weaker notion of typing. We then apply a variant of Hardin's interpretation method to show that well-typed terms of this new calculus are confluent and normalizing for βη-reduction. This is done by defining two translations of the new calculus into the λΠ-calculus. We then use the results that well-typed λΠ-terms are confluent and strongly normalizing to get our results.
1 Introduction
The λΠ-calculus is an extension of the simply-typed λ-calculus with dependent function types; that is, types can depend on the values of terms. As a consequence, a unification procedure for the λΠ-calculus must unify not only terms but also their types. To the authors' knowledge there are two presen...
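The dependence of types on term values that the abstract describes can be illustrated in modern notation. This is a Lean sketch of the general idea, not the λΠ-calculus of the paper:

```lean
-- Illustrative only: a length-indexed vector, the standard example of a
-- type (`Vec α n`) that depends on a term (`n`).
inductive Vec (α : Type) : Nat → Type where
  | nil  : Vec α 0
  | cons : α → Vec α n → Vec α (n + 1)

-- A Π-type (dependent function type): the result type of `zeros`
-- depends on the value of its argument `n`.
def zeros : (n : Nat) → Vec Nat n
  | 0     => Vec.nil
  | n + 1 => Vec.cons 0 (zeros n)
```

Because `Vec Nat 2` and `Vec Nat 3` are distinct types, unifying two such terms forces the unifier to equate the index terms as well, which is the phenomenon the abstract points to.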
Imagining CLP(Λ,≡αβ)
, 1995
"... . We study under which conditions the domain of terms () and the equality theory of the calculus (j fffi ) form the basis of a usable constraint logic programming language (CLP). The conditions are that the equality theory must contain axiom j, and the formula language must depart from Horn clause ..."
Abstract
 Add to MetaCart
We study under which conditions the domain of λ-terms (Λ) and the equality theory of the λ-calculus (≡αβ) form the basis of a usable constraint logic programming language (CLP). The conditions are that the equality theory must contain axiom η, and the formula language must depart from Horn clauses and accept universal quantifications and implications in goals. In short, CLP(Λ, ≡αβ) must be close to λProlog.
1 Introduction
Logic programming is a programming paradigm in which programs are logical formulas, and executing them amounts to searching for a proof. The most famous practical incarnation of logic programming is Prolog, which is based on Horn formulas [31]. The formalism of Horn programs is computationally complete [1, 49], but it has often been augmented to gain more flexibility and expressivity. One of these attempts is the paradigm of constraint logic programming [11, 27, 10, 50]. It amounts to replacing unification of first-order terms, considered as a procedure for s...
IRISA
, 1994
"... : We study under which conditions the domain of terms () and the equality theory of the calculus (j fffi ) form the basis of a usable constraint logic programming language (CLP). The conditions are that the equality theory must contain axiom j, and the formula language must depart from Horn clause ..."
Abstract
 Add to MetaCart
We study under which conditions the domain of λ-terms (Λ) and the equality theory of the λ-calculus (≡αβ) form the basis of a usable constraint logic programming language (CLP). The conditions are that the equality theory must contain axiom η, and the formula language must depart from Horn clauses and accept universal quantifications and implications in goals. In short, CLP(Λ, ≡αβ) must be close to λProlog. Keywords: CLP, λ-calculus, λProlog
A Type Theory for Typing, Meta-Typing, and Ontological Information in Discourse
"... A type theory for typing information, metatyping information and ontological information in discourse is proposed by extending Lambek calculus with higherorder dependent type theory. Furthermore, for the type theory, a discovery method of type names is also proposed using unification of type varia ..."
Abstract
 Add to MetaCart
A type theory for typing information, meta-typing information and ontological information in discourse is proposed by extending the Lambek calculus with higher-order dependent type theory. Furthermore, for this type theory, a method for discovering type names is also proposed, using unification of type variables. The type theory can serve as a theoretical foundation for information extraction and for learning concept names from discourse.
A Framework for Defining Logics
"... Abstract. The Edinburgh Logical Framework (LF) provides a means to define (or present) logics. It is based on a general treatment of syntax, rules, and proofs by means of a typed Acalculus with dependent types. Syntax is treated in a style similar to, but more general than, MartinLof’s system of a ..."
Abstract
 Add to MetaCart
Abstract. The Edinburgh Logical Framework (LF) provides a means to define (or present) logics. It is based on a general treatment of syntax, rules, and proofs by means of a typed λ-calculus with dependent types. Syntax is treated in a style similar to, but more general than, Martin-Löf's system of arities. The treatment of rules and proofs focuses on his notion of a judgment. Logics are represented in LF via a new principle, the judgments-as-types principle, whereby each judgment is identified with the type of its proofs. This allows for a smooth treatment of discharge and variable occurrence conditions and leads to a uniform treatment of rules and proofs whereby rules are viewed as proofs of higher-order judgments and proof checking is reduced to type checking. The practical benefit of our treatment of formal systems is that logic-independent tools, ...
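The judgments-as-types principle described above can be illustrated in miniature. This is a hedged Lean sketch of the idea, not LF itself: a judgment becomes a type, inference rules become constructors, and proof checking becomes type checking.

```lean
-- Illustrative only: a tiny object logic with implication.
inductive Prp where
  | atom : String → Prp
  | imp  : Prp → Prp → Prp

-- The judgment "p is provable" is identified with the type `Proof p`
-- of its proofs; each rule is a constructor producing proofs.
inductive Proof : Prp → Type where
  | impI : (Proof a → Proof b) → Proof (Prp.imp a b)  -- discharge via a function
  | impE : Proof (Prp.imp a b) → Proof a → Proof b    -- modus ponens

-- Proof checking is type checking: the identity function proves p → p.
example (p : Prp) : Proof (Prp.imp p p) := Proof.impI (fun h => h)
```

Note how hypothesis discharge in `impI` is handled by the function space of the meta-language, which is the "smooth treatment of discharge and variable occurrence conditions" the abstract mentions.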