Results 1 – 9 of 9
Membership-Constraints and Complexity in Logic Programming with Sets
Frontiers in Combining Systems, 1996
Abstract

Cited by 18 (1 self)
General agreement exists about the usefulness of sets as very high-level representations of complex data structures. Therefore it is worthwhile to introduce sets into constraint logic programming, or set constraints into programming languages in general. We start with a brief overview of different notions of sets. This seems necessary since there are almost as many different notions in the field as there are applications, e.g. program analysis, rapid software prototyping, and unification-based grammar formalisms. An efficient algorithm for treating membership constraints is introduced. It is used in the implementation of an algorithm for unifying finite sets with tails (also presented here), which is needed in any logic programming language embedding sets. Finally it is shown how a full set language including the operators ∈, ∉, ∩, ∪ can be built on membership constraints. The text closes with a reflection on the complexity of the different algorithms, which is single-exponential...
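As an illustration of how a membership constraint t ∈ {e1, ..., en} branches into candidate bindings, here is a minimal Python sketch (an illustrative reconstruction, not the paper's algorithm; the representation of variables as strings starting with an uppercase letter is an assumption):

```python
# Illustrative solver for a single membership constraint over a finite set.
# Variables are strings beginning with an uppercase letter (an assumption
# of this sketch); everything else is a constant.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def solve_in(t, elems):
    """Branch the constraint t ∈ {e1, ..., en}: one candidate binding
    (or trivial solution) per element of the set."""
    sols = []
    for e in elems:
        if is_var(t):
            sols.append({t: e})      # bind the variable to this element
        elif is_var(e):
            sols.append({e: t})      # bind the element variable instead
        elif t == e:
            sols.append({})          # already satisfied, no binding needed
    return sols

print(solve_in("X", ["a", "b"]))     # → [{'X': 'a'}, {'X': 'b'}]
print(solve_in("a", ["a", "Y"]))     # → [{}, {'Y': 'a'}]
```

Each returned substitution is one branch of the search; a full solver would propagate it into the remaining constraints.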
A Minimality Study for Set Unification
, 1997
Abstract

Cited by 10 (7 self)
A unification algorithm is said to be minimal for a unification problem if it generates exactly a (minimal) complete set of most-general unifiers, without instances and without repetitions. The aim of this paper is to present a combinatorial minimality study for a significant collection of sample problems that can be used as benchmarks for testing any set-unification algorithm. Based on this combinatorial study, a new Set-Unification Algorithm (named SUA) is also described and proved to be minimal for all the analyzed problems. Furthermore, an existing naive set-unification algorithm has also been tested to expose its poor behavior on most of the sample problems.
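To make the notion of a minimal complete set of unifiers concrete, the following Python sketch brute-forces the unifiers of {X1, ..., Xm} = {c1, ..., cn} for distinct variables on the left and distinct constants on the right (a toy enumeration for counting, not the SUA algorithm of the paper):

```python
from itertools import product

def set_unifiers(vars_, consts):
    """All unifiers of {X1..Xm} = {c1..cn}, distinct variables vs. distinct
    constants: each variable gets a constant, and the chosen constants must
    cover (be surjective onto) the right-hand set.  Duplicates are filtered
    so the result is repetition-free, as minimality demands."""
    seen = []
    for choice in product(consts, repeat=len(vars_)):
        sub = dict(zip(vars_, choice))
        if set(choice) == set(consts) and sub not in seen:
            seen.append(sub)
    return seen

print(len(set_unifiers(["X", "Y"], ["a", "b"])))       # → 2
print(len(set_unifiers(["X", "Y", "Z"], ["a", "b"])))  # → 6
```

For this problem family the count is the number of surjections from the variables onto the constants, which grows quickly; a minimal algorithm must produce exactly these and nothing else.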
Computing Overlappings by Unification in the Deterministic Lambda Calculus LR with letrec, case, constructors, seq and variable chains
, 2011
Abstract

Cited by 1 (1 self)
We investigate the possibilities of automating correctness proofs of program transformations in an extended lambda calculus LR. The calculus is equipped with an operational semantics, a standardized form of evaluation, and, based on that, a notion of contextual equivalence which is used to define when a program transformation is considered correct. A successful approach to proving correctness of program transformations is the combination of a context lemma with the computation of overlaps between program transformations and the reduction rules. The method is similar to the computation of critical pairs for the completion of term rewriting systems. We describe an effective unification algorithm to determine all overlaps of transformations with reduction rules for the lambda calculus LR, which comprises recursive let-expressions, constructor applications, case expressions, and a seq construct for strict evaluation. The unification algorithm uses many-sorted terms, the equational theory of left-commutativity to model multisets, context variables of different kinds, and a mechanism for compactly representing binding chains in recursive let-expressions. The algorithm computes a finite set of overlappings for the reduction rules of the calculus LR that serve as a starting point for the automation of the analysis of program transformations. This author is supported by the DFG under grant SCHM 986/91. (C. Rau and M. Schmidt-Schauß)
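The overlap computation rests on first-order unification. Here is a minimal Robinson-style unifier in Python (a generic textbook sketch for intuition only; the paper's algorithm additionally handles sorts, the multiset theory, context variables, and binding chains):

```python
# Terms: a variable is a string starting with an uppercase letter (an
# assumption of this sketch); a compound term is a tuple (f, arg1, ..., argk).

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Follow variable bindings until a non-bound term is reached."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    """Occurs check: does variable v appear inside term t?"""
    t = walk(t, subst)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, subst) for a in t[1:])

def unify(s, t, subst=None):
    """Return a most-general unifier of s and t, or None on failure."""
    if subst is None:
        subst = {}
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if is_var(s):
        return None if occurs(s, t, subst) else {**subst, s: t}
    if is_var(t):
        return unify(t, s, subst)
    if isinstance(s, tuple) and isinstance(t, tuple) \
            and s[0] == t[0] and len(s) == len(t):
        for a, b in zip(s[1:], t[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

# Overlapping two rule left-hand sides amounts to unifying them:
print(unify(("f", "X", ("g", "Y")), ("f", ("g", "Z"), "W")))
# → {'X': ('g', 'Z'), 'W': ('g', 'Y')}
```

In critical-pair computation, each such unifier instantiates both rules to the same redex, producing one overlap.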
Towards Correctness of Program Transformations Through Unification and Critical Pair Computation
, 2010
Abstract

Cited by 1 (1 self)
Correctness of program transformations in extended lambda-calculi with a contextual semantics is usually based on reasoning about the operational semantics, which is a rewrite semantics. A successful approach is the combination of a context lemma with the computation of overlaps between program transformations and the reduction rules, which results in so-called complete sets of diagrams. The method is similar to the computation of critical pairs for the completion of term rewriting systems. We explore cases where the computation of these overlaps can be done in a first-order way by variants of critical pair computation that use unification algorithms. As a case study of an application, we describe a finitary and decidable unification algorithm for the combination of the equational theory of left-commutativity modelling multisets, context variables, and many-sorted unification. Sets of equations are restricted to be almost linear, i.e. every variable and context variable occurs at most once, with one exception: variables of a sort without ground terms may occur several times. Every context variable must have an argument-sort in the free part of the signature. We also extend the unification algorithm by the treatment of binding chains in let- and letrec-environments and by context-classes. This results in a unification algorithm that can be applied to all overlaps of normal-order reductions and transformations in an extended lambda calculus with letrec, which we use as a case study.
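The equational theory of left-commutativity, ins(x, ins(y, M)) = ins(y, ins(x, M)), is what makes a list-like constructor behave as a multiset: adjacent insertions may be swapped, so only the collection of inserted elements matters. A small Python sketch of ground equality modulo this theory (illustrative only; the constructor name ins and the term encoding are assumptions of this sketch):

```python
from collections import Counter

# A ground multiset term is either the constant "empty" or a nested
# tuple ("ins", element, rest).

def elems(t):
    """Collect the inserted elements of a multiset term."""
    out = []
    while t != "empty":
        _, e, t = t
        out.append(e)
    return out

def eq_lc(s, t):
    """Ground equality modulo left-commutativity: since adjacent ins
    steps commute, two terms are equal iff they insert the same
    elements with the same multiplicities."""
    return Counter(elems(s)) == Counter(elems(t))

m1 = ("ins", "a", ("ins", "b", ("ins", "a", "empty")))
m2 = ("ins", "b", ("ins", "a", ("ins", "a", "empty")))
print(eq_lc(m1, m2))   # → True
```

Unification modulo this theory (as in the paper) is substantially harder than this ground check, since variables may stand for elements or for whole multiset tails.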
Unification of Bounded Simple Set Terms in Deductive Databases
, 1996
Abstract
In this paper we consider the problem of unification of bounded simple set terms in the field of deductive databases. Simple set terms are of the form {e1, ..., en}, where each ei is a constant or a variable; they are widely used in deductive database systems such as LDL and Coral. We consider a restricted form of unification, called "weak unification", which is mainly used in the field of deductive databases, where the database may contain both constants and variables and the program is "safe". The main results are: (a) a detailed complexity analysis of the weak unification problem, including a formula for the number of weak unifiers, and (b) the design of an optimal weak unification algorithm.

1 Introduction

Several recent papers have addressed the problem of extending logic programming and deductive database languages with sets in order to handle aggregation of objects [7, 22, 1, 2, 15]. In this paper we shall refer to the extensions proposed by [7], ...
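The flavor of unifying bounded simple set terms can be conveyed by a brute-force enumeration in Python (purely illustrative; this is neither the paper's definition of weak unification nor its optimal algorithm or counting formula, and the uppercase-string convention for variables is an assumption):

```python
from itertools import product

def is_var(e):
    return e[:1].isupper()   # assumption: variables start uppercase

def set_term_unifiers(s, t):
    """Brute force: try every mapping of the variables of both simple set
    terms into the constants occurring in either term, and keep those
    mappings that make the two terms equal as sets."""
    consts = sorted({e for e in s + t if not is_var(e)})
    vars_  = sorted({e for e in s + t if is_var(e)})
    sols = []
    for vals in product(consts, repeat=len(vars_)):
        sub = dict(zip(vars_, vals))
        if {sub.get(e, e) for e in s} == {sub.get(e, e) for e in t}:
            sols.append(sub)
    return sols

print(set_term_unifiers(["X", "a"], ["a", "b"]))   # → [{'X': 'b'}]
```

Even for small terms the candidate space is exponential in the number of variables, which is why a dedicated counting formula and an optimal algorithm, as in the paper, matter.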
A uniform approach to constraint-solving for lists, multisets, compact lists, and sets
, 2006
Abstract
The first-order theories of lists, bags, compact-lists (i.e., lists where the number of contiguous occurrences of each element is immaterial), and sets are introduced via axioms. Such axiomatizations are shown to be especially suitable for integration with free functor symbols governed by the classical Clark's axioms in the context of Constraint Logic Programming. Adaptations of the extensionality principle to the various theories taken into account are then exploited in the design of unification algorithms for the considered data structures. All the theories presented can be combined, providing frameworks to deal with several of the proposed data structures simultaneously. The unification algorithms proposed can be combined (merged) as well, to produce engines for such combination theories.
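The differences between the four theories can be made concrete through normal forms: two ground terms are equal in a theory exactly when their normal forms coincide. A small Python sketch of the four equalities (an illustration of the data structures, not of the paper's axiomatizations or unification algorithms):

```python
from collections import Counter
from itertools import groupby

# One normal form per theory; equality in the theory is equality of
# normal forms.
def as_list(xs):    return tuple(xs)                         # order and multiplicity matter
def as_bag(xs):     return frozenset(Counter(xs).items())    # order immaterial
def as_compact(xs): return tuple(k for k, _ in groupby(xs))  # contiguous repeats immaterial
def as_set(xs):     return frozenset(xs)                     # order and multiplicity immaterial

a, b = ["a", "a", "b"], ["a", "b", "a"]
print(as_list(a) == as_list(b))        # → False
print(as_bag(a) == as_bag(b))          # → True  (same elements, same counts)
print(as_compact(a) == as_compact(b))  # → False (("a","b") vs ("a","b","a"))
print(as_set(a) == as_set(b))          # → True
```

Unification must respect exactly these identifications, which is why each theory gets its own adaptation of extensionality in the paper.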
Graph Mining and Outlier Detection Meet Logic Proof Tutoring
Abstract
We introduce a new method for the analysis and evaluation of logic proofs constructed by undergraduate students, e.g. resolution or tableaux proofs. This method employs graph mining and outlier detection. The data has been obtained from a web-based system for the input of logic proofs built at FI MU. The data contains the tree structure of each proof and also temporal information about all actions that a student performed, e.g. insertion of a node into a proof or its deletion, drawing or deletion of an edge, or text manipulations. We introduce a new method for multi-level generalization of subgraphs that is useful for the characterization of logic proofs. We use this method for feature construction and perform class-based outlier detection on logic proofs represented by these new features. We show that this method helps to find unusual students' solutions and to improve semi-automatic evaluation of the solutions.

Keywords: logic proofs, resolution, educational data mining, graph mining, outlier detection
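As a much-simplified illustration of the class-based outlier idea (a plain within-class z-score on a single numeric feature, not the paper's subgraph-based features or its detection method), consider:

```python
from statistics import mean, pstdev

def class_outliers(samples, threshold=2.0):
    """samples: list of (class_label, feature_value) pairs.  Flags the
    samples whose value deviates from their own class mean by more than
    `threshold` population standard deviations.  A simplistic stand-in
    for class-based outlier detection over one feature."""
    by_class = {}
    for label, v in samples:
        by_class.setdefault(label, []).append(v)
    stats = {c: (mean(vs), pstdev(vs)) for c, vs in by_class.items()}
    flagged = []
    for label, v in samples:
        m, sd = stats[label]
        if sd > 0 and abs(v - m) / sd > threshold:
            flagged.append((label, v))
    return flagged

# E.g. proof sizes for one exercise class, with one unusual solution:
sizes = [("resolution", 5), ("resolution", 6), ("resolution", 5),
         ("resolution", 6), ("resolution", 5), ("resolution", 30)]
print(class_outliers(sizes))   # → [('resolution', 30)]
```

The paper's contribution is in constructing informative features from generalized subgraphs of the proofs; the outlier step then ranks solutions against others of the same exercise class, as sketched here.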