Results 11–20 of 27
Generalised Constraint Propagation Over the CLP Scheme
 Journal of Logic Programming
, 1992
Abstract

Cited by 11 (4 self)
Constraint logic programming is often described as logic programming with unification replaced by constraint solving over a computation domain. There is another, very different, CLP paradigm based on constraint satisfaction, where program-defined goals can be treated as constraints and handled using propagation. This paper proposes a generalisation of propagation, which enables it to be applied on arbitrary computation domains, revealing that the two paradigms of CLP are orthogonal, and can be freely combined. The main idea behind generalised propagation is to use whatever constraints are available over the computation domain to express restrictions on problem variables. Generalised propagation on a goal G requires that the system extracts a constraint approximating all the answers to G. The paper introduces a generic algorithm for generalised propagation called topological branch and bound which avoids enumerating all the answers to G. Generalised propagation over the Herbrand univers...
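The extraction step described in this abstract can be illustrated with a small sketch. This is not the paper's topological branch and bound, which avoids enumerating answers; it is the naive version, assuming a numeric domain and a finite, already-enumerated answer set. All names (`interval_hull`, `answers_to_G`) are illustrative.

```python
def interval_hull(answers):
    """Approximate a set of answer tuples by per-variable intervals.

    `answers` is a non-empty list of equal-length numeric tuples;
    the result maps each variable position to (lo, hi) bounds that
    every answer satisfies -- the extracted approximating constraint.
    """
    arity = len(answers[0])
    return {i: (min(a[i] for a in answers), max(a[i] for a in answers))
            for i in range(arity)}

# Suppose goal G has three answers over variables (X, Y):
answers_to_G = [(1, 5), (2, 3), (4, 7)]
hull = interval_hull(answers_to_G)
# Every answer satisfies 1 <= X <= 4 and 3 <= Y <= 7, so these bounds
# can be propagated to other constraints without re-enumerating G.
```

The point of the generalisation is that the extracted constraint need not be an interval: it can be whatever constraint language the computation domain offers.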
Offline Constraint Propagation for Efficient HPSG Processing
, 1996
Abstract

Cited by 10 (2 self)
A major goal of a linguist writing HPSG theories is to express very general constraints to capture linguistic phenomena, leaving as much as possible underspecified. When such an HPSG theory is implemented faithfully, either processing is inefficient because only little information is available to guide the constraint resolution process, or the linguistic theory is annotated with information to guide processing. Usually such annotations are provided manually, a very time-consuming and error-prone process which can change the original linguistic theory. In this paper we show that it is possible to automatically make a theory more specific at those places where linguistically motivated underspecification would lead to inefficient processing. The authors are listed alphabetically. Authors' respective addresses: Seminar für Sprachwissenschaft, Universität Tübingen, Kl. Wilhelmstr. 113, D-72074 Tübingen, Germany; dm@sfs.nphil.un
A Computational Treatment of Lexical Rules in HPSG as Covariation in Lexical Entries
 Computational Linguistics
, 1997
Abstract

Cited by 10 (0 self)
This paper proposes a new computational treatment of lexical rules as used in the HPSG framework. A compiler is described which translates a set of lexical rules and their interaction into a definite clause encoding, which is called by the base lexical entries in the lexicon. This way, the disjunctive possibilities arising from lexical rule application are encoded as systematic covariation in the specification of lexical entries. The compiler ensures the automatic transfer of properties not changed by a lexical rule. Program transformation techniques are used to advance the encoding. The final output of the compiler constitutes an efficient computational counterpart of the linguistic generalizations captured by lexical rules and allows on-the-fly application of lexical rules.
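As a rough illustration of the covariation idea (a hedged sketch, not the described compiler's definite clause encoding), the following computes the set of entries derivable from a base lexical entry under a set of lexical rules, with properties not changed by a rule transferred automatically. The `passive` rule and all feature names are invented for the example.

```python
def close_under_rules(entry, rules, max_depth=3):
    """Return all entries derivable from `entry` by applying `rules`."""
    seen = {tuple(sorted(entry.items()))}
    frontier = [entry]
    for _ in range(max_depth):          # bound the iteration for the sketch
        new_frontier = []
        for e in frontier:
            for rule in rules:
                derived = rule(e)
                if derived is None:     # rule not applicable to this entry
                    continue
                key = tuple(sorted(derived.items()))
                if key not in seen:
                    seen.add(key)
                    new_frontier.append(derived)
        frontier = new_frontier
    return [dict(k) for k in sorted(seen)]

# Hypothetical lexical rule: flip one feature, keep everything else.
def passive(e):
    if e.get("voice") == "active":
        out = dict(e)                   # unchanged properties transfer
        out["voice"] = "passive"
        return out
    return None

base = {"lemma": "love", "voice": "active"}
variants = close_under_rules(base, [passive])
# -> the base entry plus its passive variant
```

The paper's contribution is to encode exactly this space of variants lazily, as covariation inside the lexical entries, rather than expanding it up front.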
Termination of Logic Programs with block Declarations Running in Several Modes
 Proceedings of the 10th Symposium on Programming Language Implementations and Logic Programming, LNCS
, 1998
Abstract

Cited by 9 (5 self)
We show how termination of logic programs with delay declarations can be proven. Three features are distinctive of this work: (a) we assume that predicates can be used in several modes; (b) we show that block declarations, which are a very simple delay construct, are sufficient; (c) we take the selection rule into account, assuming it to be as in most Prolog implementations. Our method is based on identifying the so-called robust predicates, for which the textual position of an atom using this predicate is irrelevant. The method can be used to verify existing programs, and to assist in writing new programs. As a by-product, we also show how programs can be proven to be free from occur-check and floundering.
Abstract conjunctive partial deduction using regular types and its application to model checking
 In Proc. of LOPSTR, number 2372 in LNCS
, 2001
Abstract

Cited by 8 (0 self)
We present an abstract partial deduction technique which uses regular types as its domain and which can handle conjunctions, and thus perform deforestation and tupling. We provide a detailed description of all the required operations and present an implementation within the ecce system. We discuss the power of this new specialisation algorithm, especially in the light of verifying and specialising infinite state process algebras. Here, our new algorithm can provide a more precise treatment of synchronisation and can be used for refinement checking.
Bottom Up Information Propagation for Partial Deduction (Extended Abstract)
 Proceedings of the International Workshop on Specialization of Declarative Programs and its Applications
, 1997
Abstract

Cited by 7 (3 self)
Wim Vanhoof, Departement Computerwetenschappen, Katholieke Universiteit Leuven, Celestijnenlaan 200A, B-3001 Heverlee, Belgium; email: wimvh@cs.kuleuven.ac.be; Tel: +32 16 327638, Fax: +32 16 327996. Traditional top-down specialisation techniques are not optimal for specialising programs containing a lot of internal structure handling, such as programs containing abstract data types or meta-programs. In this abstract, we discuss the difficulties top-down specialisers have when specialising these kinds of programs. The difficulties arise from the fact that unfolding works in a top-down way, whereas some information flows in a bottom-up way. Therefore, top-down techniques must be able to unfold deeply enough to reach the necessary information, and subsequently be able to keep it during specialisation. We therefore propose a program transformation phase, targeted to be interleaved with a completely general and fully automatic top-down partial deduction scheme, which consists of pro...
Partial Deduction System
 In Proc. of the ILPS'97 Workshop on Tools and Environments for (Constraint) Logic Programming, U.P
, 1997
Abstract

Cited by 5 (1 self)
We present the fully automatic partial deduction system ecce, which can be used to specialise and optimise logic programs. We describe the underlying principles of ecce and illustrate some of the potential application areas. Interesting possibilities of cross-fertilisation with other fields such as reachability analysis of concurrent systems and inductive theorem proving are highlighted and substantiated.

1 Introduction

Program specialisation, also called partial evaluation or partial deduction, is an automatic technique for program optimisation. The central idea is to specialise a given source program for a particular application domain. Program specialisation encompasses traditional compiler optimisation techniques, such as constant folding and inlining, but uses more aggressive transformations, yielding both the possibility of obtaining (much) greater speedups and more difficulty in controlling the transformation process. In addition to achieving important speedups, program special...
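The notion of specialising a source program for a known part of its input can be sketched in a few lines. This is a generic partial evaluation toy, unrelated to ecce's internals: for a fixed exponent, the recursion in `power` is unfolded at specialisation time, leaving a residual program containing only multiplications.

```python
def power(x, n):
    """The general source program: x ** n by recursion."""
    return 1 if n == 0 else x * power(x, n - 1)

def specialise_power(n):
    """Partially evaluate `power` for a fixed exponent n, returning
    residual Python source with the recursion fully unfolded."""
    body = " * ".join(["x"] * n) if n > 0 else "1"
    return f"def power_{n}(x):\n    return {body}\n"

src = specialise_power(3)     # residual program: return x * x * x
namespace = {}
exec(src, namespace)          # load the residual program
power_3 = namespace["power_3"]
assert power_3(2) == power(2, 3) == 8
```

The residual `power_3` performs no tests on `n` and no recursive calls, which is exactly the kind of speedup specialisation aims at; industrial-strength systems differ in how aggressively and safely they control this unfolding.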
Polynomial Time Inference of A Subclass of Context-free Transformations
 In Proceedings of the Fifth Workshop on Computational Learning Theory (COLT-92)
Abstract

Cited by 5 (2 self)
This paper deals with a class of Prolog programs, called context-free term transformations (CFT). We present a polynomial time algorithm to identify a subclass of CFT, whose program consists of at most two clauses, from positive data. The algorithm uses the 2-mmg (2-minimal multiple generalization) algorithm, which is a natural extension of Plotkin's least generalization algorithm, to reconstruct the pair of heads of the unknown program. Using this algorithm, we show the consistent and conservative polynomial time identifiability of the class of tree languages defined by CFTFB uniq together with tree languages defined by pairs of two tree patterns, both of which are proper subclasses of CFT, in the limit from positive data.

1 Introduction

The problem considered in this paper is, given an infinite sequence of facts which are true in the unknown model, to identify a Prolog program P that defines the unknown model M in the limit. We deal with the class CFTFB uniq of context-free term transfor...
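Plotkin's least generalization algorithm, which the 2-mmg procedure extends, anti-unifies two terms by recursing through matching structure and mapping each distinct disagreement pair to a fresh variable, reused consistently. A minimal sketch over a tuple encoding of terms (the representation, and the `X0, X1, ...` variable naming, are assumptions of the example):

```python
def lgg(s, t, table=None):
    """Least general generalization of terms s and t.

    Terms are nested tuples ("functor", arg1, ...); anything else
    is treated as a constant. `table` maps disagreement pairs to
    variable names so that repeated pairs share one variable.
    """
    if table is None:
        table = {}
    if s == t:
        return s
    same_shape = (isinstance(s, tuple) and isinstance(t, tuple)
                  and s[0] == t[0] and len(s) == len(t))
    if same_shape:
        return (s[0],) + tuple(lgg(a, b, table) for a, b in zip(s[1:], t[1:]))
    # disagreement: introduce (or reuse) a variable for this pair
    if (s, t) not in table:
        table[(s, t)] = f"X{len(table)}"
    return table[(s, t)]

# lgg(f(a, a), f(b, b)) = f(X0, X0): the repeated disagreement pair
# (a, b) reuses one variable, which makes the result *least* general.
print(lgg(("f", "a", "a"), ("f", "b", "b")))
```

Reconstructing a pair of clause heads, as the 2-mmg algorithm does, generalises this from one covering term to a minimal set of two.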