Results 11–20 of 61
Algebra of logic programming
International Conference on Logic Programming, 1999
Abstract

Cited by 20 (3 self)
At present, the field of declarative programming is split into two main areas based on different formalisms; namely, functional programming, which is based on lambda calculus, and logic programming, which is based on first-order logic. There are currently several language proposals for integrating the expressiveness of these two models of computation. In this thesis we work towards an integration of the methodology from the two research areas. To this end, we propose an algebraic approach to reasoning about logic programs, corresponding to the approach taken in functional programming. In the first half of the thesis we develop and discuss a framework which forms the basis for our algebraic analysis and transformation methods. The framework is based on an embedding of definite logic programs into lazy functional programs in Haskell, such that both the declarative and the operational semantics of the logic programs are preserved. In spite of its conciseness and apparent simplicity, the embedding proves to have many interesting properties and it gives rise to an algebraic semantics of logic programming. It also allows us to reason about logic programs in a simple calculational style, using rewriting and the algebraic laws of combinators. In the embedding, the meaning of a logic program arises compositionally from the meaning of its constituent subprograms and the combinators that connect them. In the second half of the thesis we explore applications of the embedding to the algebraic transformation of logic programs. A series of examples covers simple program derivations, where our techniques simplify some of the current techniques. Another set of examples explores applications of the more advanced program development techniques from the Algebra of Programming by Bird and de Moor [18], where we expand the techniques currently available for logic program derivation and optimisation.
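The kind of embedding this abstract describes can be illustrated with a minimal sketch (the names and definitions below are illustrative, not the thesis's actual ones): a predicate becomes a function from an answer to a lazy list of answers, and conjunction and disjunction of goals become combinators over such functions.

```haskell
-- A predicate maps an answer to a lazy list of resulting answers.
type Pred a = a -> [a]

(&&&) :: Pred a -> Pred a -> Pred a   -- conjunction: thread answers through both goals
(p &&& q) x = concatMap q (p x)

(|||) :: Pred a -> Pred a -> Pred a   -- disjunction: collect answers from either goal
(p ||| q) x = p x ++ q x

true, false :: Pred a
true x  = [x]    -- succeeds once: the unit of (&&&)
false _ = []     -- fails: the unit of (|||)
```

Under this reading, algebraic laws such as `true` being a unit of conjunction follow directly from list laws, which is what makes the calculational style mentioned above possible.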
First Class Patterns
In 2nd International Workshop on Practical Aspects of Declarative Languages, volume 1753 of LNCS, 2000
Abstract

Cited by 17 (1 self)
Pattern matching is a great convenience in programming. However, pattern matching has its problems: it conflicts with data abstraction; it is complex (at least in Haskell, which has pattern guards, irrefutable patterns, n+k patterns, as-patterns, etc.); it is a source of runtime errors; and lastly, one cannot abstract over patterns as they are not a first-class language construct. This paper proposes a simplification of pattern matching that makes patterns first class. The key idea is to treat patterns as functions of type a → Maybe b, i.e., functions that return either Nothing or Just b; thus, patterns and pattern combinators can be written as functions in the language. 1 Introduction A hotly debated issue in the language Haskell [HJW92] has been patterns. What are their semantics? Do we want n+1 patterns? Do we need @-patterns? When do we match lazily and when do we match strictly? Do we need to extend patterns with pattern guards? And so on. In this paper I will propose, not another extension, but a simplificat...
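As a sketch of that key idea (with illustrative names, not the paper's actual library), a pattern is simply a function into Maybe, and combinators over patterns are ordinary higher-order functions:

```haskell
-- A pattern that matches an a and binds a b is just a function.
type Pattern a b = a -> Maybe b

-- An example pattern: match a non-empty list, binding head and tail.
pCons :: Pattern [a] (a, [a])
pCons (x:xs) = Just (x, xs)
pCons []     = Nothing

-- Alternation: try the first pattern, fall back to the second.
pOr :: Pattern a b -> Pattern a b -> Pattern a b
pOr p q x = maybe (q x) Just (p x)

-- Map over the bound result, e.g. to project out one binding.
pMap :: (b -> c) -> Pattern a b -> Pattern a c
pMap f p = fmap f . p
```

Because patterns are first-class values here, they can be stored, passed to functions, and composed without any extension to the language.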
Embedding Prolog in Haskell
Department of Computer Science, University of Utrecht, 1999
Abstract

Cited by 16 (4 self)
The distinctive merit of the declarative reading of logic programs is the validity of all the laws of reasoning supplied by the predicate calculus with equality. Surprisingly many of these laws are still valid for the procedural reading; they can therefore be used safely for algebraic manipulation, program transformation and optimisation of executable logic programs. This paper lists a number of common laws, and proves their validity for the standard (depth-first search) procedural reading of Prolog. They also hold for alternative search strategies, e.g. breadth-first search. Our proofs of the laws are based on the standard algebra of functional programming, after the strategies have been given a rather simple implementation in Haskell.
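A hedged illustration of the kind of law involved (a toy version of such an embedding, not the paper's code): under the depth-first list reading, right-distribution of conjunction over disjunction holds exactly, while the symmetric left-distribution law holds only up to the order of answers.

```haskell
-- Predicates as answer-stream transformers, depth-first (list) reading.
type Goal a = a -> [a]

conj, disj :: Goal a -> Goal a -> Goal a
conj p q x = concatMap q (p x)   -- Prolog's ','
disj p q x = p x ++ q x          -- Prolog's ';'
```

For any p, q, r, `conj (disj p q) r` produces the same answer list as `disj (conj p r) (conj q r)`, because `concatMap` distributes over `(++)`; the dual law `conj p (disj q r) = disj (conj p q) (conj p r)` yields the same answers but possibly in a different order, which is exactly the kind of distinction the procedural reading forces such proofs to track.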
Variability-aware parsing in the presence of lexical macros and conditional compilation
In Proc. 2011 ACM Conference on Object-Oriented Programming Systems, Languages, and Applications, 2011
Abstract

Cited by 12 (4 self)
In many projects, lexical preprocessors are used to manage different variants of the project (using conditional compilation) and to define compile-time code transformations (using macros). Unfortunately, while being a simple way to implement variability, conditional compilation and lexical macros hinder automatic analysis, even though such analysis is urgently needed to combat variability-induced complexity. To analyze code with its variability, we need to parse it without preprocessing it. However, current parsing solutions use unsound heuristics, support only a subset of the language, or suffer from exponential explosion. As part of the TypeChef project, we contribute a novel variability-aware parser that can parse almost all unpreprocessed code without heuristics in practicable time. Beyond the obvious task of detecting syntax errors, our parser paves the road for further analysis, such as variability-aware type checking. We implement variability-aware parsers for Java and GNU C and demonstrate practicability by parsing the product line MobileMedia and the entire x86 architecture of the Linux kernel with 6065 variable features.
The integration of functions into logic programming
The Journal of Logic Programming, 1994
Abstract

Cited by 10 (0 self)
This paper presents a new program analysis framework to approximate call patterns and their results in functional logic computations. We consider programs containing non-strict, non-deterministic operations in order to make the analysis applicable to modern functional logic languages like Curry or TOY. For this purpose, we present a new fixpoint characterization of functional logic computations w.r.t. a set of initial calls. We show how programs can be analyzed by approximating this fixpoint. The results of such an approximation have various applications, e.g., program optimization as well as verifying safety properties of programs.
Parser combinators for ambiguous left-recursive grammars
In ????, pages, 2007
Abstract

Cited by 10 (0 self)
Parser combinators are higher-order functions used to build parsers as executable specifications of grammars. Some existing implementations are only able to handle limited ambiguity, some have exponential time and/or space complexity for ambiguous input, and most cannot accommodate left-recursive grammars. This paper describes combinators, implemented in Haskell, which overcome all of these limitations.
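For contrast, here is the naive list-of-successes style that work of this kind improves on (illustrative code, not the paper's combinators): it returns all parses, so it copes with ambiguity, but it diverges on left-recursive grammars, which is precisely the limitation the paper removes.

```haskell
-- A parser returns every way of consuming a prefix of the input.
newtype Parser a = Parser { runParser :: String -> [(a, String)] }

item :: Parser Char                 -- consume any one character
item = Parser (\s -> case s of { (c:cs) -> [(c, cs)]; [] -> [] })

char :: Char -> Parser Char         -- consume one specific character
char c = Parser (\s -> case s of { (x:xs) | x == c -> [(x, xs)]; _ -> [] })

alt :: Parser a -> Parser a -> Parser a   -- ambiguous choice: keep all parses
alt (Parser p) (Parser q) = Parser (\s -> p s ++ q s)

andThen :: Parser a -> Parser b -> Parser (a, b)  -- sequencing
andThen (Parser p) (Parser q) =
  Parser (\s -> [((a, b), s2) | (a, s1) <- p s, (b, s2) <- q s1])
```

A grammar rule like `e ::= e '+' t | t` translated directly into `alt` and `andThen` calls itself before consuming input, so this naive version loops; memoising curtailed recursion, as the paper describes, is what makes such rules usable.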
Functional pearl: I am not a number–I am a free variable
 In Proc. Haskell workshop
Abstract

Cited by 9 (2 self)
In this paper, we show how to manipulate syntax with binding using a mixed representation of names for free variables (with respect to the task in hand) and de Bruijn indices [5] for bound variables. By doing so, we retain the advantages of both representations: naming supports easy, arithmetic-free manipulation of terms; de Bruijn indices eliminate the need for α-conversion. Further, we have ensured that not only the user but also the implementation need never deal with de Bruijn indices, except within key basic operations. Moreover, we give a hierarchical representation for names which naturally reflects the structure of the operations we implement. Name choice is safe and straightforward. Our technology combines easily with an approach to syntax manipulation inspired by Huet's 'zippers' [10]. Without the ideas in this paper, we would have struggled to implement EPIGRAM [19]. Our example, constructing inductive elimination operators for datatype families, is but one of many where it proves invaluable.
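The mixed representation can be sketched as follows (an illustrative reconstruction, not the paper's exact definitions): free variables carry names, bound variables carry de Bruijn indices, and only two basic operations, abstraction and instantiation, ever touch the indices.

```haskell
-- Free variables are named; bound variables are de Bruijn indices.
data Expr = Free String | Bound Int | App Expr Expr | Lam Expr
  deriving (Eq, Show)

-- Bind occurrences of a named free variable, yielding a Lam body.
abstract :: String -> Expr -> Expr
abstract name = go 0
  where
    go outer (Free n) | n == name = Bound outer
                      | otherwise = Free n
    go _     (Bound i)            = Bound i
    go outer (App f a)            = App (go outer f) (go outer a)
    go outer (Lam b)              = Lam (go (outer + 1) b)

-- Replace the outermost bound variable of a Lam body with a named free one.
instantiate :: String -> Expr -> Expr
instantiate name = go 0
  where
    go outer (Bound i) | i == outer = Free name
                       | otherwise  = Bound i
    go _     (Free n)               = Free n
    go outer (App f a)              = App (go outer f) (go outer a)
    go outer (Lam b)                = Lam (go (outer + 1) b)
```

Client code builds a binder with `Lam (abstract "x" body)` and takes one apart with `instantiate`, so names are all it ever sees; the two operations are mutually inverse on well-scoped terms, which is why no α-conversion is needed.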
KiCS2: A New Compiler from Curry to Haskell
In Proc. of the 20th International Workshop on Functional and (Constraint) Logic Programming (WFLP 2011), 2011
Abstract

Cited by 9 (4 self)
In this paper we present our first steps towards a new system to compile functional logic programs of the source language Curry into purely functional Haskell programs. Our implementation is based on the idea of representing non-deterministic results as values of the data types corresponding to the results. This enables the application of various search strategies to extract values from the search space. We show by several benchmarks that our implementation can compete with or outperform other existing implementations of Curry.
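The representation idea can be sketched like this (illustrative types only, not KiCS2's generated code): non-deterministic results become trees of ordinary data with an explicit choice constructor, and search strategies are then plain traversals of that tree.

```haskell
-- Non-deterministic values as data: a result, a choice, or failure.
data ND a = Val a | Choice (ND a) (ND a) | Fail

-- Curry's (x ? y) choice operator, as a Haskell constructor.
(?) :: ND a -> ND a -> ND a
(?) = Choice

dfsValues :: ND a -> [a]              -- depth-first extraction
dfsValues (Val x)      = [x]
dfsValues (Choice l r) = dfsValues l ++ dfsValues r
dfsValues Fail         = []

bfsValues :: ND a -> [a]              -- breadth-first extraction, level by level
bfsValues t = go [t]
  where
    go [] = []
    go ts = [x | Val x <- ts] ++ go (concatMap children ts)
    children (Choice l r) = [l, r]
    children _            = []
```

Because the search space is an ordinary value, strategies such as depth-first and breadth-first search are interchangeable traversals, which is what enables the system to offer several of them.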