Results 11-20 of 28
One Binary Horn Clause is Enough
 PROCEEDINGS OF THE SYMPOSIUM ON THEORETICAL ASPECTS OF COMPUTER SCIENCE
, 1994
Abstract

Cited by 1 (1 self)
This paper proposes an equivalent form of the famous Böhm-Jacopini theorem for declarative languages. C. Böhm and G. Jacopini [1] proved that all programming can be done with at most a single while-do. That result ...
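The classical Böhm-Jacopini construction referred to above can be illustrated with a small sketch (a hypothetical example, not taken from the paper): arbitrary flowchart control flow is folded into one while-do by introducing an explicit program-counter variable that selects the current flowchart node.

```python
# A minimal sketch of the Böhm-Jacopini idea: any flowchart program can be
# rewritten as a single while loop driven by a "program counter" variable.
# Original (hypothetical) program:
#   x = n; r = 1
#   while x > 0: r = r * x; x = x - 1
# folded into one while-do with explicit states:

def factorial_single_loop(n):
    x, r, pc = n, 1, 0          # pc selects the current flowchart node
    while pc != 2:              # the single while-do
        if pc == 0:             # loop-test node
            pc = 1 if x > 0 else 2
        elif pc == 1:           # loop-body node
            r, x, pc = r * x, x - 1, 0
    return r
```

The same state-variable trick works for any finite flowchart, which is what makes one while-do sufficient.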
Effectiveness
, 2011
Abstract
We describe axiomatizations of several aspects of effectiveness: effectiveness of transitions; effectiveness relative to oracles; and absolute effectiveness, as posited by the Church-Turing Thesis.
Universality and Semicomputability for Nondeterministic Programming Languages over Abstract Algebras
, 2006
Abstract
The Universal Function Theorem (UFT) originated in the 1930s with the work of Alan Turing, who proved the existence of a universal Turing machine for computations on strings over a finite alphabet. This stimulated the development of stored-program computers. Classical computability theory, including the UFT and the theory of semicomputable sets, has been extended by Tucker and Zucker to abstract many-sorted algebras, with algorithms formalized as deterministic While programs. This paper investigates the extension of this work to the nondeterministic programming language While^RA, consisting of While programs extended by random assignments, as well as sublanguages of While^RA formed by restricting the random assignments to booleans or naturals only. It also investigates the nondeterministic language GC of guarded commands. There are two topics: (1) universality of computation over abstract algebras in these languages; (2) concepts of semicomputability for these languages, and the extent to which they coincide with semicomputability for the deterministic While language. Keywords: data types, abstract computability, random assignments, guarded commands, nondeterminism.
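The two nondeterministic constructs the abstract mentions can be sketched in miniature (a toy illustration with hypothetical names, not the paper's formalism): a random assignment "x := ?" restricted to naturals, and a Dijkstra-style guarded-command choice that executes any one command whose guard holds.

```python
import random

def random_assignment(bound=10):
    """x := ? drawn from an initial segment of the naturals."""
    return random.randrange(bound)

def guarded_choice(state, commands):
    """GC-style if-fi: commands is a list of (guard, action) pairs;
    nondeterministically execute one command whose guard holds,
    aborting when no guard is enabled."""
    enabled = [act for guard, act in commands if guard(state)]
    if not enabled:
        raise RuntimeError("abort: no guard enabled")
    return random.choice(enabled)(state)

# Example: nondeterministically compute max(a, b).
state = {"a": 3, "b": 7}
state = guarded_choice(state, [
    (lambda s: s["a"] >= s["b"], lambda s: {**s, "m": s["a"]}),
    (lambda s: s["b"] >= s["a"], lambda s: {**s, "m": s["b"]}),
])
# state["m"] holds max(a, b) whichever enabled branch is chosen
```

When both guards hold (a == b), either branch may run, yet both yield the same result — the standard example of harmless nondeterminism in GC.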
Causal commutative arrows
Abstract
Arrows are a popular form of abstract computation. Being more general than monads, they are more broadly applicable, and, in particular, are a good abstraction for signal processing and dataflow computations. Most notably, arrows form the basis for a domain-specific language called Yampa, which has been used in a variety of concrete applications, including animation, robotics, sound synthesis, control systems, and graphical user interfaces. Our primary interest is in better understanding the class of abstract computations captured by Yampa. Unfortunately, arrows are not concrete enough to do this with precision. To remedy this situation, we introduce the concept of commutative arrows, which capture a non-interference property of concurrent computations. We also add an init operator that captures the causal nature of arrow effects, and identify its associated law. To study this class of computations in more detail, we define an extension to arrows called causal commutative arrows (CCA), and study its properties. Our key contribution is the identification of a normal form for CCA called causal commutative normal form (CCNF). By defining a normalization procedure, we have developed an optimization strategy that yields dramatic improvements in performance over conventional implementations of arrows. We have implemented this technique in Haskell, and conducted benchmarks that validate the effectiveness of our approach. When compiled with the Glasgow Haskell Compiler (GHC), the overall methodology can result in significant speedups.
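The normal form mentioned in the abstract has a simple operational reading: every causal commutative arrow collapses to a single pure step function paired with one initial state. A rough Python analogue (a sketch under that reading, not the authors' Haskell code) makes the shape concrete:

```python
# CCNF-style shape: loopD(initial_state, step) -- one pure step function
# plus one initial state is enough to express any causal commutative arrow.

def loopD(state, step):
    """Return a causal stream function: at each tick, step maps
    (input, state) -> (output, new_state)."""
    def run(inputs):
        s, outputs = state, []
        for x in inputs:
            y, s = step(x, s)
            outputs.append(y)
        return outputs
    return run

# Example: a running sum expressed in this normal form.
running_sum = loopD(0, lambda x, s: (s + x, s + x))
# running_sum([1, 2, 3, 4]) yields the prefix sums [1, 3, 6, 10]
```

Collapsing a network of composed arrows into one such loop is what enables the optimization the abstract reports: the composition overhead disappears into a single fused step function.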
PREPRINT Authors' information
Abstract
Why would you want to read this chapter? This chapter is about how to better understand the dynamics of computer models using both simulation and mathematical analysis. Our starting point is a computer model which is already implemented and ready to be run; our objective is to gain a thorough understanding of its dynamics. This chapter shows how computer simulation and mathematical analysis can be used together to understand the dynamics of computer models. For this purpose, we show that it is useful to see the computer model as a particular implementation of a formal model in a certain programming language. This formal model is the abstract entity defined by the input-output relation that the computer model executes, and can be seen as a function that transforms probability distributions over the set of possible inputs into probability distributions over the set of possible outputs. It is shown here that both computer simulation and mathematical analysis are extremely ...
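The distribution-transformer view described above can be made concrete with a small illustration (a hypothetical toy model, not one from the chapter): the model's output distribution is approximated by repeatedly sampling an input from the input distribution and running the stochastic model.

```python
from collections import Counter
import random

def model(x, rng):
    """Toy stochastic model: add coin-flip noise to the input."""
    return x + (1 if rng.random() < 0.5 else 0)

def output_distribution(input_dist, runs=10000, seed=42):
    """Estimate the distribution the model induces on outputs, i.e. the
    formal model as a map from input distributions to output distributions."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(runs):
        x = rng.choices(list(input_dist), weights=list(input_dist.values()))[0]
        counts[model(x, rng)] += 1
    return {k: v / runs for k, v in counts.items()}

# Input uniform on {0, 1} maps to an output distribution that is
# approximately {0: 1/4, 1: 1/2, 2: 1/4}.
dist = output_distribution({0: 0.5, 1: 0.5})
```

Here simulation estimates the output distribution, while for a model this simple the exact distribution can also be derived by hand — the complementarity the chapter argues for.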
Turing-Completeness of Polymorphic Stream Equation Systems
Abstract
Polymorphic stream functions operate on the structure of streams, infinite sequences of elements, without inspecting the contained data, having to work uniformly on all streams over all signatures. A natural, yet restrictive class of polymorphic stream functions comprises those definable by a system of equations using only stream constructors and destructors and recursive calls. Using methods reminiscent of prior results in the field, we first show this class consists of exactly the computable polymorphic stream functions. Using much more intricate techniques, our main result states this holds true even for unary equations free of mutual recursion, yielding an elegant model of Turing-completeness in a severely restricted environment and allowing us to recover previous complexity results in a much more restricted setting.
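The restricted equation style the abstract describes — only constructors, destructors, and recursion, never inspecting elements — can be sketched with Python generators (an informal analogue, not the paper's formal system); here the equation `every_other s = cons(head s, every_other(tail(tail s)))` drops every second element:

```python
from itertools import count, islice

def head(s):
    """Stream destructor: take the first element (advancing the stream,
    so the iterator afterwards plays the role of tail s)."""
    return next(s)

def every_other(s):
    """every_other s = cons(head s, every_other(tail(tail s))).
    Polymorphic: it only rearranges structure, never looks at the data."""
    yield head(s)               # cons the first element ...
    head(s)                     # ... discard the second (tail . tail)
    yield from every_other(s)   # recursive call on the remaining stream

# islice observes a finite prefix of the infinite result stream.
prefix = list(islice(every_other(count()), 5))
```

Because the definition never touches element values, it works verbatim on streams over any element type — the uniformity that makes these functions polymorphic.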
Branch-Coverage Testability Transformation for Unstructured Programs (doi:10.1093/comjnl/bxh093)
, 2005
Abstract
Test data generation by hand is a tedious, expensive and error-prone activity, yet testing is a vital part of the development process. Several techniques have been proposed to automate the generation of test data, but all of these are hindered by the presence of unstructured control flow. This paper addresses the problem using testability transformation. Testability transformation does not preserve the traditional meaning of the program; rather, it deals with preserving test-adequate sets of input data. This requires new equivalence relations which, in turn, entail novel proof obligations. The paper illustrates this using the branch coverage adequacy criterion and develops a branch adequacy equivalence relation and a testability transformation for restructuring. It then presents a proof that the transformation preserves branch adequacy.
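The branch adequacy criterion underlying the equivalence relation above can be sketched in a few lines (a toy illustration, not the paper's transformation): a test set is branch-adequate when it drives every branch outcome at least once, and a testability transformation only has to preserve which test sets are adequate, not what the program computes.

```python
def branches_covered(inputs):
    """Run a tiny program under test and record which branch outcomes
    each input exercises."""
    covered = set()
    for x in inputs:
        if x > 0:                       # one predicate, two branch outcomes
            covered.add(("x>0", True))
        else:
            covered.add(("x>0", False))
    return covered

def is_branch_adequate(inputs):
    """Adequate iff both outcomes of every branch were taken."""
    return branches_covered(inputs) == {("x>0", True), ("x>0", False)}

# {5} misses the false branch; {5, -1} covers both outcomes.
```

A transformed program is branch-adequacy equivalent to the original precisely when `is_branch_adequate` would classify every test set the same way for both.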