Results 1-10 of 33
Data structure as topological spaces
In Proceedings of the 3rd International Conference on Unconventional Models of Computation (UMC'02)
, 2002
Abstract

Cited by 19 (6 self)
Abstract. In this paper, we propose a topological metaphor for computations: computing consists in moving through a path in a data space and making some elementary computations along this path. This idea underlies an experimental declarative programming language called MGS. MGS introduces the notion of topological collection: a set of values organized by a neighborhood relationship. The basic computation step in MGS relies on the notion of path: a path C is substituted for a path B in a topological collection A. This step is called a transformation, and several features are proposed to control the transformation applications. By changing the topological structure of the collection, the underlying computational model is changed. Thus, MGS enables a unified view of several computational mechanisms. Some of them are initially inspired by biological or chemical processes (Gamma and the CHAM, Lindenmayer systems, Paun systems, and cellular automata).
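The path-substitution step described in this abstract can be illustrated on the simplest topological collection: a sequence whose neighborhood relation is left/right adjacency. The sketch below is an illustrative Python rendering, not MGS syntax; `transform`, `match`, and `rewrite` are invented names.

```python
# An illustrative Python rendering (not MGS syntax) of the basic MGS
# computation step on the simplest topological collection: a sequence
# whose neighborhood relation is left/right adjacency.

def transform(collection, match, rewrite):
    """Repeatedly substitute the path produced by `rewrite` for a
    matching path (here: a pair of neighbors) until no path matches."""
    out = list(collection)
    i = 0
    while i < len(out) - 1:
        path = (out[i], out[i + 1])
        if match(path):
            out[i:i + 2] = rewrite(path)
            i = 0  # restart the scan; other application strategies exist
        else:
            i += 1
    return out

# Classic example: sorting as the repeated substitution of a decreasing
# pair of neighbors by its sorted version.
result = transform([3, 1, 4, 1, 5],
                   lambda p: p[0] > p[1],
                   lambda p: [p[1], p[0]])
print(result)  # [1, 1, 3, 4, 5]
```

Changing the neighborhood relation (e.g., to a grid or a graph) changes which paths exist, and hence the computational model the same rule expresses.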
Extent Analysis of Data Fields
, 1994
Abstract

Cited by 13 (4 self)
Data parallelism means operating on distributed tables, data fields, in parallel. An abstract model of data parallelism treats data fields as functions explicitly restricted to a finite set. Data parallel functional languages based on this view will reach a very high level of abstraction. In this report we consider two static analyses that, when successful, give information about the extent of a recursively defined data field. This information can be used to preallocate the data fields and map them efficiently to distributed memory, and to aid the static scheduling of operations. The analyses are cast in the framework of abstract interpretation: a forward analysis propagates restrictions on inputs to restrictions on outputs, and a backward analysis propagates restrictions the other way. Fixpoint iteration can sometimes be used to solve the equations that arise, and we identify some cases where this is possible.
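The forward analysis sketched in this abstract can be illustrated by a small fixpoint iteration over interval extents. The recursive data field being abstracted and the transfer function are invented for illustration; they are not taken from the report.

```python
# A sketch of forward extent analysis by fixpoint iteration over
# intervals, in the spirit of abstract interpretation. The recursive
# data field and its transfer function are invented for illustration.

BOTTOM = None  # "no extent known yet"

def join(a, b):
    """Least upper bound of two interval extents."""
    if a is BOTTOM:
        return b
    if b is BOTTOM:
        return a
    return (min(a[0], b[0]), max(a[1], b[1]))

def analyse(transfer, max_iter=100):
    """Iterate the transfer function from bottom until the extent
    stabilises (a widening operator would guarantee termination)."""
    ext = BOTTOM
    for _ in range(max_iter):
        nxt = join(ext, transfer(ext))
        if nxt == ext:
            break
        ext = nxt
    return ext

# Example: a field defined at index 0 and, recursively, at i+1 for any
# defined i, with indices capped at 9. The analysis finds extent (0, 9),
# so 10 elements can be preallocated.
def transfer(ext):
    if ext is BOTTOM:
        return (0, 0)               # base case of the recursion
    return (0, min(ext[1] + 1, 9))  # recursive case: extend by one

print(analyse(transfer))  # (0, 9)
```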
A Case Study: Effects of WITH-Loop-Folding on the NAS Benchmark MG in SAC
Proceedings of IFL '98, LNCS 1595
, 1999
Abstract

Cited by 11 (6 self)
SAC is a functional C variant with efficient support for high-level array operations. This paper investigates the applicability of a SAC-specific optimization technique called with-loop-folding to real-world applications. As an example program which originates from the Numerical Aerodynamic Simulation (NAS) Program developed at NASA Ames Research Center, the so-called NAS benchmark MG is chosen. It comprises a kernel from the NAS Program which implements 3-dimensional multigrid relaxation. Several runtime measurements exploit two different benefits of with-loop-folding: First, an overall speedup of about 20% can be observed. Second, a comparison between the runtimes of a hand-optimized specification and of APL-like specifications yields identical runtimes, although a naive compilation that does not apply with-loop-folding leads to slowdowns of more than an order of magnitude. Furthermore, with-loop-folding makes a slight variation of the algorithm feasible which substantially simplifies the program specification and requires less memory during execution. Finally, the optimized runtimes are compared against runtimes gained from the original Fortran program, which shows that for different problem sizes, the code generated from the SAC program not only reaches the execution times of the code generated from the Fortran program but even outperforms them by about 10%.
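The effect of with-loop-folding can be illustrated (in Python rather than SAC) as fusing two element-wise array traversals into one, eliminating the intermediate array that a naive compilation would allocate:

```python
# A Python sketch (not SAC) of what with-loop-folding does: two
# element-wise array operations, each corresponding to one WITH-loop,
# are folded into a single traversal, eliminating the intermediate array.

def unfused(a):
    tmp = [x + 1 for x in a]         # first with-loop:  tmp = a + 1
    return [x * 2 for x in tmp]      # second with-loop: result = tmp * 2

def fused(a):
    return [(x + 1) * 2 for x in a]  # folded: one loop, no tmp allocated

assert unfused([1, 2, 3]) == fused([1, 2, 3]) == [4, 6, 8]
```

Avoiding `tmp` saves both the traversal and the memory for the intermediate result, which is where the reported speedup and reduced memory use come from.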
Topological collections, transformations and their application to the modeling and the simulation of dynamical systems
In: Rewriting Techniques and Applications (RTA'03), LNCS, vol. 2706
, 2003
Abstract

Cited by 6 (2 self)
I take the opportunity given by this invited talk to promote two ideas: (1) a topological point of view can fertilize the notion of rewriting and (2) this topological approach of rewriting is at the core of the modeling and the simulation of an emerging class of dynamical systems (DS): the DS that exhibit a dynamical ...
A Transformational Approach which Combines Size Inference and Program Optimization
Semantics, Applications, and Implementation of Program Generation (SAIG'01), Lecture Notes in Computer Science 2196
, 2001
Data Parallel Programming: A Survey and a Proposal for a New Model
, 1993
Abstract

Cited by 5 (0 self)
We give a brief description of what we consider to be data parallel programming and processing, trying to pinpoint the typical problems and pitfalls that occur. We then proceed with a short annotated history of data parallel programming, and sketch a taxonomy in which data parallel languages can be classified. Finally we present our own model of data parallel programming, which is based on the view of parallel data collections as functions. We believe that this model has a number of distinct advantages, such as being abstract, independent of implicitly assumed machine models, and general.
Data Fields
 In Proc. Workshop on Generic Programming, Marstrand
, 1998
Abstract

Cited by 4 (4 self)
This position paper describes the data field model, a general model for indexed data structures. The aim of this model is to capture the essence of the style of programming where computing on data structures is expressed by operations directly on the structures rather than operations on the individual elements. Array and data-parallel languages support this programming style, and functional languages often provide second-order operations on lists and other data structures for the same purpose. The data field model is designed to be abstract enough to encompass a wide range of explicitly or implicitly indexed structures. Thus, algorithms which are expressed in terms of data fields and general operations on them will be independent of the choice of structure from this range, i.e., generic w.r.t. this choice. This means that the data field approach has something in common with polytypic programming and the theory of shapes.
Contribution to Semantics of a Data-Parallel Logic Programming Language
 Post International Logic Programming Symposium Workshop on Parallel Logic Programming Systems
, 1995
Abstract

Cited by 4 (4 self)
We propose an alternative approach to the usual introduction of parallelism in logic programming. Instead of detecting the intrinsic parallelism by an automatic and complex dataflow analysis, or upgrading standard logic languages with explicit concurrent control structures leading to task-oriented languages, we tightly integrate the concepts of the data-parallel programming model and of logic programming in a kernel language, called DPLog. It offers a simple centralized and synchronous view to the programmer. We give this language a declarative and a distributed asynchronous operational semantics. The equivalence theorem of these semantics establishes the soundness of the implementation. The expressiveness of the language is illustrated on examples. Keywords: Logic programming, Data-parallel languages, Design of programming languages, Semantics, MIMD architectures. Introduction: The introduction of parallelism in programming languages makes it possible to extend the expressiveness ...
Data Flow Analysis of Recursive Structures
, 1996
Abstract

Cited by 4 (2 self)
Most imperative languages only offer arrays as "first-class" data structures. Other data structures, especially recursive data structures such as trees, have to be manipulated using explicit control of memory, i.e., through pointers to explicitly allocated portions of memory. We believe that this severe limitation is mainly due to historical reasons, and this paper will try to demonstrate that modern analysis techniques, such as data flow analysis, make it possible to cope with the compilation problems associated with recursive data structures. As a matter of fact, recursion in the flow of control is also a current open issue in automatic parallelization: to our knowledge, no theory allows the parallelization of, e.g., recursive Pascal programs. This paper uniformly handles both issues. We propose a kernel language that manipulates recursive data structures in an elegant, algebraic way. In this preliminary work, both data and control recursive structures are restricted, so that a data flow a...
The Data Field Model
, 2001
Abstract

Cited by 3 (2 self)
Indexed data structures are prevalent in many programming applications. Collection-oriented languages provide means to operate directly on these structures, rather than having to loop or recurse through them. This style of programming will often yield clear and succinct programs. However, these programming languages will often provide only a limited choice of indexed data types and primitives, and the exact semantics of these primitives will sometimes vary with the data type and language. In this paper we develop a unifying semantic model for indexed data structures. The purpose is to support the construction of abstract data types and language features for such structures from first principles, such that they are largely generic over many kinds of data structures. The use of these abstract data types can make programs and their semantics less dependent on the actual data structure. This makes programs more portable across different architectures and facilitates the early design phase. The model is a generalisation of arrays, which we call data fields: these are functions with explicit information about their domains. This information can be conventional array bounds, but it could also define other shapes, for instance sparse ones. Data fields can be interpreted as partial functions, and we define a metalanguage for partial functions. In this language we define abstract versions of collection-oriented operations, and we show a number of identities for them. This theory is used to guide the design of data fields and their operations so they correspond closely to the more abstract notion of partial functions. We define phi-abstraction, a lambda-like syntax for defining data fields in a shape-independent manner, and prove a theorem which relates phi-abstraction and lambda-abstraction semantically. We also define a small data field language whose semantics is given by formal data fields, and give examples of data field programming for parallel algorithms with arrays and sparse structures, database querying and computing, and specification of symbolic drawings.
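The core idea of this abstract, a data field as a function paired with explicit domain information, plus a phi-abstraction for building fields shape-independently, can be sketched as follows. `DataField` and `phi` are illustrative names, not the paper's formal definitions.

```python
# A minimal sketch of the data field idea: a function paired with
# explicit information about its domain. `DataField` and `phi` are
# illustrative names, not the paper's formal definitions.

class DataField:
    def __init__(self, func, domain):
        self.func = func                  # partial function: index -> value
        self.domain = frozenset(domain)   # explicit extent information

    def __call__(self, i):
        if i not in self.domain:
            raise KeyError(i)             # outside the field's domain
        return self.func(i)

def phi(expr, domain):
    """phi-abstraction: build a data field from an index expression in
    a shape-independent way -- the same expression works for any domain."""
    return DataField(expr, domain)

# A dense "array" and a sparse two-dimensional field built from the
# same abstraction:
dense = phi(lambda i: i * i, range(5))
sparse = phi(lambda i: 1.0, {(0, 0), (2, 3), (4, 1)})

print([dense(i) for i in range(5)])  # [0, 1, 4, 9, 16]
print(sparse((2, 3)))                # 1.0
```

Because the domain is explicit rather than implied by a storage layout, the same `phi` works for dense bounds and sparse index sets alike, which is the shape-independence the abstract describes.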