Results 1–10 of 16
Average-Case Analysis of Algorithms and Data Structures
, 1990
Abstract

Cited by 96 (8 self)
This report is a contributed chapter to the Handbook of Theoretical Computer Science (North-Holland, 1990). Its aim is to describe the main mathematical methods and applications in the average-case analysis of algorithms and data structures. It comprises two parts: first, we present basic combinatorial enumerations based on symbolic methods and asymptotic methods with emphasis on complex analysis techniques (such as singularity analysis, saddle point, Mellin transforms). Next, we show how to apply these general methods to the analysis of sorting, searching, tree data structures, hashing, and dynamic algorithms. The emphasis is on algorithms for which exact "analytic models" can be derived.
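A minimal illustration (not taken from the chapter itself) of the symbolic method the abstract mentions: the specification of binary trees, B = leaf + node × B × B, translates directly into the generating-function equation B(z) = 1 + z·B(z)², and its coefficients (the Catalan numbers) fall out of a simple convolution recurrence.

```python
# Sketch of the symbolic method: the specification B = 1 + Z*B*B for binary
# trees yields B(z) = 1 + z*B(z)^2; extracting [z^n] B(z) by convolution
# produces the Catalan numbers.

def binary_tree_counts(n_max):
    """Coefficients of B(z) = 1 + z*B(z)^2, computed by convolution."""
    b = [1]  # [z^0] B(z) = 1: the single leaf
    for n in range(1, n_max + 1):
        # [z^n] z*B(z)^2 = sum over left/right subtree sizes k and n-1-k
        b.append(sum(b[k] * b[n - 1 - k] for k in range(n)))
    return b

print(binary_tree_counts(6))  # Catalan numbers: [1, 1, 2, 5, 14, 42, 132]
```

The asymptotic step of the method (singularity analysis) would then read off the growth 4ⁿ·n^(−3/2) from the square-root singularity of B(z).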
Complexity Analysis for a Lazy Higher-Order Language
 In Proceedings of the 3rd European Symposium on Programming
, 1990
Abstract

Cited by 47 (2 self)
This paper is concerned with the time-analysis of functional programs. Techniques which enable us to reason formally about a program's execution costs have had relatively little attention in the study of functional programming. We concentrate here on the construction of equations which compute the time-complexity of expressions in a lazy higher-order language. The problem with higher-order functions is that complexity is dependent on the cost of applying functional parameters. Structures called cost-closures are introduced to allow us to model both functional parameters and the cost of their application. The problem with laziness is that complexity is dependent on context. Projections are used to characterise the context in which an expression is evaluated, and cost-equations are parameterised by this context-description to give a compositional time-analysis. Using this form of context information we introduce two types of time-equation: sufficient-time equations and nece...
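A hypothetical operational sketch (not the paper's cost-closure or projection machinery) of why cost under lazy evaluation depends on context: if thunks are instrumented with a shared step counter, the same lazy list costs nothing when only its head is demanded and costs its full length when every element is demanded.

```python
# Hedged sketch: instrumented thunks make the context-dependence of lazy
# evaluation cost observable. The class and helper names are illustrative,
# not from the paper.

class Thunk:
    steps = 0  # global count of forced evaluations

    def __init__(self, compute):
        self._compute = compute
        self._value = None
        self._forced = False

    def force(self):
        if not self._forced:          # memoised, as in call-by-need
            Thunk.steps += 1
            self._value = self._compute()
            self._forced = True
        return self._value

def lazy_range(i, n):
    """A lazy list [i, i+1, ..., n-1] as nested (head, tail-thunk) pairs."""
    if i >= n:
        return None
    return (i, Thunk(lambda: lazy_range(i + 1, n)))

xs = lazy_range(0, 100)
head = xs[0]                 # demanding only the head forces no tail thunks
cost_head = Thunk.steps      # 0

node = xs
while node is not None:      # demanding the whole list forces every tail
    node = node[1].force()
cost_full = Thunk.steps      # 100

print(cost_head, cost_full)
```

The paper's context-descriptions (projections) play the role that the two demand patterns play informally here.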
The Random Planar Graph
 Congressus Numerantium
, 1996
Abstract

Cited by 34 (1 self)
We construct a Markov chain whose stationary distribution is uniform over all planar subgraphs of a graph. In the case of the complete graph, our experiments suggest that the random simple planar graph on n vertices is connected but not 2-connected and has approximately 2n edges. We present a first attack on the problem of describing what the random planar graph looks like.
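A sketch of the edge-toggling chain this abstract describes, under a stated substitution: since no planarity test is available in the Python standard library, acyclicity (spanning forests) stands in for planarity as the edge-deletion-closed property. The proposal is symmetric (pick a uniform edge, toggle it, reject if the property fails), which is why the stationary distribution is uniform over the subgraph family.

```python
# Sketch of the symmetric edge-toggle Markov chain, with forests standing in
# for planar subgraphs (a real planarity test would replace has_cycle).

import random

def has_cycle(n, edges):
    """Union-find cycle detection on vertex set {0..n-1}."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return True
        parent[ru] = rv
    return False

def toggle_chain(n, all_edges, steps, rng):
    """Run the chain over acyclic subgraphs of (V={0..n-1}, all_edges)."""
    current = set()
    for _ in range(steps):
        e = rng.choice(all_edges)
        proposal = current ^ {e}          # toggle one edge
        if not has_cycle(n, proposal):    # reject if it leaves the family
            current = proposal
    return current

rng = random.Random(0)
K4 = [(u, v) for u in range(4) for v in range(u + 1, 4)]  # complete graph K4
sample = toggle_chain(4, K4, 1000, rng)
assert not has_cycle(4, sample)  # the chain never leaves the family
```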
Lambda-Upsilon-Omega: An Assistant Algorithms Analyzer
, 1989
Abstract

Cited by 14 (2 self)
Lambda-Upsilon-Omega (ΛΥΩ) is a system designed to perform automatic analysis of well-defined classes of algorithms operating over "decomposable" data structures. It consists of an `Algebraic Analyzer' system that compiles algorithm specifications into generating functions of average costs, and an `Analytic Analyzer' system that extracts asymptotic information on coefficients of generating functions. The algebraic part relies on recent methodologies in combinatorial analysis based on systematic correspondences between structural type definitions and counting generating functions. The analytic part makes use of partly classical and partly new correspondences between singularities of analytic functions and the growth of their Taylor coefficients. The current version ΛΥΩ0 of ΛΥΩ implements, as basic data types, term trees as encountered in symbolic algebra systems. The analytic analyzer can treat large classes of functions with explicit expressio...
Analytic Variations on the Common Subexpression Problem
 Proceedings of the 17th Annual International Colloquium on Automata, Languages, and Programming (ICALP)
, 1990
Abstract

Cited by 11 (3 self)
Abstract. Any tree can be represented in a maximally compact form as a directed acyclic graph where common subtrees are factored and shared, being represented only once. Such a compaction can be effected in linear time. It is used to save storage in implementations of functional programming languages, as well as in symbolic manipulation and computer algebra systems. In compiling, the compaction problem is known as the "common subexpression problem" and it plays a central role in register allocation, code generation and optimisation. We establish here that, under a variety of probabilistic models, a tree of size n has a compacted form of expected size asymptotically C·n/√(log n), where the constant C is explicitly related to the type of trees to be compacted and to the statistical model reflecting tree usage. In particular the savings in storage approach 100% on average for large structures, which outperforms the commonly used form of sharing that is restricted to leaves (atoms). Introduction. A tree can be compacted by representing occurrences of repeated subtrees only once. In that case, several pointers will point to the representation of any common subtree,
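The compaction the abstract analyzes is classically implemented by hash-consing: a dictionary of canonical nodes ensures each distinct subtree is built once, turning the tree into a DAG in linear expected time. A small sketch (illustrative names, not the paper's code):

```python
# Hash-consing sketch: identical subtrees are stored once in `table`,
# so the compacted form is a DAG of distinct subtrees.

def compact(tree, table=None):
    """Return a canonical node for `tree`; `table` maps structure -> node."""
    if table is None:
        table = {}
    if tree is None:
        return None, table
    label, left, right = tree
    lnode, _ = compact(left, table)
    rnode, _ = compact(right, table)
    key = (label, id(lnode), id(rnode))   # children are already canonical
    if key not in table:
        table[key] = (label, lnode, rnode)
    return table[key], table

def full_tree(depth):
    """A complete binary tree with identical labels, 2^(depth+1)-1 nodes."""
    if depth < 0:
        return None
    sub = full_tree(depth - 1)
    return ("a", sub, sub)

_, table = compact(full_tree(9))
print(len(table))  # 10 distinct subtrees represent a tree of 1023 nodes
```

This extreme case (one distinct subtree per depth) shows the maximal saving; the paper's C·n/√(log n) result quantifies the saving for random trees.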
Random Generation and Approximate Counting of Ambiguously Described Combinatorial Structures
 Proceedings of 17th Annual Symposium on Theoretical Aspects of Computer Science (STACS), number 1770 in Lecture Notes in Computer Science
, 2000
Abstract

Cited by 4 (1 self)
This paper concerns the uniform random generation and the approximate counting of combinatorial structures admitting an ambiguous description. We propose a general framework to study the complexity of these problems and present some applications to specific classes of languages. In particular, we give a uniform random generation algorithm for finitely ambiguous context-free languages with the same time complexity as the best known algorithm for the unambiguous case. Other applications include a polynomial time uniform random generator and approximation scheme for the census function of (i) languages recognized in polynomial time by one-way nondeterministic auxiliary pushdown automata of polynomial ambiguity and (ii) polynomially ambiguous rational trace languages.
Keywords: uniform random generation, approximate counting, context-free languages, auxiliary pushdown automata, rational trace languages, inherent ambiguity.
1 Introduction
In this work we propose a general framewo...
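For orientation, a sketch of the standard counting-based uniform generator for an *unambiguous* context-free language (balanced parentheses), the baseline whose time complexity the paper's finitely-ambiguous algorithm matches. This is the classic recursive method, not the paper's algorithm.

```python
# Counting-based uniform generation for the Dyck language: split sizes are
# drawn with probability proportional to the number of completions, via the
# unambiguous production S -> ( S ) S.

import random

def catalan(n_max):
    c = [1]
    for n in range(1, n_max + 1):
        c.append(sum(c[k] * c[n - 1 - k] for k in range(n)))
    return c

def gen(n, c, rng):
    """Uniform random balanced string with n pairs of parentheses."""
    if n == 0:
        return ""
    r = rng.randrange(c[n])            # choose inner size k with
    for k in range(n):                 # probability c[k]*c[n-1-k]/c[n]
        weight = c[k] * c[n - 1 - k]
        if r < weight:
            return "(" + gen(k, c, rng) + ")" + gen(n - 1 - k, c, rng)
        r -= weight
    raise AssertionError("unreachable")

rng = random.Random(1)
c = catalan(8)
s = gen(8, c, rng)
print(s)  # a uniformly random balanced string of length 16
```

With an ambiguous grammar the same scheme would over-weight words with many derivations, which is exactly the difficulty the paper addresses.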
Structure and Hierarchy in Real-Time Systems
, 2002
Abstract

Cited by 4 (0 self)
The development of digital systems is particularly challenging if their correctness depends on the right timing of operations. One approach to enhance the reliability of such systems is model-based development. This allows for a formal analysis throughout all stages of design. Model-based
Operated semigroups, Motzkin paths and rooted trees
 J. Algebraic Combinatorics
Abstract

Cited by 3 (2 self)
Abstract. Combinatorial objects such as rooted trees that carry a recursive structure have found important applications recently in both mathematics and physics. We put such structures in an algebraic framework of operated semigroups. This framework provides the concept of operated semigroups with intuitive and convenient combinatorial descriptions, and at the same time endows the familiar combinatorial objects with a precise algebraic interpretation. As an application, we obtain constructions of free Rota-Baxter algebras in terms of Motzkin paths and rooted trees.
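The Motzkin paths appearing in this abstract are counted by the Motzkin numbers; a small illustration (not from the paper) via the first-step decomposition of a path, which gives the recurrence M(n+1) = M(n) + Σₖ M(k)·M(n−1−k) (a level step, or an up step matched by its first return to the axis).

```python
# Motzkin numbers via the first-step decomposition recurrence
#   M(n+1) = M(n) + sum_{k=0}^{n-1} M(k) * M(n-1-k).

def motzkin(n_max):
    m = [1]  # M(0): the empty path
    for n in range(n_max):
        m.append(m[n] + sum(m[k] * m[n - 1 - k] for k in range(n)))
    return m

print(motzkin(7))  # [1, 1, 2, 4, 9, 21, 51, 127]
```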
Heuristics for Hierarchical Partitioning with Application to Model Checking
 Department of Computer Science, University of Aarhus
, 2000
Abstract

Cited by 3 (0 self)
Given a collection of connected components, it is often desired to cluster together parts of strong correspondence, yielding a hierarchical structure. We address the automation of this process and apply heuristics to battle the combinatorial and computational complexity. We define a cost function that captures the quality of a structure relative to the connections and favors shallow structures with a low degree of branching. Finding a structure with minimal cost is NP-complete. We present a greedy polynomial-time algorithm that approximates good solutions incrementally by local evaluation of a heuristic function. We argue for a heuristic function based on four criteria: the number of enclosed connections, the number of components, the number of touched connections and the depth of the structure. We report on an application in the context of formal verification, where our algorithm serves as a preprocessor for a temporal scaling technique, called the "Next" heuristic [2]. The latter is applicable in reachability analysis and is included in a recent version of the Mocha model checking tool. We demonstrate performance and benefits of our method, using an asynchronous parity computer and an opinion poll protocol as case studies.
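A hypothetical sketch of the greedy flavor the abstract describes, not the paper's four-criteria heuristic: repeatedly merge the pair of clusters joined by the most connections, recording each merge, so the merge sequence forms a hierarchy over the components.

```python
# Illustrative greedy agglomeration (function and variable names are ours):
# merge the cluster pair with the most cross-connections, until one remains.

import itertools

def greedy_hierarchy(n, connections):
    """Agglomerate components 0..n-1; connections is a list of (u, v) pairs."""
    clusters = [frozenset([i]) for i in range(n)]
    merges = []
    while len(clusters) > 1:
        def cross(a, b):  # number of connections running between a and b
            return sum(1 for u, v in connections
                       if (u in a and v in b) or (u in b and v in a))
        a, b = max(itertools.combinations(clusters, 2),
                   key=lambda pair: cross(*pair))
        clusters.remove(a)
        clusters.remove(b)
        clusters.append(a | b)
        merges.append((sorted(a), sorted(b)))
    return merges

# two tightly connected pairs joined by one bridge: the pairs merge first
merges = greedy_hierarchy(4, [(0, 1), (0, 1), (2, 3), (2, 3), (1, 2)])
print(merges)  # [([0], [1]), ([2], [3]), ([0, 1], [2, 3])]
```

The paper's heuristic additionally penalises deep, branchy structures; this sketch optimises only the enclosed-connections criterion.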