Results 1 - 10 of 10
Toward a model for backtracking and dynamic programming
 Comput. Compl
"... We propose a model called priority branching trees (pBT) for backtracking and dynamic programming algorithms. Our model generalizes both the priority model of Borodin, Nielson and Rackoff, as well as a simple dynamic programming model due to Woeginger, and hence spans a wide spectrum of algorithms. ..."
Abstract

Cited by 27 (7 self)
 Add to MetaCart
(Show Context)
We propose a model called priority branching trees (pBT) for backtracking and dynamic programming algorithms. Our model generalizes both the priority model of Borodin, Nielsen and Rackoff and a simple dynamic programming model due to Woeginger, and hence spans a wide spectrum of algorithms. After witnessing the strength of the model, we then show its limitations by providing lower bounds for algorithms in this model for several classical problems such as Interval Scheduling, Knapsack and Satisfiability.
A Relational Approach To Optimization Problems
, 1996
"... The main contribution of this thesis is a study of the dynamic programming and greedy strategies for solving combinatorial optimization problems. The study is carried out in the context of a calculus of relations, and generalises previous work by using a loop operator in the imperative programming s ..."
Abstract

Cited by 7 (0 self)
 Add to MetaCart
The main contribution of this thesis is a study of the dynamic programming and greedy strategies for solving combinatorial optimization problems. The study is carried out in the context of a calculus of relations, and generalises previous work by using a loop operator in the imperative programming style for generating feasible solutions, rather than the fold and unfold operators of the functional programming style. The relationship between fold operators and loop operators is explored, and it is shown how to convert from the former to the latter. This fresh approach provides additional insights into the relationship between dynamic programming and greedy algorithms, and helps to unify previously distinct approaches to solving combinatorial optimization problems. Some of the solutions discovered are new and solve problems that had previously proved difficult. The material is illustrated with a selection of problems and solutions that is a mixture of old and new. Another contribution is the invention of a new calculus, called the graph calculus, which is a useful tool for reasoning in the relational calculus and other non-relational calculi. The graph ...
Dynamic Programming: a different perspective
 Algorithmic Languages and Calculi
, 1997
"... Dynamic programming has long been used as an algorithm design technique, with various mathematical theories proposed to model it. Here we take a different perspective, using a relational calculus to model the problems and solutions using dynamic programming. This approach serves to shed new light on ..."
Abstract

Cited by 5 (0 self)
 Add to MetaCart
(Show Context)
Dynamic programming has long been used as an algorithm design technique, with various mathematical theories proposed to model it. Here we take a different perspective, using a relational calculus to model the problems and solutions using dynamic programming. This approach serves to shed new light on the different styles of dynamic programming, representing them by different search strategies of the tree-like space of partial solutions.

1 INTRODUCTION AND HISTORY

Dynamic programming is an algorithm design technique for solving many different types of optimization problem, applicable to such diverse fields as operations research (Ecker and Kupferschmid, 1988) and neutron transport theory (Bellman, Kagiwada and Kalaba, 1967). The mathematical theory of the subject dates back to 1957, when Richard Bellman (Bellman, 1957) first popularized the idea, producing a mathematical theory to model multistage decision processes and to solve related optimization problems. He was also the first to i...
Dynamic Programming as a Software Component
 Proceedings of CSCC
"... Abstract: Dynamic programming is usually regarded as a design technique, where each application is designed as an individual program. This contrasts with other techniques such as linear programming, where there exists a single generic program that solves all instances. From a software engineering p ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
(Show Context)
Dynamic programming is usually regarded as a design technique, where each application is designed as an individual program. This contrasts with other techniques such as linear programming, where there exists a single generic program that solves all instances. From a software engineering perspective, the lack of a generic solution to dynamic programming is somewhat unsatisfactory. It would be much preferable if dynamic programming could be understood as a software component, where the ideas common to all its applications are explicit in shared code. In this paper, we argue that such a component does indeed exist, at least for a large class of applications in which the decision process is a sequential scan of the input sequence. We also assess the suitability of C++ for expressing this type of generic program, and argue that the simplicity offered by lazy functional programming is preferable. In particular, functional programs can be manipulated as algebraic expressions. The paper does not present any novel results: it is an introduction to recent work on the formalisation of algorithmic paradigms in software engineering.

Keywords: dynamic programming; sequential decision process; software component; functional programming; algebra of programming; program derivation.
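The kind of generic component this abstract argues for can be sketched in Python (an illustrative sketch of the sequential-scan idea only; the names `sequential_dp`, `step`, and `merge` are ours, not the paper's, which works with C++ and lazy functional programming):

```python
def sequential_dp(step, merge, initial, xs):
    """Generic sequential-decision DP component: scan the input once,
    extend every retained state at each element, then merge states
    that are interchangeable with respect to the future."""
    states = initial
    for x in xs:
        states = merge([s2 for s in states for s2 in step(s, x)])
    return states

# Instance: longest increasing subsequence. A state is
# (length, last_value); merging keeps, per length, the smallest last
# value, since it permits every future extension the others do.
def lis(xs):
    def step(state, x):
        length, last = state
        out = [state]                    # decision: skip x
        if x > last:
            out.append((length + 1, x))  # decision: extend with x
        return out

    def merge(states):
        best = {}
        for length, last in states:
            if last < best.get(length, float("inf")):
                best[length] = last
        return list(best.items())

    final = sequential_dp(step, merge, [(0, float("-inf"))], xs)
    return max(length for length, _ in final)

print(lis([3, 1, 4, 1, 5, 9, 2, 6]))  # → 4
```

The component is the scan-and-merge loop; each application supplies only its own `step` (the decisions) and `merge` (which partial solutions to keep).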
The Greedy Algorithms Class: Formalization, Synthesis and Generalization
, 1995
"... On the first hand, this report studies the class of Greedy Algorithms in order to find an as systematic as possible strategy that could be applied to the specification of some problems to lead to a correct program solving that problem. On the other hand, the standard formalisms underlying the G ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
On the one hand, this report studies the class of greedy algorithms in order to find a strategy, as systematic as possible, that can be applied to the specification of a problem to yield a correct program solving that problem. On the other hand, the standard formalisms underlying greedy algorithms (matroid, greedoid and matroid embedding), which are dependent on the particular set type, are generalized to a formalism independent of any data type, based on an algebraic specification setting.
Synthesis Of Greedy Algorithms Using Dominance Relations
"... Greedy algorithms exploit problem structure and constraints to achieve lineartime performance. Yet there is still no completely satisfactory way of constructing greedy algorithms. For example, the Greedy Algorithm of Edmonds depends upon translating a problem into an algebraic structure called a ma ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
(Show Context)
Greedy algorithms exploit problem structure and constraints to achieve linear-time performance. Yet there is still no completely satisfactory way of constructing greedy algorithms. For example, the Greedy Algorithm of Edmonds depends upon translating a problem into an algebraic structure called a matroid, but the existence of such a translation can be as hard to determine as the existence of a greedy algorithm itself. An alternative characterization of greedy algorithms is in terms of dominance relations, a well-known algorithmic technique used to prune search spaces. We demonstrate a process by which dominance relations can be methodically derived for a number of greedy algorithms, including activity selection and prefix-free codes. By incorporating our approach into an existing framework for algorithm synthesis, we demonstrate that it could be the basis for an effective engineering method for greedy algorithms. We also compare our approach with other characterizations of greedy algorithms.
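As an illustration of the dominance-relation idea (our own sketch, not the paper's derivation), consider activity selection: among partial schedules containing equally many activities, one whose last activity finishes no later dominates, since it can be extended by at least everything the other can. Searching over partial schedules while pruning dominated ones collapses the search to a single survivor per schedule size:

```python
def schedule_with_dominance(activities):
    """Search over partial schedules, pruning dominated ones.
    A partial schedule is summarized as (count, last_finish); for
    equal counts, only the earliest last_finish is kept."""
    acts = sorted(activities, key=lambda a: a[1])  # by finish time
    frontier = {0: float("-inf")}  # count -> earliest last_finish
    for start, finish in acts:
        updates = {}
        for count, last in frontier.items():
            if start >= last:          # this partial schedule extends
                c = count + 1
                if finish < updates.get(c, float("inf")):
                    updates[c] = finish
        for c, f in updates.items():
            # dominance: keep only the earliest finish per count
            if f < frontier.get(c, float("inf")):
                frontier[c] = f
    return max(frontier)               # size of the best schedule

acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
print(schedule_with_dominance(acts))  # → 3
```

Because the dominance relation leaves at most one undominated extension per level, the pruned search does exactly the work of the earliest-finish-time greedy rule.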
The Application of Automated Reasoning to Formal Models of Combinatorial Optimization
 Applied Mathematics and Computation
"... Many formalisms have been proposed over the years to capture combinatorial optimization algorithms such as dynamic programming, branch and bound, and greedy. In 1989 Helman presented a common formalism that captures dynamic programming and branch and bound type algorithms. The formalism was late ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
(Show Context)
Many formalisms have been proposed over the years to capture combinatorial optimization algorithms such as dynamic programming, branch and bound, and greedy. In 1989 Helman presented a common formalism that captures dynamic programming and branch and bound type algorithms. The formalism was later extended to include greedy algorithms. In this paper, we describe the application of automated reasoning techniques to the domain of our model, in particular considering some representational issues and demonstrating that proofs about the model can be obtained by an automated reasoning program. The long-term objective of this research is to develop a methodology for using automated reasoning to establish new results within the theory, including the derivation of new lower bounds and the discovery (and verification) of new combinatorial search strategies.

1 Introduction

Many formalisms have been proposed over the years to capture combinatorial optimization algorithms such as dynami...
Reasoning at Multiple Levels of Abstraction
"... . In order to be successful in complex domains involving hierarchies of dened terms, an automated reasoning program must be able to reason eectively at multiple levels of abstraction. At a minimum, this requires appropriate problem representations and good search strategies. Ideally, the reasoni ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
In order to be successful in complex domains involving hierarchies of defined terms, an automated reasoning program must be able to reason effectively at multiple levels of abstraction. At a minimum, this requires appropriate problem representations and good search strategies. Ideally, the reasoning program reasons at high levels of abstraction when possible and appeals to arguments at lower levels of abstraction only as necessary. In this article, we describe our early experiences developing representations and search strategies for an application of automated reasoning to a problem domain from theoretical computer science. We then summarize some of the approaches we are considering to permit the automated reasoning program to reason effectively at multiple levels of abstraction.

1 Introduction

Automated reasoning has been applied successfully to a number of problems from mathematics and formal logic, including open questions, and to a variety of design and verification probl...
The Approximation Power of Priority Algorithms
, 2006
"... Greedylike algorithms have been a popular approach in combinatorial optimization, due to their conceptual simplicity and amenability to analysis. Surprisingly, it was only recently that a formal framework for their study emerged. In particular, Borodin, Nielsen and Rackoff introduced the class of p ..."
Abstract
 Add to MetaCart
Greedy-like algorithms have been a popular approach in combinatorial optimization, due to their conceptual simplicity and amenability to analysis. Surprisingly, it was only recently that a formal framework for their study emerged. In particular, Borodin, Nielsen and Rackoff introduced the class of priority algorithms as a model for abstracting the main properties of (deterministic) greedy-like algorithms; they also showed limitations on the approximation power of such algorithms for various scheduling problems. In this thesis we extend and modify the priority-algorithm framework so as to make it applicable to a wider class of optimization problems and settings. More precisely, we first derive strong lower bounds on the approximation ratio of priority algorithms for two well-studied problems, namely facility location and set cover. These are problems for which several greedy-like algorithms with good performance guarantees exist. Subsequently, we address the issue of randomization in priority algorithms, and show how to prove bounds on the power of greedy-like algorithms with access to random bits. Finally, we propose a model for priority algorithms in the context of graph-theoretic optimization problems; the latter class of problems turns out to be of particular interest, since it poses certain conceptual challenges when studying ...
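A fixed-priority algorithm in the Borodin-Nielsen-Rackoff sense orders the input once by a priority function, then makes one irrevocable accept/reject decision per item. The following minimal Python sketch of that template (the helper names are ours) is instantiated with the earliest-finish-time rule for interval scheduling:

```python
def fixed_priority(items, priority, feasible):
    """Skeleton of a fixed-priority greedy-like algorithm: order the
    input once, then decide each item irrevocably, never revisiting."""
    solution = []
    for item in sorted(items, key=priority):
        if feasible(solution, item):   # irrevocable accept
            solution.append(item)
        # else: irrevocable reject
    return solution

# Instance: interval scheduling, priority = earliest finish time.
def no_overlap(sol, iv):
    return all(iv[0] >= f or iv[1] <= s for s, f in sol)

intervals = [(1, 3), (2, 5), (4, 7), (6, 8), (5, 9)]
print(fixed_priority(intervals, priority=lambda iv: iv[1],
                     feasible=no_overlap))  # → [(1, 3), (4, 7)]
```

Lower bounds in this model constrain the whole template: no choice of `priority` and decision rule can beat them, which is what makes the framework useful for impossibility results.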
Limitations of Incremental Dynamic Programming
"... Abstract We consider socalled “incremental ” dynamic programming algorithms, and are interested in the number of subproblems produced by them. The classical dynamic programming algorithm for the Knapsack problem is incremental, produces nK subproblems and nK2 relations (wires) between the subprobl ..."
Abstract
 Add to MetaCart
(Show Context)
We consider so-called “incremental” dynamic programming algorithms, and are interested in the number of subproblems produced by them. The classical dynamic programming algorithm for the Knapsack problem is incremental, and produces nK subproblems and nK² relations (wires) between the subproblems, where n is the number of items and K is the knapsack capacity. We show that any incremental algorithm for this problem must produce about nK subproblems, and that about nK log K wires (relations between subproblems) are necessary. This holds even for the Subset-Sum problem. We also give upper and lower bounds on the number of subproblems needed to approximate the Knapsack problem. Finally, we show that the Maximum Bipartite Matching problem and the Traveling Salesman problem require an exponential number of subproblems. The goal of this paper is to leverage ideas and results of Boolean circuit complexity for proving lower bounds on dynamic programming.
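The classical table-filling algorithm the abstract refers to can be sketched as follows (a standard textbook formulation, not code from the paper); it creates one subproblem per (item prefix, capacity) pair, i.e. about nK subproblems, each wired to at most two smaller ones:

```python
def knapsack_table(items, K):
    """Classical 0/1 Knapsack DP: best[i][c] is the best value using
    the first i items within capacity c, so the table holds roughly
    n*K subproblems."""
    n = len(items)
    best = [[0] * (K + 1) for _ in range(n + 1)]
    for i, (w, v) in enumerate(items, start=1):
        for c in range(K + 1):
            best[i][c] = best[i - 1][c]                  # wire 1: skip item i
            if w <= c:
                best[i][c] = max(best[i][c],
                                 best[i - 1][c - w] + v) # wire 2: take item i
    return best[n][K]

print(knapsack_table([(2, 3), (3, 4), (4, 5), (5, 8)], 5))  # → 8
```

The paper's lower bounds say this subproblem count is essentially unavoidable for incremental algorithms: no cleverer choice of subproblems within the model can get far below nK.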