Results 1–10 of 16
Parameterized Complexity: A Framework for Systematically Confronting Computational Intractability
 DIMACS Series in Discrete Mathematics and Theoretical Computer Science
, 1997
Abstract

Cited by 68 (16 self)
In this paper we give a programmatic overview of parameterized computational complexity in the broad context of the problem of coping with computational intractability. We give some examples of how fixed-parameter tractability techniques can deliver practical algorithms in two different ways: (1) by providing useful exact algorithms for small parameter ranges, and (2) by providing guidance in the design of heuristic algorithms. In particular, we describe an improved FPT kernelization algorithm for Vertex Cover, a practical FPT algorithm for the Maximum Agreement Subtree (MAST) problem parameterized by the number of species to be deleted, and new general heuristics for these problems based on FPT techniques. In the course of making this overview, we also investigate some structural and hardness issues. We prove that an important naturally parameterized problem in artificial intelligence, STRIPS Planning (where the parameter is the size of the plan), is complete for W[1]. As a corollary, this implies that k-Step Reachability for Petri Nets is complete for W[1]. We describe how the concept of treewidth can be applied to STRIPS Planning and other problems of logic to obtain FPT results. We describe a surprising structural result concerning the top end of the parameterized complexity hierarchy: the naturally parameterized Graph k-Coloring problem cannot be resolved with respect to XP, either by showing membership in XP or by showing hardness for XP, without settling the P = NP question one way or the other.
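The paper's improved kernelization is not reproduced in this abstract; as an illustration of what FPT kernelization for Vertex Cover means, the following is a sketch of the classic textbook Buss rule (an assumed baseline, not the improved algorithm the abstract describes): any vertex of degree greater than k must belong to every cover of size at most k, and after that rule is exhausted a yes-instance can have at most k² edges.

```python
def buss_kernel(edges, k):
    """Classic Buss kernelization for Vertex Cover (illustrative sketch).

    Returns (kernel_edges, k_remaining, forced_cover), or None when no
    vertex cover of size at most k can exist.
    """
    edges = {frozenset(e) for e in edges}
    forced = set()
    changed = True
    while changed and k >= 0:
        changed = False
        # Count degrees over the remaining edges.
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        # Rule: a vertex of degree > k must be in every size-<=k cover.
        for v, d in deg.items():
            if d > k:
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0:
        return None
    # Every remaining vertex now has degree <= k, so a yes-instance
    # has at most k * k edges left; more edges means "no".
    if len(edges) > k * k:
        return None
    return edges, k, forced
```

The point of the sketch is the shape of the technique: polynomial-time reduction rules shrink the instance to a kernel whose size depends only on the parameter k, after which brute force on the kernel is affordable for small k.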
A Survey of Adaptive Sorting Algorithms
, 1992
Abstract

Cited by 65 (3 self)
Categories: Introduction and Survey; F.2.2 [Analysis of Algorithms and Problem Complexity]: Nonnumerical Algorithms and Problems — Sorting and Searching; E.5 [Data]: Files — Sorting/searching; G.3 [Mathematics of Computing]: Probability and Statistics — Probabilistic algorithms; E.2 [Data Storage Representation]: Composite structures, linked representations.
General Terms: Algorithms, Theory.
Additional Key Words and Phrases: Adaptive sorting algorithms, comparison trees, measures of disorder, nearly sorted sequences, randomized algorithms.

CONTENTS
INTRODUCTION
  I.1 Optimal adaptivity
  I.2 Measures of disorder
  I.3 Organization of the paper
1. WORST-CASE ADAPTIVE (INTERNAL) SORTING ALGORITHMS
  1.1 Generic Sort
  1.2 Cook-Kim division
  1.3 Partition Sort
  1.4 Exponential Search
  1.5 Adaptive Merging
2. EXPECTED-CASE ADAPTIV...
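The "measures of disorder" named in the keywords can be illustrated with the standard textbook example (an assumed illustration, not an algorithm taken from the survey itself): insertion sort is adaptive with respect to the number of inversions Inv, running in O(n + Inv) time, so it approaches linear time on nearly sorted input.

```python
def insertion_sort(a):
    """Sort a list; total inner-loop work equals the number of inversions."""
    a = list(a)
    for i in range(1, len(a)):
        x = a[i]
        j = i - 1
        while j >= 0 and a[j] > x:   # one iteration per inversion removed
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def inversions(a):
    """Count pairs (i, j) with i < j and a[i] > a[j]; O(n^2), for illustration."""
    n = len(a)
    return sum(1 for i in range(n) for j in range(i + 1, n) if a[i] > a[j])
```

An adaptive sort in the survey's sense is one whose cost is bounded by a function of both n and some such disorder measure, degrading gracefully from O(n) on sorted input to the worst-case bound on fully shuffled input.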
A Survey on Software Clone Detection Research
 SCHOOL OF COMPUTING TR 2007-541, QUEEN’S UNIVERSITY
, 2007
Abstract

Cited by 60 (7 self)
Code duplication, or copying a code fragment and reusing it by pasting with or without modifications, is a well-known code smell in software maintenance. Several studies show that about 5% to 20% of a software system can consist of duplicated code, which is essentially the result of copying existing code fragments and reusing them by pasting, with or without minor modifications. One of the major shortcomings of such duplicated fragments is that if a bug is detected in one code fragment, all the fragments similar to it must be investigated to check for the same bug. Refactoring of duplicated code is another prime issue in software maintenance, although several studies claim that refactoring certain clones is not desirable and that removing them carries risk. However, it is also widely agreed that clones should at least be detected. In this paper, we survey the state of the art in clone detection research. First, we describe the clone terms commonly used in the literature along with their corresponding mappings to the commonly used clone types. Second, we provide a review of the existing ...
A Short History of Computational Complexity
 IEEE CONFERENCE ON COMPUTATIONAL COMPLEXITY
, 2002
Abstract

Cited by 11 (1 self)
... this article mention all of the amazing research in computational complexity theory. We survey various areas in complexity, choosing papers more for their historical value than necessarily the importance of the results. We hope that this gives an insight into the richness and depth of this still quite young field.
Computational Tractability: The View From Mars
 Bulletin of the European Association of Theoretical Computer Science
Abstract

Cited by 9 (1 self)
We describe a point of view about the parameterized computational complexity framework in the broad context of one of the central issues of theoretical computer science as a field: the problem of systematically coping with computational intractability. Those already familiar with the basic ideas of parameterized complexity will nevertheless find here something new: the emerging systematic connections between fixed-parameter tractability techniques and the design of useful heuristic algorithms, and also perhaps the philosophical maturation of the parameterized complexity program.
FPGA Global Routing Based on a New Congestion Metric
 Proceedings of the International Conference on Computer Aided Design
, 1995
Abstract

Cited by 7 (3 self)
Unlike traditional ASIC routing, the feasibility of routing in FPGAs is constrained not only by the available space within a routing region, but also by the routing capacity of a switch block. Recent work [6] has established the switch-block capacity as a superior congestion-control metric for FPGA global routing. However, that work has two deficiencies: (1) its algorithm for computing the switch-block capacity is not efficient, and (2) it, as well as the other recent works [1, 4, 14], modeled only one type of routing segment: single-length lines. To remedy these deficiencies, we present in this paper efficient algorithms for obtaining the switch-block capacity and a graph modeling for routing on the new-generation FPGAs with a versatile set of segment lengths. Experiments show that our algorithms dramatically reduce the run times for obtaining the switch-block capacities. Experiments with a global router based on the switch-block and channel densities for congestion control show a signi...
Fifty-Plus Years of Combinatorial Integer Programming
, 2009
Abstract

Cited by 2 (0 self)
Throughout the history of integer programming, the field has been guided by research into solution approaches to combinatorial problems. We discuss some of the highlights and defining moments of this area.

1 Combinatorial integer programming

Integer-programming models arise naturally in optimization problems over combinatorial structures, most notably in problems on graphs and general set systems. The translation from combinatorics to the language of integer programming is often straightforward, but the new rendering typically suggests direct lines of attack via linear programming. As an example, consider the stable-set problem in graphs. Given a graph G = (V, E) with vertices V and edges E, a stable set of G is a subset S ⊆ V such that no two vertices in S are joined by an edge. The stable-set problem is to find a maximum-cardinality stable set. To formulate this as an integer-programming (IP) problem, consider a vector of variables x = (x_v : v ∈ V) and identify a set U ⊆ V with its characteristic vector x̄, defined as x̄_v = 1 if v ∈ U and x̄_v = 0 otherwise. For e ∈ E write e = (u, v), where u and v are the ends of the edge. The stable-set problem is equivalent to the IP model ...
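The excerpt above is cut off just as the model is introduced. The standard stable-set IP formulation that such a setup leads to (a textbook formulation, not necessarily the paper's exact notation) is:

```latex
\begin{aligned}
\max \quad & \sum_{v \in V} x_v \\
\text{s.t.} \quad & x_u + x_v \le 1 && \forall\, e = (u, v) \in E, \\
& x_v \in \{0, 1\} && \forall\, v \in V.
\end{aligned}
```

Each edge constraint forbids both endpoints from entering the set, so the 0/1 solutions are exactly the characteristic vectors of stable sets, and relaxing x_v ∈ {0, 1} to 0 ≤ x_v ≤ 1 gives the linear-programming relaxation mentioned as a line of attack.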
Average-case Analysis of Equality of Binary Trees Under the BST Probability Model
, 1994
Abstract

Cited by 2 (2 self)
In this paper a simple algorithm to test equality of binary trees, currently used in symbolic computation, unification, etc., is investigated. Surprisingly enough, it takes O(1) steps on average to decide whether a given pair of trees of total size n are equal, if the uniform probability model for the input is assumed. Moreover, other similar algorithms have qualitatively the same average complexity behavior. In this paper, we analyze this average complexity when the so-called BST probability model is assumed. The analysis is itself more complex although feasible, involving partial differential equations and singularity analysis of Bessel functions. Nevertheless, partial differential equations are generally unsolvable, like the one derived from the bivariate recurrences for the equality test, and an indirect mechanism, solving a simpler equation and showing asymptotic equivalence of the solutions, is used to obtain the main result: testing equality of a pair of binary trees of total si...
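The "simple algorithm" being analyzed is presumably of the following shape (a hedged sketch of the standard short-circuiting equality test, since the abstract does not spell it out): compare the roots, then recurse, stopping at the first mismatch. Under random inputs the first mismatch tends to occur near the roots, which is what makes a constant expected cost plausible.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    """A binary tree node; field names here are illustrative."""
    key: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def equal(a: Optional[Node], b: Optional[Node]) -> bool:
    """Structural equality test that stops at the first mismatch."""
    if a is None or b is None:
        return a is b                    # equal only if both are empty
    return (a.key == b.key
            and equal(a.left, b.left)    # `and` short-circuits: a mismatch
            and equal(a.right, b.right)) # anywhere ends the whole test
```

The worst case is Θ(n) (two identical trees force a full traversal), so the O(1) figure is an average over the input distribution, which is exactly why the choice of probability model (uniform vs. BST) matters in the paper.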
Call for papers
 MISQ Special Issue on Design Science Research [Electronic Version]. MIS Quarterly. Retrieved Sept 30, 2006 from http://www.misq.org/DesignScience.pdf
, 2006
Abstract

Cited by 1 (0 self)
Research in IT must address the design tasks faced by practitioners. Real problems must be properly conceptualized and represented, appropriate techniques for their solution must be constructed, and solutions must be implemented and evaluated using appropriate criteria. If significant progress is to be made, IT research must also develop an understanding of how and why IT systems work or do not work. Such an understanding must tie together natural laws governing IT systems with natural laws governing the environments in which they operate. This paper presents a two-dimensional framework for research in information technology. The first dimension is based on broad types of design and natural science research activities: build, evaluate, theorize, and justify. The second dimension is based on broad types of outputs produced by design research: representational constructs, models, methods, and ...