Results 1–10 of 655
Orienting Fully Dynamic Graphs with Worst-Case Time Bounds
, 2014
"... In edge orientations, the goal is usually to orient (direct) the edges of an undirected network (modeled by a graph) such that all out-degrees are bounded. When the network is fully dynamic, i.e., admits edge insertions and deletions, we wish to maintain such an orientation while keeping a tab on th ..."
Abstract

Cited by 2 (0 self)
degree and a logarithmic amortized update time for all graphs with constant arboricity, which include all planar and excluded-minor graphs. It remained an open question – first proposed by Brodal and Fagerberg, later by Erickson and others – to obtain similar bounds with worst-case update time. We address
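The invariant this entry maintains (bounded out-degrees under edge insertions) can be sketched with a naive greedy rule. This is only an illustration of the problem, not the paper's worst-case algorithm; all names are ours.

```python
from collections import defaultdict

class GreedyOrientation:
    """Naive illustration: orient each inserted edge away from the endpoint
    that currently has the smaller out-degree. This does NOT reproduce the
    paper's worst-case guarantees; it only shows the invariant being
    maintained (keeping out-degrees small)."""

    def __init__(self):
        self.out = defaultdict(set)   # u -> set of v with oriented edge u->v

    def out_degree(self, u):
        return len(self.out[u])

    def insert(self, u, v):
        # Orient from the endpoint with the smaller out-degree.
        if self.out_degree(u) <= self.out_degree(v):
            self.out[u].add(v)
        else:
            self.out[v].add(u)

    def delete(self, u, v):
        self.out[u].discard(v)
        self.out[v].discard(u)

g = GreedyOrientation()
for a, b in [(1, 2), (1, 3), (1, 4), (2, 3)]:
    g.insert(a, b)
# Vertex 1 touches three edges but keeps out-degree 1 under this rule.
```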
A Linear-Time Heuristic for Improving Network Partitions
, 1982
"... An iterative min-cut heuristic for partitioning networks is presented whose worst-case computation time, per pass, grows linearly with the size of the network. In practice, only a very small number of passes are typically needed, leading to a fast approximation algorithm for min-cut partitioning. To d ..."
Abstract

Cited by 524 (0 self)
An iterative min-cut heuristic for partitioning networks is presented whose worst-case computation time, per pass, grows linearly with the size of the network. In practice, only a very small number of passes are typically needed, leading to a fast approximation algorithm for min-cut partitioning
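A single pass of the gain-based move heuristic this entry describes can be sketched as follows. This is a deliberately simplified version (ordinary graph rather than hypergraph, no balance constraint, and no bucket structure, so it is not linear-time per pass as the paper's version is); all names are illustrative.

```python
# Simplified single pass in the spirit of the iterative min-cut heuristic:
# repeatedly move the unlocked vertex with the highest gain, lock it, and
# remember the best partition seen along the prefix of moves.

def cut_size(edges, side):
    return sum(1 for u, v in edges if side[u] != side[v])

def one_pass(edges, side):
    nodes = list(side)
    locked = set()
    best = dict(side)
    best_cut = cut_size(edges, side)
    for _ in nodes:
        def gain(u):
            # gain of moving u = (cut edges at u) - (uncut edges at u)
            g = 0
            for a, b in edges:
                if u in (a, b):
                    other = b if a == u else a
                    g += 1 if side[u] != side[other] else -1
            return g
        u = max((v for v in nodes if v not in locked), key=gain)
        side[u] = 1 - side[u]          # tentatively move u
        locked.add(u)
        c = cut_size(edges, side)
        if c < best_cut:               # keep the best prefix of moves
            best_cut, best = c, dict(side)
    return best, best_cut

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
side = {0: 0, 1: 1, 2: 0, 3: 1}        # alternating: every edge is cut
best, best_cut = one_pass(edges, side)
```

With no balance constraint the pass simply drifts everything to one side here; the real heuristic constrains moves to keep the partition balanced.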
Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm
 Machine Learning
, 1988
"... learning Boolean functions, linear-threshold algorithms Abstract. Valiant (1984) and others have studied the problem of learning various classes of Boolean functions from examples. Here we discuss incremental learning of these functions. We consider a setting in which the learner responds to each ex ..."
Abstract

Cited by 773 (5 self)
be expressed as a linear-threshold algorithm. A primary advantage of this algorithm is that the number of mistakes grows only logarithmically with the number of irrelevant attributes in the examples. At the same time, the algorithm is computationally efficient in both time and space.
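The multiplicative-update scheme this entry describes (Littlestone's Winnow) can be sketched as follows. The parameters (promotion factor alpha, threshold equal to n) are standard illustrative choices, not necessarily the paper's exact formulation.

```python
# Minimal Winnow-style linear-threshold learner for Boolean attributes:
# weights are updated multiplicatively only on mistakes, which is what
# keeps the mistake count logarithmic in the number of irrelevant attributes.

def winnow_train(examples, n, alpha=2.0):
    """examples: list of (x, label) with x a 0/1 tuple of length n."""
    w = [1.0] * n
    theta = n                                 # classification threshold
    mistakes = 0
    for _ in range(20):                       # a few sweeps over the data
        for x, label in examples:
            predict = sum(wi * xi for wi, xi in zip(w, x)) >= theta
            if predict != label:
                mistakes += 1
                for i in range(n):
                    if x[i]:
                        # promote on a missed positive, demote otherwise
                        w[i] = w[i] * alpha if label else w[i] / alpha
    return w, mistakes

# Target concept: x0 OR x1, with eight irrelevant attributes.
n = 10
examples = [
    ((1, 0, 0, 0, 0, 0, 0, 0, 0, 0), True),
    ((0, 1, 0, 0, 0, 0, 0, 0, 0, 0), True),
    ((0, 0, 1, 1, 1, 0, 0, 0, 0, 0), False),
    ((0, 0, 0, 0, 0, 1, 1, 1, 1, 1), False),
    ((1, 1, 0, 0, 1, 0, 0, 0, 0, 0), True),
    ((0, 0, 0, 0, 0, 0, 0, 0, 0, 0), False),
]
w, mistakes = winnow_train(examples, n)
```

After training, the relevant weights (w[0], w[1]) dominate while the irrelevant ones stay near their initial values.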
A New Efficient Algorithm for Computing Gröbner Bases (F4)
 IN: ISSAC ’02: PROCEEDINGS OF THE 2002 INTERNATIONAL SYMPOSIUM ON SYMBOLIC AND ALGEBRAIC COMPUTATION
, 2002
"... This paper introduces a new efficient algorithm for computing Gröbner bases. To avoid as much as possible intermediate computation, the algorithm computes successive truncated Gröbner bases and it replaces the classical polynomial reduction found in the Buchberger algorithm by the simultaneous reduc ..."
Abstract

Cited by 365 (57 self)
updated and available on a Web page. Even though the new algorithm does not improve the worst-case complexity, it is several times faster than previous implementations, both for integer and modular computations.
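For readers unfamiliar with what a Gröbner basis computation looks like, here is a small example using SymPy's `groebner` (which is not an F4 implementation; it merely shows the kind of object the algorithm above computes more efficiently):

```python
from sympy import groebner, symbols

x, y = symbols('x y')

# Lex-order Groebner basis of the ideal generated by x*y - 1 and x**2 - y.
# The basis exposes a "triangular" structure: one generator involves only y.
G = groebner([x*y - 1, x**2 - y], x, y, order='lex')
print(list(G.exprs))   # expected: [x - y**2, y**3 - 1]
```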
Worst-case and Amortised Optimality in Union-Find (Extended Abstract)
, 1999
"... We study the interplay between worst-case and amortised time bounds for the classic Disjoint Set Union problem (Union-Find). We ask whether it is possible to achieve optimal worst-case and amortised bounds simultaneously. Furthermore we would like to allow a trade-off between the worst-case time for ..."
Abstract

Cited by 1 (0 self)
simultaneously. The lower bounds are provided in the cell-probe model as well as in the algebraic real-number RAM, and the upper bounds hold for a RAM with logarithmic word size and a modest instruction set. Our lower bounds show that for worst-case query and update time t_q and t_u respectively, one must have
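The classic Union-Find structure under discussion can be sketched as follows, with union by rank and path halving. This version attains the well-known optimal amortised bounds; the paper's question is precisely how such amortised guarantees trade off against per-operation worst-case guarantees, which this simple version does not address.

```python
# Standard union-find: union by rank plus path halving (a one-pass
# variant of path compression). Amortised near-constant time per
# operation, but individual finds can still be slow in the worst case.

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        while self.parent[x] != x:
            # Path halving: point x at its grandparent as we walk up.
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra              # attach the shallower tree
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1

uf = UnionFind(6)
uf.union(0, 1); uf.union(1, 2); uf.union(3, 4)
```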
On the Limitations of Worst-case Optimal Ray Shooting Algorithms
"... This paper examines the lower bounds of worst-case complexity measures of ray-shooting algorithms. It demonstrates that ray shooting requires at least logarithmic time and discusses strategies for designing such optimal algorithms. It also examines the lower bounds of storage complexity of logar ..."
Abstract

Cited by 1 (0 self)
This paper examines the lower bounds of worst-case complexity measures of ray-shooting algorithms. It demonstrates that ray shooting requires at least logarithmic time and discusses strategies for designing such optimal algorithms. It also examines the lower bounds of storage complexity
A Trade-Off For Worst-Case Efficient Dictionaries
, 2000
"... We consider dynamic dictionaries over the universe U = {0, 1}^w on a unit-cost RAM with word size w and a standard instruction set, and present a linear-space deterministic dictionary accommodating membership queries in time (log log n)^O(1) and updates in time (log n)^O(1), where n is the size of t ..."
Abstract

Cited by 7 (2 self)
of the set stored. Previous solutions either had query time (log n)^Ω(1) or update time 2^ω(√log n) in the worst case.
Dynamic Algorithms with Worst-case Performance for Packet Classification
 IFIP Networking
, 2000
"... Packet classification involves, given a set of rules, finding the highest-priority rule matching an incoming packet. When designing packet classification algorithms, three metrics need to be considered: query time, update time and storage requirements. The algorithms proposed to date have been heu ..."
Abstract

Cited by 16 (0 self)
of these algorithms is considered in the worst case, i.e., without assumptions about structure in the classification rules. They are also designed to perform well (though not necessarily the "best") in each of the metrics simultaneously.
Persistent Linked Structures at Constant Worst-Case Cost
"... We present a method for making linked structures with nodes of in-degree not exceeding 1 partially persistent at a worst-case time cost of O(1) per access step and a worst-case time and space cost of O(1) per update step. The last two improve the best previous result, which gave O(1) amortized bo ..."
Abstract
We present a method for making linked structures with nodes of in-degree not exceeding 1 partially persistent at a worst-case time cost of O(1) per access step and a worst-case time and space cost of O(1) per update step. The last two improve the best previous result, which gave O(1) amortized
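The interface of partial persistence (update only the newest version, query any past version) can be sketched with the simpler "fat node" technique: every field keeps a version-stamped history. This is not the paper's method; the paper's contribution is achieving O(1) worst-case cost per step via node copying, which fat nodes alone do not give. All names here are illustrative.

```python
import bisect

class FatField:
    """One persistent field: a sorted history of (version, value) pairs.
    Writes go at the current (largest) version; reads at version v return
    the most recent value written at a version <= v."""

    def __init__(self, version, value):
        self.versions = [version]     # kept sorted by construction
        self.values = [value]

    def set(self, version, value):
        self.versions.append(version)
        self.values.append(value)

    def get(self, version):
        i = bisect.bisect_right(self.versions, version) - 1
        return self.values[i]

# A single persistent cell as the simplest demonstration; a persistent
# linked structure would use one FatField per node field (value, next).
cell = FatField(version=0, value='a')
cell.set(1, 'b')
cell.set(3, 'c')
# Reading at version 2 still sees the value written at version 1.
```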
Worst-Case Versus Average-Case Complexity of Ray-Shooting
"... This paper examines worst-case and average-case complexity measures of ray-shooting algorithms in order to understand why computer graphics practitioners prefer heuristic methods to extensively studied worst-case optimal algorithms. It demonstrates that ray shooting requires at ..."
Abstract

Cited by 12 (0 self)
at least logarithmic time in the worst case and discusses strategies for designing such worst-case optimal algorithms. It also examines the lower bounds of storage complexity of logarithmic-time algorithms and concludes that logarithmic time has a very high price in terms of required storage. In order