Results 1 - 10 of 655

Orienting Fully Dynamic Graphs with Worst-Case Time Bounds

by Tsvi Kopelowitz, Robert Krauthgamer, Ely Porat, Shay Solomon , 2014
"... In edge orientations, the goal is usually to orient (direct) the edges of an undirected network (modeled by a graph) such that all outdegrees are bounded. When the network is fully dynamic, i.e., admits edge insertions and deletions, we wish to maintain such an orientation while keeping a tab on th ..."
Abstract - Cited by 2 (0 self)
-degree and a logarithmic amortized update time for all graphs with constant arboricity, which include all planar and excluded-minor graphs. It remained an open question – first proposed by Brodal and Fagerberg, later by Erickson and others – to obtain similar bounds with worst-case update time. We address

A Linear-Time Heuristic for Improving Network Partitions

by C. M. Fiduccia, et al. , 1982
"... An iterative mincut heuristic for partitioning networks is presented whose worst case computation time, per pass, grows linearly with the size of the network. In practice, only a very small number of passes are typically needed, leading to a fast approximation algorithm for mincut partitioning. To d ..."
Abstract - Cited by 524 (0 self)
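The pass structure this abstract describes can be illustrated with a simplified sketch for ordinary graphs. The paper itself handles hypergraph netlists with balance constraints and a gain-bucket array that makes each pass linear-time; both are omitted here, so this shows only the move/lock/best-prefix skeleton, with a quadratic scan standing in for the bucket structure:

```python
def fm_pass(adj, part):
    """One pass of a Fiduccia-Mattheyses-style mincut heuristic (simplified).

    adj maps each node to its set of neighbours (an ordinary graph rather
    than the paper's hypergraph netlist) and part maps each node to 0 or 1.
    Each step moves the unlocked node of highest gain (reduction in cut
    size) and locks it; at the end, the best prefix of moves seen during
    the pass is kept.  Balance constraints and the linear-time gain-bucket
    array from the paper are omitted.
    """
    part = dict(part)

    def gain(v):
        # Edges crossing the cut minus edges inside v's own block.
        return sum(1 if part[u] != part[v] else -1 for u in adj[v])

    locked, moves = set(), []
    delta = best_delta = 0
    best_prefix = 0
    for _ in range(len(adj)):
        v = max((u for u in adj if u not in locked), key=gain)
        delta -= gain(v)            # cut size changes by -gain(v)
        part[v] ^= 1
        locked.add(v)
        moves.append(v)
        if delta < best_delta:
            best_delta, best_prefix = delta, len(moves)
    for v in moves[best_prefix:]:   # roll back moves past the best prefix
        part[v] ^= 1
    return part, best_delta
```

On a 4-node path split alternately, a single pass with no balance constraint can collapse the cut entirely; the full heuristic repeats passes until a pass yields no improvement.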

Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm

by Nick Littlestone - Machine Learning , 1988
"... learning Boolean functions, linear-threshold algorithms Abstract. Valiant (1984) and others have studied the problem of learning various classes of Boolean functions from examples. Here we discuss incremental learning of these functions. We consider a setting in which the learner responds to each ex ..."
Abstract - Cited by 773 (5 self)
be expressed as a linear-threshold algorithm. A primary advantage of this algorithm is that the number of mistakes grows only logarithmically with the number of irrelevant attributes in the examples. At the same time, the algorithm is computationally efficient in both time and space. 1.
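The multiplicative-update scheme behind the mistake bound this abstract mentions can be sketched as follows. This is a minimal Winnow-style learner, not the paper's exact presentation; the threshold n/2 and promotion factor 2 are the common textbook choices and are assumptions here:

```python
def winnow(examples, n, threshold=None, alpha=2.0):
    """Winnow-style linear-threshold learner with multiplicative updates.

    examples is a sequence of (x, y) pairs, where x is a tuple of n Boolean
    attributes and y is the Boolean label.  Weights of active attributes are
    promoted on false negatives and demoted on false positives, which is why
    mistakes grow only logarithmically with the number of irrelevant
    attributes.  Returns the final weights and the online mistake count.
    """
    if threshold is None:
        threshold = n / 2
    w = [1.0] * n
    mistakes = 0
    for x, y in examples:
        total = sum(wi for wi, xi in zip(w, x) if xi)
        if (total >= threshold) != y:
            mistakes += 1
            if y:   # false negative: promote the active weights
                w = [wi * alpha if xi else wi for wi, xi in zip(w, x)]
            else:   # false positive: demote the active weights
                w = [wi / alpha if xi else wi for wi, xi in zip(w, x)]
    return w, mistakes
```

After a few passes over examples of a single relevant attribute among many irrelevant ones, the relevant weight climbs past the threshold while the irrelevant weights stay below it.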

A New Efficient Algorithm for Computing Gröbner Bases (F4)

by Jean-Charles Faugère - In: ISSAC ’02: Proceedings of the 2002 International Symposium on Symbolic and Algebraic Computation , 2002
"... This paper introduces a new efficient algorithm for computing Gröbner bases. To avoid as much as possible intermediate computation, the algorithm computes successive truncated Gröbner bases and it replaces the classical polynomial reduction found in the Buchberger algorithm by the simultaneous reduc ..."
Abstract - Cited by 365 (57 self)
updated and available on a Web page. Even though the new algorithm does not improve the worst-case complexity, it is several times faster than previous implementations for both integer and modular computations.

Worst-case and Amortised Optimality in Union-Find (Extended Abstract)

by Stephen Alstrup, Amir M. Ben-Amram, Theis Rauhe , 1999
"... We study the interplay between worst-case and amortised time bounds for the classic Disjoint Set Union problem (Union-Find). We ask whether it is possible to achieve optimal worst-case and amortised bounds simultaneously. Furthermore we would like to allow a tradeoff between the worst-case time for ..."
Abstract - Cited by 1 (0 self)
simultaneously. The lower bounds are provided in the cell-probe model as well as in the algebraic real-number RAM, and the upper bounds hold for a RAM with logarithmic word size and a modest instruction set. Our lower bounds show that for worst-case query and update times t_q and t_u respectively, one must have
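For reference, the classic amortized-optimal structure that such worst-case results are measured against combines union by rank with path compression. This is the standard textbook sketch, not the paper's construction:

```python
class UnionFind:
    """Disjoint Set Union with union by rank and path compression.

    Together the two heuristics give amortized O(alpha(n)) time per
    operation, where alpha is the inverse Ackermann function; with union
    by rank alone, a single find still costs O(log n) in the worst case.
    """

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        root = x
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[x] != root:       # path compression
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False                    # already in the same set
        if self.rank[ra] < self.rank[rb]:   # union by rank
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True
```

The tension the paper studies is that path compression improves amortized cost precisely by doing expensive rebalancing work inside individual operations, which is what worst-case bounds must avoid.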

On the Limitations of Worst-case Optimal Ray Shooting Algorithms

by László Szirmay-Kalos, Gábor Márton
"... This paper examines the lower-bounds of worst-case complexity measures of ray-shooting algorithms. It demonstrates that ray-shooting requires at least logarithmic time and discusses the strategies how to design such optimal algorithms. It also examines the lower-bounds of storage complexity of logar ..."
Abstract - Cited by 1 (0 self)

A Trade-Off For Worst-Case Efficient Dictionaries

by Rasmus Pagh , 2000
"... We consider dynamic dictionaries over the universe U = {0, 1}^w on a unit-cost RAM with word size w and a standard instruction set, and present a linear space deterministic dictionary accommodating membership queries in time (log log n)^O(1) and updates in time (log n)^O(1), where n is the size of t ..."
Abstract - Cited by 7 (2 self)
of the set stored. Previous solutions either had query time (log n)^Ω(1) or update time 2^ω(√log n) in the worst case.

Dynamic Algorithms with Worst-case Performance for Packet Classification

by Pankaj Gupta, Nick Mckeown - IFIP Networking , 2000
"... Packet classification involves - given a set of rules - finding the highest priority rule matching an incoming packet. When designing packet classification algorithms, three metrics need to be considered: query time, update time and storage requirements. The algorithms proposed to-date have been heu ..."
Abstract - Cited by 16 (0 self)
of these algorithms is considered in the worst-case, i.e., without assumptions about structure in the classification rules. They are also designed to perform well (though not necessarily the "best") in each of the metrics simultaneously.

Persistent Linked Structures at Constant Worst-Case Cost

by Ali R. Boroujerdi, Bernard M. E. Moret
"... We present a method for making linked structures with nodes of in-degree not exceeding 1 partially persistent at a worst-case time cost of O(1) per access step and a worst-case time and space cost of O(1) per update step. The last two improve the best previous result, which gave O(1) amortized bo ..."
Abstract

Worst-Case Versus Average Case Complexity of Ray-Shooting

by László Szirmay-Kalos, Gábor Márton
"... This paper examines worst-case and average-case complexity measures of ray-shooting algorithms in order to find the answer to the question why computer graphics practitioners prefer heuristic methods to extensively studied worst-case optimal algorithms. It demonstrates that ray-shooting requires at ..."
Abstract - Cited by 12 (0 self)
at least logarithmic time in the worst case and discusses strategies for designing such worst-case optimal algorithms. It also examines the lower bounds of storage complexity of logarithmic-time algorithms and concludes that logarithmic time has a very high price in terms of required storage. In order
Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University