Results 1–10 of 2,999
Parallel algorithms for dynamic shortest path problems
 International Transactions in Operational Research
, 2002
"... The development of intelligent transportation systems (ITS) and the resulting need for the solution of a variety of dynamic traffic network models and management problems require fasterthanrealtime computation of shortest path problems in dynamic networks. Recently, a sequential algorithm was dev ..."
Cited by 2 (0 self)
was developed to compute shortest paths in discrete time dynamic networks from all nodes and all departure times to one destination node. The algorithm is known as algorithm DOT and has an optimal worst-case running-time complexity. This implies that no algorithm with a better worst-case computational
On Optimal Worst-Case Matching
"... Bichromatic reverse nearest neighbor (BRNN) queries have been studied extensively in the literature of spatial databases. Given a set P of serviceproviders and a set O of customers, a BRNN query is to find which customers in O are “interested ” in a given serviceprovider in P. Recently, it has been ..."
to understand but not scalable to large datasets due to its relatively high time/space complexity. SwapChain, which follows a fundamentally different idea from ThresholdAdapt, runs faster than ThresholdAdapt by orders of magnitude and uses significantly less memory. We conducted extensive empirical studies
Fibonacci Heaps and Their Uses in Improved Network Optimization Algorithms
, 1987
"... In this paper we develop a new data structure for implementing heaps (priority queues). Our structure, Fibonacci heaps (abbreviated Fheaps), extends the binomial queues proposed by Vuillemin and studied further by Brown. Fheaps support arbitrary deletion from an nitem heap in qlogn) amortized tim ..."
Cited by 739 (18 self)
time and all other standard heap operations in O(1) amortized time. Using F-heaps we are able to obtain improved running times for several network optimization algorithms. In particular, we obtain the following worst-case bounds, where n is the number of vertices and m the number of edges
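The heap bounds quoted in this entry are what drive the improved graph-algorithm running times. As an illustrative sketch only (not the paper's data structure): below is Dijkstra's algorithm using Python's `heapq`, which is a binary heap, with lazy deletion of stale entries; the function name `dijkstra` and the adjacency-dict format are our own choices. Replacing the binary heap with an F-heap, whose decrease-key costs O(1) amortized, is what lowers the overall bound from O(m log n) to O(m + n log n).

```python
import heapq

def dijkstra(adj, src):
    """Single-source shortest paths; adj maps u -> [(v, weight), ...]."""
    dist = {src: 0}
    pq = [(0, src)]                      # binary heap of (distance, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue                     # stale entry: lazy deletion stands in
                                         # for the decrease-key an F-heap does in O(1)
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

With an F-heap, each of the n extract-min operations costs O(log n) amortized and each of the up-to-m decrease-key operations costs O(1) amortized, giving the O(m + n log n) bound the abstract alludes to.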
The Complexity of Decentralized Control of Markov Decision Processes
 Mathematics of Operations Research
, 2000
"... We consider decentralized control of Markov decision processes and give complexity bounds on the worstcase running time for algorithms that find optimal solutions. Generalizations of both the fullyobservable case and the partiallyobservable case that allow for decentralized control are described. ..."
Cited by 411 (46 self)
A NEW POLYNOMIAL-TIME ALGORITHM FOR LINEAR PROGRAMMING
 COMBINATORICA
, 1984
"... We present a new polynomialtime algorithm for linear programming. In the worst case, the algorithm requires O(tf'SL) arithmetic operations on O(L) bit numbers, where n is the number of variables and L is the number of bits in the input. The running,time of this algorithm is better than the ell ..."
Cited by 860 (3 self)
Worst-case Optimal Join Algorithms
 PODS'12
, 2012
"... Efficient join processing is one of the most fundamental and wellstudied tasks in database research. In this work, we examine algorithms for natural join queries over many relations and describe a novel algorithm to process these queries optimally in terms of worstcase data complexity. Our result b ..."
Cited by 7 (2 self)
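This entry concerns worst-case optimal join processing. The sketch below is not the paper's algorithm; it only illustrates the underlying attribute-at-a-time idea on the triangle query R(a,b) ⋈ S(b,c) ⋈ T(a,c): bind one variable at a time by intersecting the candidate sets from every relation that mentions it, instead of joining two relations at a time. All names (`triangle_join`, the index dicts) are ours.

```python
from collections import defaultdict

def triangle_join(R, S, T):
    """Enumerate (a, b, c) with (a,b) in R, (b,c) in S, (a,c) in T.

    R, S, T are sets of pairs. Variables are bound one at a time via
    set intersection, the idea that worst-case optimal joins formalize.
    """
    Ra, Sb, Ta = defaultdict(set), defaultdict(set), defaultdict(set)
    for a, b in R:
        Ra[a].add(b)
    for b, c in S:
        Sb[b].add(c)
    for a, c in T:
        Ta[a].add(c)
    out = []
    for a in Ra.keys() & Ta.keys():      # a must occur in both R and T
        for b in Sb.keys() & Ra[a]:      # b must occur in S and pair with a in R
            for c in Sb[b] & Ta[a]:      # c must pair with b in S and with a in T
                out.append((a, b, c))
    return out
```

On triangle instances, a pairwise join of any two relations first can produce an intermediate result far larger than the final output; intersecting per attribute avoids that blow-up, which is the intuition behind the worst-case data-complexity guarantee the abstract describes.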
A volumetric method for building complex models from range images
 Proceedings of the 23rd annual conference on Computer graphics and interactive techniques, ACM
, 1996
"... Abstract A number of techniques have been developed for reconstructing surfaces by integrating groups of aligned range images. A desirable set of properties for such algorithms includes: incremental updating, representation of directional uncertainty, the ability to fill gaps in the reconstruction, ..."
Cited by 1020 (17 self)
with one range image at a time, we first scan-convert it to a distance function, then combine this with the data already acquired using a simple additive scheme. To achieve space efficiency, we employ a run-length encoding of the volume. To achieve time efficiency, we resample the range image to align
The worst-case time complexity for generating all maximal cliques
 COCOON, Lecture Notes in Computer Science
, 2004
"... Abstract We present a depthfirst search algorithm for generating all maximal cliques of an undirected graph, in which pruning methods are employed as in the BronKerbosch algorithm. All the maximal cliques generated are output in a treelike form. Subsequently, we prove that its worstcase time co ..."
Cited by 82 (1 self)
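As background for this entry, here is a minimal Python sketch of the Bron-Kerbosch algorithm with pivot-based pruning, the technique the abstract says its algorithm builds on. The function name `bron_kerbosch` and the adjacency-dict representation are ours; the paper's algorithm adds further pruning and the tree-form output, which are not reproduced here.

```python
def bron_kerbosch(R, P, X, adj, out):
    """Append every maximal clique extending R to out.

    R: vertices in the current clique.
    P: candidate vertices adjacent to everything in R.
    X: vertices already covered by earlier branches (prevents duplicates).
    adj: dict mapping each vertex to its set of neighbors.
    """
    if not P and not X:
        out.append(set(R))               # R cannot be extended: it is maximal
        return
    # Pivot pruning: pick u maximizing |P & adj[u]|, then branch only on
    # candidates NOT adjacent to u -- neighbors of u are covered recursively.
    u = max(P | X, key=lambda v: len(P & adj[v]))
    for v in list(P - adj[u]):
        bron_kerbosch(R | {v}, P & adj[v], X & adj[v], adj, out)
        P.remove(v)                      # v is done; move it from P to X
        X.add(v)

# usage: a triangle {0,1,2} with a pendant vertex 3 attached to 2
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
cliques = []
bron_kerbosch(set(), set(adj), set(), adj, cliques)
# cliques holds the two maximal cliques: {0, 1, 2} and {2, 3}
```

The worst-case bound the abstract refers to is tight in the sense that an n-vertex graph can contain up to 3^(n/3) maximal cliques (the Moon-Moser bound), so any enumeration algorithm needs exponential time in the worst case.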
WORST-CASE OPTIMAL REASONING WITH FOREST LOGIC PROGRAMS
, 2011
"... This report describes a new worstcase optimal tableau algorithm for reasoning with Forest Logic Programs (FoLPs), a decidable fragment of Open Answer Set Programming. FoLPs are a useful device for tight integration of the Description Logic and the Logic Programming worlds: reasoning with the DL SHO ..."
Cited by 1 (1 self)
SHOQ can be simulated within the fragment. The algorithm improves on previous results concerning reasoning with the fragment by decreasing the worst-case running time by one exponential level. The decrease in complexity is mainly due to the usage of a new caching rule, whose introduction is highly
Opportunities and challenges for better than worst-case design
 In Asia and South Pacific Design Automation Conference
, 2005
"... The progressive trend of fabrication technologies towards the nanometer regime has created a number of new physical design challenges for computer architects. Design complexity, uncertainty in environmental and fabrication conditions, and singleevent upsets all conspire to compromise system correct ..."
Cited by 47 (7 self)
optimization (TCO) techniques to an adder circuit. Finally, we discuss the challenges and opportunities posed to CAD tools in the context of Better Than Worst-Case design. In particular, we describe the additional support required for analyzing runtime characteristics of designs and the many opportunities