
CiteSeerX

Results 1 - 10 of 102,474

A Data Locality Optimizing Algorithm

by Michael E. Wolf, Monica S. Lam, 1991
"... This paper proposes an algorithm that improves the locality of a loop nest by transforming the code via interchange, reversal, skewing and tiling. The loop transformation algorithm is based on two concepts: a mathematical formulation of reuse and locality, and a loop transformation theory that unifies the various transforms as unimodular matrix transformations. The algorithm has been implemented in the SUIF (Stanford University Intermediate Format) compiler, and is successful in optimizing codes such as matrix multiplication, successive over-relaxation (SOR), LU decomposition without pivoting ..."
Cited by 804 (16 self)
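
A minimal Python sketch of the tiling transformation, applied to matrix multiplication (one of the codes the abstract names). This illustrates the idea only, not the paper's unimodular framework; N and B are illustrative values, and the cache story assumes row-major storage.

N, B = 256, 32  # matrix size and tile (block) size; illustrative values

def matmul_naive(A, X, C):
    # i-j-k order: for fixed (i, j) the innermost k loop reads X[k][j],
    # touching a different row of X on every iteration, so each access
    # lands far from the previous one when N is large.
    for i in range(N):
        for j in range(N):
            s = 0.0
            for k in range(N):
                s += A[i][k] * X[k][j]
            C[i][j] += s

def matmul_tiled(A, X, C):
    # Tiled version: work on B x B submatrices so each block of X is
    # reused many times while it still fits in the faster levels of
    # the memory hierarchy.
    for ii in range(0, N, B):
        for jj in range(0, N, B):
            for kk in range(0, N, B):
                for i in range(ii, min(ii + B, N)):
                    for k in range(kk, min(kk + B, N)):
                        a = A[i][k]
                        for j in range(jj, min(jj + B, N)):
                            C[i][j] += a * X[k][j]

Both functions compute C += A * X over N x N nested lists; only the traversal order, and hence the locality, differs.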

Fibonacci Heaps and Their Uses in Improved Network Optimization Algorithms

by Michael L. Fredman, Robert Endre Tarjan, 1987
"... In this paper we develop a new data structure for implementing heaps (priority queues). Our structure, Fibonacci heaps (abbreviated F-heaps), extends the binomial queues proposed by Vuillemin and studied further by Brown. F-heaps support arbitrary deletion from an n-item heap in O(log n) amortized time and all other standard heap operations in O(1) amortized time. Using F-heaps we are able to obtain improved running times for several network optimization algorithms. In particular, we obtain the following worst-case bounds, where n is the number of vertices and m the number of edges ..."
Cited by 739 (18 self)
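
The classic beneficiary of the O(1) amortized decrease-key is Dijkstra's algorithm, which F-heaps improve to O(m + n log n). A minimal sketch follows; since Python's standard library has no Fibonacci heap, it substitutes heapq (a binary heap) with lazy deletion for the decrease-key step, which is the usual stand-in.

import heapq

def dijkstra(adj, src):
    # adj: {u: [(v, w), ...]} with every vertex present as a key.
    # With an F-heap the decrease-key step is O(1) amortized, giving
    # O(m + n log n) overall; here stale queue entries are skipped
    # instead of decreased.
    dist = {u: float("inf") for u in adj}
    dist[src] = 0.0
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue  # stale entry, skip (stands in for decrease-key)
        for v, w in adj[u]:
            nd = d + w
            if nd < dist[v]:
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist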

Hierarchical Bayesian Optimization Algorithm = Bayesian Optimization Algorithm + Niching + Local Structures

by Martin Pelikan, David E. Goldberg, 2001
"... The paper describes the hierarchical Bayesian optimization algorithm which combines the Bayesian optimization algorithm, local structures in Bayesian networks, and a powerful niching technique. The proposed algorithm is able to solve hierarchical traps and other difficult problems very efficiently. ..."
Cited by 329 (70 self)
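
The select, model, sample loop that BOA-family algorithms share can be sketched compactly. This is a deliberately simplified stand-in: a univariate (UMDA-style) model replaces the Bayesian network with local structures, truncation selection replaces the paper's niching technique, and the test problem (OneMax) and all parameter values are illustrative.

import random

def eda_onemax(n=40, pop_size=100, gens=50):
    # Skeleton of the select -> model -> sample loop. Real (h)BOA fits
    # a Bayesian network with local structures and preserves diversity
    # with restricted tournament replacement; this sketch uses a
    # univariate probability vector instead.
    fitness = lambda x: sum(x)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                       # selection
        p = [sum(x[i] for x in parents) / len(parents)       # model
             for i in range(n)]
        pop = [[1 if random.random() < p[i] else 0           # sampling
                for i in range(n)] for _ in range(pop_size)]
    return max(pop, key=fitness)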

An Optimal Algorithm for Approximate Nearest Neighbor Searching in Fixed Dimensions

by Sunil Arya, David M. Mount, Nathan S. Netanyahu, Ruth Silverman, Angela Y. Wu - ACM-SIAM SYMPOSIUM ON DISCRETE ALGORITHMS, 1994
"... Consider a set S of n data points in real d-dimensional space, R d , where distances are measured using any Minkowski metric. In nearest neighbor searching we preprocess S into a data structure, so that given any query point q 2 R d , the closest point of S to q can be reported quickly. Given any po ..."
Abstract - Cited by 984 (32 self) - Add to MetaCart
Consider a set S of n data points in real d-dimensional space, R d , where distances are measured using any Minkowski metric. In nearest neighbor searching we preprocess S into a data structure, so that given any query point q 2 R d , the closest point of S to q can be reported quickly. Given any positive real ffl, a data point p is a (1 + ffl)-approximate nearest neighbor of q if its distance from q is within a factor of (1 + ffl) of the distance to the true nearest neighbor. We show that it is possible to preprocess a set of n points in R d in O(dn log n) time and O(dn) space, so that given a query point q 2 R d , and ffl ? 0, a (1 + ffl)-approximate nearest neighbor of q can be computed in O(c d;ffl log n) time, where c d;ffl d d1 + 6d=ffle d is a factor depending only on dimension and ffl. In general, we show that given an integer k 1, (1 + ffl)-approximations to the k nearest neighbors of q can be computed in additional O(kd log n) time.
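
A brute-force sketch that makes the (1 + ε) guarantee concrete. The scan below costs O(dn) per query; the point of the paper is a balanced box-decomposition tree that answers the same query in O(c_{d,ε} log n) time after preprocessing. The Euclidean metric stands in for the general Minkowski case, and names are illustrative.

import math

def is_approx_nn(points, q, p, eps):
    # p is a (1 + eps)-approximate nearest neighbor of q when
    # dist(q, p) <= (1 + eps) * dist(q, true nearest neighbor).
    d_true = min(math.dist(q, s) for s in points)
    return math.dist(q, p) <= (1 + eps) * d_true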

Optimal Aggregation Algorithms for Middleware

by Ronald Fagin, Amnon Lotem, Moni Naor - In PODS, 2001
"... Assume that each object in a database has m grades, or scores, one for each of m attributes. For example, an object can have a color grade, that tells how red it is, and a shape grade, that tells how round it is. For each attribute, there is a sorted list, which lists each object and its grade under ..."
Abstract - Cited by 717 (4 self) - Add to MetaCart
must access every object in the database, to find its grade under each attribute. Fagin has given an algorithm (“Fagin’s Algorithm”, or FA) that is much more efficient. For some monotone aggregation functions, FA is optimal with high probability in the worst case. We analyze an elegant and remarkably
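
The truncated last sentence refers to the threshold algorithm (TA) analyzed in the published paper. The sketch below follows the standard description of TA, assuming equal-length lists and a monotone aggregation function such as min; all names are illustrative.

def ta_topk(lists, grade, agg, k):
    # lists: m lists of object ids, each sorted by that attribute's
    #        grade, highest first (sorted access).
    # grade(obj, attr): random access to any single grade.
    # agg: monotone aggregation function, e.g. min or a weighted mean.
    m, n = len(lists), len(lists[0])
    seen = {}
    for depth in range(n):
        for attr in range(m):          # one sorted access per list
            obj = lists[attr][depth]
            if obj not in seen:        # random-access its other grades
                seen[obj] = agg([grade(obj, a) for a in range(m)])
        # Threshold: by monotonicity of agg, no unseen object can
        # score above the aggregate of the grades at this depth.
        tau = agg([grade(lists[a][depth], a) for a in range(m)])
        top = sorted(seen.items(), key=lambda kv: kv[1], reverse=True)[:k]
        if len(top) == k and top[-1][1] >= tau:
            return top
    return sorted(seen.items(), key=lambda kv: kv[1], reverse=True)[:k]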

Ant algorithms for discrete optimization

by Marco Dorigo, Gianni Di Caro, Luca M. Gambardella - ARTIFICIAL LIFE, 1999
"... This article presents an overview of recent work on ant algorithms, that is, algorithms for discrete optimization that took inspiration from the observation of ant colonies’ foraging behavior, and introduces the ant colony optimization (ACO) metaheuristic. In the first part of the article the basic ..."
Cited by 489 (42 self)
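
A minimal Ant System style sketch on the traveling salesman problem, the benchmark on which ant algorithms were first demonstrated. It compresses the ACO metaheuristic to probabilistic tour construction plus pheromone evaporation and deposit; all parameter values are illustrative.

import random

def aco_tsp(dist, n_ants=20, iters=100, alpha=1.0, beta=2.0, rho=0.5, q=1.0):
    # dist: symmetric matrix of positive city distances.
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]          # pheromone trails
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            tour = [random.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                # Edge choice weighted by pheromone and inverse distance.
                w = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta
                     for j in cand]
                tour.append(random.choices(cand, weights=w)[0])
            length = sum(dist[tour[t]][tour[(t + 1) % n]] for t in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # Evaporate, then let each ant deposit pheromone on its tour.
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for tour, length in tours:
            for t in range(n):
                i, j = tour[t], tour[(t + 1) % n]
                tau[i][j] += q / length
                tau[j][i] += q / length
    return best_tour, best_len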

Dynamic programming algorithm optimization for spoken word recognition

by Hiroaki Sakoe, Seibi Chiba - IEEE TRANSACTIONS ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 1978
"... This paper reports on an optimum dynamic programming (DP) based time-normalization algorithm for spoken word recognition. First, a general principle of time-normalization is given using timewarping function. Then, two time-normalized distance definitions, ded symmetric and asymmetric forms, are der ..."
Abstract - Cited by 788 (3 self) - Add to MetaCart
words in different The effective slope constraint characteristic is qualitatively analyzed, and the optimum slope constraint condition is determined through experiments. The optimized algorithm is then extensively subjected to experimentat comparison with various DP-algorithms, previously applied
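
The core of DP time-normalization is the dynamic time warping recurrence. The sketch below uses the plain unweighted form with an optional adjustment window; the paper's symmetric form additionally weights diagonal steps double and divides by the combined sequence length, and its slope constraint on consecutive path steps is omitted for brevity.

def dtw(a, b, window=None):
    # Time-normalized distance between sequences a and b by dynamic
    # programming. window is the adjustment-window condition
    # |i - j| <= window that keeps the warping path near the diagonal.
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        lo = 1 if window is None else max(1, i - window)
        hi = m if window is None else min(m, i + window)
        for j in range(lo, hi + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]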

No Free Lunch Theorems for Optimization

by David H. Wolpert, et al., 1997
"... A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving. A number of “no free lunch ” (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset by performan ..."
Abstract - Cited by 961 (10 self) - Add to MetaCart
A framework is developed to explore the connection between effective optimization algorithms and the problems they are solving. A number of “no free lunch ” (NFL) theorems are presented which establish that for any algorithm, any elevated performance over one class of problems is offset
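
In the paper's notation, d_m^y is the sequence of m cost values an algorithm a produces on a cost function f. The central NFL statement is that, summed uniformly over all cost functions, this distribution is identical for any two algorithms a_1 and a_2:

\sum_{f} P(d_m^y \mid f, m, a_1) = \sum_{f} P(d_m^y \mid f, m, a_2)

so any advantage of a_1 over one class of problems is exactly balanced by its behavior on the rest.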

The Cache Performance and Optimizations of Blocked Algorithms

by Monica S. Lam, Edward E. Rothberg, Michael E. Wolf - In Proceedings of the Fourth International Conference on Architectural Support for Programming Languages and Operating Systems, 1991
"... Blocking is a well-known optimization technique for improving the effectiveness of memory hierarchies. Instead of operating on entire rows or columns of an array, blocked algorithms operate on submatrices or blocks, so that data loaded into the faster levels of the memory hierarchy are reused. This ..."
Cited by 574 (5 self)
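
Blocking in one self-contained example: matrix transpose, where the naive loop walks one of the two arrays with stride n and misses cache on nearly every access once n is large. A sketch, with B an illustrative tile size that real implementations tune to the cache:

def transpose_blocked(A, B=32):
    # Copy B x B submatrices so that both the read and the write
    # streams stay within a cache-sized working set, instead of one
    # of them striding across the whole matrix.
    n = len(A)
    T = [[0] * n for _ in range(n)]
    for ii in range(0, n, B):
        for jj in range(0, n, B):
            for i in range(ii, min(ii + B, n)):
                for j in range(jj, min(jj + B, n)):
                    T[j][i] = A[i][j]
    return T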

An Overview of Evolutionary Algorithms in Multiobjective Optimization

by Carlos M. Fonseca, Peter J. Fleming - Evolutionary Computation, 1995
"... The application of evolutionary algorithms (EAs) in multiobjective optimization is currently receiving growing interest from researchers with various backgrounds. Most research in this area has understandably concentrated on the selection stage of EAs, due to the need to integrate vectorial performa ..."
Abstract - Cited by 492 (13 self) - Add to MetaCart
The application of evolutionary algorithms (EAs) in multiobjective optimization is currently receiving growing interest from researchers with various backgrounds. Most research in this area has understandably concentrated on the selection stage of EAs, due to the need to integrate vectorial
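
The vectorial performance measure at issue is Pareto dominance. A minimal sketch of the relation and the resulting non-dominated set, which selection schemes in multiobjective EAs rank against (maximization assumed, names illustrative):

def dominates(u, v):
    # u dominates v when it is at least as good in every objective
    # and strictly better in at least one.
    return (all(a >= b for a, b in zip(u, v))
            and any(a > b for a, b in zip(u, v)))

def pareto_front(points):
    # Non-dominated subset: the candidates no other point dominates.
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]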