Results 1–10 of 7,328
Clique Partitions, Graph Compression and Speeding-up Algorithms
Journal of Computer and System Sciences, 1991
"... We first consider the problem of partitioning the edges of a graph G into bipartite cliques such that the total order of the cliques is minimized, where the order of a clique is the number of vertices in it. It is shown that the problem is NPcomplete. We then prove the existence of a partition of s ..."
Abstract

Cited by 88 (3 self)
 Add to MetaCart
of small total order in a sufficiently dense graph and devise an efficient algorithm to compute such a partition. It turns out that our algorithm exhibits a tradeoff between the total order of the partition and the running time. Next, we define the notion of a compression of a graph G and use the result
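The compression idea in this abstract can be made concrete: replacing the edges of a complete bipartite subgraph by a star through one auxiliary vertex turns a·b edges into a+b. A minimal sketch (function and vertex names are mine, not the paper's):

```python
# Sketch of graph compression via a bipartite clique (biclique): the
# a*b edges of a complete bipartite subgraph on vertex sets A and B
# are rerouted through one auxiliary vertex, leaving only a+b edges.
# All names here are illustrative, not taken from the paper.

def compress_biclique(edges, A, B, aux):
    """Remove the A x B biclique edges and connect every vertex of A
    and B to the new auxiliary vertex `aux` instead."""
    biclique = {(a, b) for a in A for b in B}
    remaining = set(edges) - biclique
    star = {(a, aux) for a in A} | {(aux, b) for b in B}
    return remaining | star

# K_{3,3}: 3 * 3 = 9 edges shrink to 3 + 3 = 6.
A, B = {1, 2, 3}, {4, 5, 6}
edges = {(a, b) for a in A for b in B}
compressed = compress_biclique(edges, A, B, aux="x")
print(len(edges), len(compressed))  # 9 6
```

Adjacency through the biclique is preserved as a two-edge path a–aux–b, which is what makes such a compressed graph usable for speeding up algorithms that scan adjacency lists.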
On effective procedures for speeding up algorithms
1971
"... This paper is concerned with the nature of speedups. Let f be any recursive function. We show that there is no effective procedure for going from an algorithm for f to another algorithm for f that is significantly faster on all but a finite number of inputs. On the other hand, for a large class of ..."
Cited by 23 (0 self)
Speeding up Algorithms of the SOM Family for Large and High-Dimensional Databases
In Workshop on Self-Organizing Maps (WSOM'03), 2003
"... In this paper, spatial access methods such as the R-Tree and the kd-Tree are used to index data and thereby speed up the training process and performance of data analysis methods whose learning algorithms are a form of competitive learning. Often, the search for the winning neuron is performed sequentially ..."
Cited by 6 (0 self)
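The sequential winner search this abstract refers to is a nearest-neighbour query over the neurons' weight vectors. The sketch below contrasts the brute-force scan with a simple uniform-grid index in 2-D; the paper uses R-Trees and kd-Trees, so the grid is only a stand-in for the spatial-indexing idea, and all names are mine.

```python
import math

def winner_brute(weights, x):
    """Baseline winner search: scan every neuron; O(N) distances."""
    return min(range(len(weights)), key=lambda i: math.dist(weights[i], x))

def build_grid(weights, cell):
    """Bucket 2-D weight vectors into square cells of side `cell`."""
    grid = {}
    for i, (wx, wy) in enumerate(weights):
        grid.setdefault((int(wx // cell), int(wy // cell)), []).append(i)
    return grid

def winner_grid(grid, weights, x, cell):
    """Exact winner search over the grid: scan cells ring by ring
    outward from the query's cell. Any point outside rings 0..r is at
    least r*cell away, so we can stop once the best distance found is
    <= r*cell. Assumes a non-empty map, so the loop terminates."""
    cx, cy = int(x[0] // cell), int(x[1] // cell)
    best, best_d, r = None, float("inf"), 0
    while True:
        for dx in range(-r, r + 1):
            for dy in range(-r, r + 1):
                if max(abs(dx), abs(dy)) != r:  # only the ring itself
                    continue
                for i in grid.get((cx + dx, cy + dy), ()):
                    d = math.dist(weights[i], x)
                    if d < best_d:
                        best, best_d = i, d
        if best is not None and best_d <= r * cell:
            return best
        r += 1

import random
random.seed(1)
som = [(random.random(), random.random()) for _ in range(400)]
grid = build_grid(som, 0.1)
q = (0.37, 0.81)
assert winner_grid(grid, som, q, 0.1) == winner_brute(som, q)
```

A well-tuned cell size makes each query touch only a handful of buckets instead of all N neurons, which is the same effect the paper obtains with R-Tree and kd-Tree indexes.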
Speeding Up Algorithmic Debugging Using Balanced Execution Trees
"... Abstract. Algorithmic debugging is a debugging technique that uses a data structure representing all computations performed during the execution of a program. This data structure is the socalled Execution Tree and it strongly influences the performance of the technique. In this work we present a t ..."
Speeding-up Reinforcement Learning with
In Proceedings of the Twelfth International Conference on Artificial Neural Networks (ICANN), Lecture Notes in Computer Science (LNCS) 2415, 2001
"... In recent years hierarchical concepts of temporal abstraction have been integrated in the reinforcement learning framework to improve scalability. However, existing approaches are limited to domains where a decomposition into subtasks is known a priori. In this paper we propose the concept of explic ..."
Abstract
 Add to MetaCart
of explicitly selecting time scale related actions if no subgoalrelated abstract actions are available. This is realised with multistep actions on dierent time scales that are combined in one single action set. The special structure of the action set is exploited in the MSAQ learning algorithm. By learning
Reasoning the fast and frugal way: Models of bounded rationality.
Psychological Review, 1996
"... Humans and animals make inferences about the world under limited time and knowledge. In contrast, many models of rational inference treat the mind as a Laplacean Demon, equipped with unlimited time, knowledge, and computational might. Following H. Simon's notion of satisncing, the authors have ..."
Abstract

Cited by 611 (30 self)
 Add to MetaCart
have proposed a family of algorithms based on a simple psychological mechanism: onereason decision making. These fast and frugal algorithms violate fundamental tenets of classical rationality: They neither look up nor integrate all information. By computer simulation, the authors held a competition
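One-reason decision making can be made concrete with the paper's Take The Best heuristic: cues are tried in order of validity, and the first cue that discriminates between the options decides; nothing further is looked up or integrated. The cue values below are invented for illustration.

```python
# One-reason decision making in the spirit of Take The Best: walk the
# cues in validity order; the first cue on which exactly one option is
# positive decides. Cue values: 1 = positive, 0 = negative, None = unknown.
# The cue data below is invented, not the paper's.

def take_the_best(cues_a, cues_b, cue_order):
    """Decide which of two options scores higher on the criterion."""
    for cue in cue_order:                 # cues ranked by validity
        va, vb = cues_a.get(cue), cues_b.get(cue)
        if va == 1 and vb != 1:           # cue discriminates: stop here
            return "a"
        if vb == 1 and va != 1:
            return "b"
    return "guess"                        # no cue discriminated

# Which of two cities is larger? The second cue settles it; the third
# is never examined -- that is the "frugal" part.
order = ["is_capital", "has_exposition_site", "has_soccer_team"]
a = {"is_capital": 0, "has_exposition_site": 1, "has_soccer_team": 1}
b = {"is_capital": 0, "has_exposition_site": 0, "has_soccer_team": 1}
print(take_the_best(a, b, order))  # a
```

The early stop is exactly the violation of classical rationality the abstract describes: the heuristic neither consults all cues nor weighs them against each other.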
Efficient belief propagation for early vision
In CVPR, 2004
"... Markov random field models provide a robust and unified framework for early vision problems such as stereo, optical flow and image restoration. Inference algorithms based on graph cuts and belief propagation yield accurate results, but despite recent advances are often still too slow for practical u ..."
Abstract

Cited by 515 (8 self)
 Add to MetaCart
is important for problems such as optical flow or image restoration that have a large label set. A second technique makes it possible to obtain good results with a small fixed number of message passing iterations, independent of the size of the input images. Taken together these techniques speed up
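For intuition, on a chain-structured MRF min-sum belief propagation reduces to the Viterbi recursion sketched below. This is only the model family the paper starts from; its actual contributions (linear-time message computation over large label sets, a coarse-to-fine hierarchy, a bipartite update schedule) are not shown, and all costs here are invented.

```python
def chain_min_sum(unary, pairwise):
    """Exact MAP labeling of a chain MRF by min-sum message passing
    (equivalently, the Viterbi recursion).
    unary[i][l]   : data cost of label l at node i
    pairwise[k][l]: smoothness cost between adjacent labels k, l"""
    n, L = len(unary), len(unary[0])
    # forward messages: m[i][l] = min cost of nodes 0..i with node i = l
    m = [list(unary[0])]
    for i in range(1, n):
        prev = m[-1]
        m.append([unary[i][l]
                  + min(prev[k] + pairwise[k][l] for k in range(L))
                  for l in range(L)])
    # backtrack the minimising labels
    labels = [min(range(L), key=lambda l: m[-1][l])]
    for i in range(n - 1, 0, -1):
        right = labels[-1]
        labels.append(min(range(L),
                          key=lambda k: m[i - 1][k] + pairwise[k][right]))
    labels.reverse()
    return labels

# Toy restoration: 5 pixels, 2 labels, Potts smoothness; the noisy
# middle pixel is overruled because flipping it would cost 2 in
# smoothness but only 1 in data cost.
unary = [[0, 2], [0, 2], [1, 0], [0, 2], [0, 2]]
potts = [[0, 1], [1, 0]]
print(chain_min_sum(unary, potts))  # [0, 0, 0, 0, 0]
```

On 2-D grids the same message-passing updates are no longer exact and must be iterated, which is where the paper's speed-up techniques apply.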
A scaled conjugate gradient algorithm for fast supervised learning
Neural Networks, 1993
"... A supervised learning algorithm (Scaled Conjugate Gradient, SCG) with superlinear convergence rate is introduced. The algorithm is based upon a class of optimization techniques well known in numerical analysis as the Conjugate Gradient Methods. SCG uses second order information from the neural netwo ..."
Cited by 451 (0 self)
"... Fletcher-Goldfarb-Shanno memoryless quasi-Newton algorithm (BFGS) [1]. SCG yields a speedup of at least an order of magnitude relative to BP. The speedup depends on the convergence criterion: the greater the demanded reduction in error, the greater the speedup. SCG is fully automated, including no user-dependent parameters ..."
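The conjugate-gradient core that SCG builds on can be shown on a quadratic error surface f(w) = ½wᵀAw − bᵀw, whose gradient is Aw − b. The sketch below is plain linear conjugate gradient with a Fletcher-Reeves-style update, not Møller's SCG, which adds a Levenberg-Marquardt-style scaling of the step size; the example system is mine.

```python
# Minimal conjugate gradient on a quadratic: minimise 0.5*x'Ax - b'x,
# i.e. solve Ax = b, given only a Hessian-vector product `matvec`.
# This is the generic CG iteration underlying SCG, not SCG itself.

def conjugate_gradient(matvec, b, x0, iters):
    x = list(x0)
    r = [bi - ai for bi, ai in zip(b, matvec(x))]  # residual = -gradient
    p = list(r)                                    # first search direction
    for _ in range(iters):
        Ap = matvec(p)
        rr = sum(ri * ri for ri in r)
        if rr == 0:
            break
        alpha = rr / sum(pi * api for pi, api in zip(p, Ap))  # exact line search
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        beta = sum(ri * ri for ri in r) / rr       # Fletcher-Reeves update
        p = [ri + beta * pi for ri, pi in zip(r, p)]
    return x

# 2x2 symmetric positive definite system; CG is exact in <= 2 steps.
A = [[4.0, 1.0], [1.0, 3.0]]
mv = lambda v: [sum(a * vi for a, vi in zip(row, v)) for row in A]
x = conjugate_gradient(mv, [1.0, 2.0], [0.0, 0.0], 2)
print([round(v, 4) for v in x])  # [0.0909, 0.6364], i.e. (1/11, 7/11)
```

For an n-dimensional quadratic, CG terminates in at most n steps; on the non-quadratic error surfaces of neural networks the directions must be recomputed periodically, which is the setting SCG's scaling mechanism addresses.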
Tandem repeats finder: a program to analyze DNA sequences
1999
"... A tandem repeat in DNA is two or more contiguous, approximate copies of a pattern of nucleotides. Tandem repeats have been shown to cause human disease, may play a variety of regulatory and evolutionary roles and are important laboratory and analytic tools. Extensive knowledge about pattern size, co ..."
Cited by 961 (9 self)
"... repeats by percent identity and frequency of indels between adjacent pattern copies and use statistically based recognition criteria. We demonstrate the algorithm's speed and its ability to detect tandem repeats that have undergone extensive mutational change by analyzing four sequences: the human ..."
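As a baseline for the problem TRF solves, exact tandem repeats can be found by direct string comparison; TRF's contribution is doing this probabilistically while tolerating the mismatches and indels that this sketch does not attempt. Function name and sequence are mine.

```python
# Naive exact tandem-repeat scan: for each candidate period, extend a
# run while the next window repeats the previous one verbatim. Real
# biological repeats are approximate, which is why TRF uses
# statistically based recognition criteria instead.

def exact_tandem_repeats(seq, max_period=10):
    """Return (start, period, copies) for runs of >= 2 exact adjacent
    copies of a pattern, reporting each run once per period."""
    out = []
    for period in range(1, max_period + 1):
        i = 0
        while i + 2 * period <= len(seq):
            copies, j = 1, i
            while (j + 2 * period <= len(seq)
                   and seq[j:j + period] == seq[j + period:j + 2 * period]):
                copies += 1
                j += period
            if copies >= 2:
                out.append((i, period, copies))
                i = j + period   # skip past this run
            else:
                i += 1
    return out

# "CA" repeated 4 times starting at index 3 of a made-up sequence.
print(exact_tandem_repeats("GATCACACACAGT", max_period=3))  # [(3, 2, 4)]
```

The scan is O(n · max_period) window comparisons, fine for a toy but far from the genome-scale speed the paper demonstrates.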