Results 1 - 10 of 435
Cluster Ensembles - A Knowledge Reuse Framework for Combining Multiple Partitions
Journal of Machine Learning Research, 2002
"... This paper introduces the problem of combining multiple partitionings of a set of objects into a single consolidated clustering without accessing the features or algorithms that determined these partitionings. We first identify several application scenarios for the resultant 'knowledge reuse' framework that we call cluster ensembles. The cluster ensemble problem is then formalized as a combinatorial optimization problem in terms of shared mutual information. In addition to a direct maximization approach, we propose three effective and efficient techniques for obtaining high-quality combiners ..."
Cited by 603 (20 self)
SPADE: An efficient algorithm for mining frequent sequences
Machine Learning, 2001
"... Abstract. In this paper we present SPADE, a new algorithm for fast discovery of Sequential Patterns. The existing solutions to this problem make repeated database scans, and use complex hash structures which have poor locality. SPADE utilizes combinatorial properties to decompose the original problem ..."
Cited by 437 (16 self)
Coalitions Among Computationally Bounded Agents
Artificial Intelligence, 1997
"... This paper analyzes coalitions among self-interested agents that need to solve combinatorial optimization problems to operate efficiently in the world. By colluding (coordinating their actions by solving a joint optimization problem) the agents can sometimes save costs compared to operating individually ..."
Cited by 203 (26 self)
Combinatorial problems in high-performance computing
Presentation at Dagstuhl Seminar on Combinatorial Scientific Computing (09061), 2009
"... Partitioning is of fundamental importance in high-performance computing: partitioning the data and the associated computational work in an optimal manner leads to good load balance and minimal communication in parallel computations on modern architectures. Often, the computation is irregular and the data set is described by a sparse matrix, a graph, or a hypergraph. This results in a combinatorial partitioning problem. Here, we will focus on partitioning a sparse matrix for parallel computation as the core problem. We survey various partitioning methods for the parallel computation of a sparse ..."
Cited by 1 (0 self)
Optimization by Simulated . . . Number Partitioning
1991
"... This is the second in a series of three papers that empirically examine the competitiveness of simulated annealing in certain well-studied domains of combinatorial optimization. Simulated annealing is a randomized technique proposed by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi for improving loca ..."
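The snippet above describes simulated annealing applied to number partitioning. A minimal illustrative sketch (not the paper's implementation; the move rule, cooling schedule, and parameter values here are assumptions for demonstration) might look like:

```python
import math
import random

def anneal_partition(nums, steps=20000, t0=10.0, cooling=0.9995, seed=0):
    """Simulated annealing for number partitioning: split nums into two
    sets so the difference of their sums is as small as possible.
    State: one +/-1 sign per number; objective: |signed sum|."""
    rng = random.Random(seed)
    signs = [rng.choice((-1, 1)) for _ in nums]
    diff = abs(sum(s * x for s, x in zip(signs, nums)))
    best = diff
    t = t0
    for _ in range(steps):
        i = rng.randrange(len(nums))  # propose a move: flip one sign
        new_diff = abs(sum(s * x for s, x in zip(signs, nums))
                       - 2 * signs[i] * nums[i])
        # Metropolis rule: always accept improvements,
        # accept worse moves with probability exp(-increase / temperature)
        if new_diff <= diff or rng.random() < math.exp((diff - new_diff) / t):
            signs[i] = -signs[i]
            diff = new_diff
            best = min(best, diff)
        t *= cooling  # geometric cooling schedule
    return best

# [4, 5, 6, 7, 8] admits a perfect partition: {4, 5, 6} vs {7, 8}, difference 0.
print(anneal_partition([4, 5, 6, 7, 8]))
```

The high-temperature phase accepts most moves and explores broadly; as the temperature decays, the search increasingly behaves like greedy local improvement.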
The Gibbs Cloner for Combinatorial Optimization, Counting and Sampling
2008
"... We present a randomized algorithm, called the cloning algorithm, for approximating the solutions of quite general NP-hard combinatorial optimization problems, counting, rare-event estimation and uniform sampling on complex regions. Similar to the algorithms of Diaconis–Holmes–Ross and Botev–Kroese ..."
Cited by 16 (6 self)
Using Prediction to Improve Combinatorial Optimization Search
In Proc. of 6th Int'l Workshop on Artificial Intelligence and Statistics, 1997
"... To appear in AISTATS97. This paper describes a statistical approach to improving the performance of stochastic search algorithms for optimization. Given a search algorithm A, we learn to predict the outcome of A as a function of state features along a search trajectory. Predictions are made by a fun ..."
Cited by 20 (1 self)
"... produced very promising results on two large-scale domains. The problem of combinatorial optimization is simply stated: given a finite state space X and an objective function f : X → R, find an optimal state x* = argmin_{x ∈ X} f(x). Typically, X is huge, and finding an optimal x ..."
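The statement of combinatorial optimization in this fragment (find x* = argmin over a finite state space X of f(x)) can be made concrete with a brute-force sketch. This is only feasible when X is small, which is exactly the point of the snippet's remark that X is typically huge; the state space and objective below are invented for illustration:

```python
from itertools import product

def argmin_state(f, states):
    """Exhaustive search: return a state x* minimizing f over a finite space X."""
    return min(states, key=f)

# Toy X: all 0/1 vectors of length 3; f counts mismatches against a target.
target = (1, 0, 1)
f = lambda x: sum(a != b for a, b in zip(x, target))
print(argmin_state(f, product((0, 1), repeat=3)))  # → (1, 0, 1)
```

For |X| = 2^n this scan takes exponential time, which is why stochastic search (and the prediction-guided variant the paper proposes) is used instead.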
Combinatorial structures in nonlinear programming
Operations Research, 2004
"... Nonsmoothness and nonconvexity in optimization problems often arise because a combinatorial structure is imposed on smooth or convex data. The combinatorial aspect can be explicit, e.g. through the use of "max", "min", or "if" statements in a model, or implicit as in the case of bilevel optimization ..."
Cited by 3 (0 self)
Accelerated Bayesian Optimization Algorithms for Advanced Hypergraph Partitioning
2003
"... The paper summarizes our recent work on the design, analysis and applications of the Bayesian optimization algorithm (BOA) and its advanced accelerated variants for solving complex – sometimes NP-complete – combinatorial optimization problems from circuit design. We review the methods for accelerating BOA for the hypergraph-partitioning problem. The first method accelerates the convergence of sequential BOA by utilizing specific knowledge about the optimized problem and the second method is based on the parallel construction of a probabilistic model. In the experimental part we analyze ..."
Free Energy for a Class of Combinatorial Optimization Problems and its Asymptotics
2015
"... The free energy, that is the expected log-partition function, carries much information about a combinatorial optimization problem. We believe in particular that it might help evaluate the robustness of an optimization procedure to noise in the data. In this context, we study the asymptotic behavior ..."