Results 1 - 10 of 175
Cluster Ensembles - A Knowledge Reuse Framework for Combining Multiple Partitions
- Journal of Machine Learning Research, 2002
"... This paper introduces the problem of combining multiple partitionings of a set of objects into a single consolidated clustering without accessing the features or algorithms that determined these partitionings. We first identify several application scenarios for the resultant 'knowledge reuse&ap ..."
Abstract - Cited by 603 (20 self)
' framework that we call cluster ensembles. The cluster ensemble problem is then formalized as a combinatorial optimization problem in terms of shared mutual information. In addition to a direct maximization approach, we propose three effective and efficient techniques for obtaining high-quality combiners
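The "combinatorial optimization problem in terms of shared mutual information" mentioned here is usually stated as choosing the consensus labeling that maximizes the average normalized mutual information (NMI) with the r input partitionings; in the standard formulation (reconstructed here, so treat the notation as assumed rather than quoted):

```latex
\hat{\lambda}^{*} \;=\; \arg\max_{\hat{\lambda}} \;\sum_{q=1}^{r} \operatorname{NMI}\!\bigl(\hat{\lambda}, \lambda^{(q)}\bigr)
```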
Scaling MPE Inference for Constrained Continuous Markov Random Fields with Consensus Optimization
"... Probabilistic graphical models are powerful tools for analyzing constrained, continuous domains. However, finding most-probable explanations (MPEs) in these models can be computationally expensive. In this paper, we improve the scalability of MPE inference in a class of graphical models with piecewi ..."
Abstract - Cited by 17 (14 self)
with piecewise-linear and piecewise-quadratic dependencies and linear constraints over continuous domains. We derive algorithms based on a consensus-optimization framework and demonstrate their superior performance over state of the art. We show empirically that in a large-scale voter-preference modeling problem
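The consensus-optimization reformulation behind this line of work (sketched here in generic form, not the paper's exact notation) makes a local copy of the variables used by each potential and forces the copies to agree with the shared globals:

```latex
\min_{\{x_k\},\, z} \;\sum_{k} f_k(x_k)
\quad \text{subject to} \quad x_k = \tilde{z}_k \;\; \text{for all } k
```

Here each potential f_k acts on its own copy x_k, \tilde{z}_k denotes the entries of the global vector z that appear in potential k, and the agreement constraints are handled with augmented-Lagrangian (ADMM-style) updates that can run in parallel over the potentials.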
Constrained consensus and optimization in multi-agent networks
- IEEE Transactions on Automatic Control, 2008
"... We present distributed algorithms that can be used by multiple agents to align their estimates with a particular value over a network with time-varying connectivity. Our framework is general in that this value can represent a consensus value among multiple agents or an optimal solution of an optimiz ..."
Abstract - Cited by 115 (8 self)
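The update usually associated with this line of work (a reconstruction of the standard statement, not a quotation from the paper) projects a weighted average of neighbouring estimates, optionally after a local subgradient step, onto the agent's own constraint set:

```latex
x_i(k+1) \;=\; P_{X_i}\!\Bigl[\,\sum_{j} a_{ij}(k)\, x_j(k) \;-\; \alpha_k\, d_i(k)\Bigr]
```

where a_{ij}(k) are the time-varying mixing weights, d_i(k) is a subgradient of agent i's local objective, and the pure consensus case simply drops the subgradient term.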
Subgradient methods and consensus algorithms for solving convex optimization problems
- Proc. IEEE Conference on Decision and Control, Cancun, 2008
"... Abstract-In this paper we propose a subgradient method for solving coupled optimization problems in a distributed way given restrictions on the communication topology. The iterative procedure maintains local variables at each node and relies on local subgradient updates in combination with a consen ..."
Abstract - Cited by 52 (7 self)
consensus process. The local subgradient steps are applied simultaneously as opposed to the standard sequential or cyclic procedure. We study convergence properties of the proposed scheme using results from consensus theory and approximate subgradient methods. The framework is illustrated on an optimal
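The point the snippet stresses, that the local subgradient steps run simultaneously rather than cyclically, is easy to see in a toy implementation. The following sketch (illustrative weights and step size, not the paper's exact scheme) has every node mix its neighbours' iterates through a doubly stochastic matrix and then take its own subgradient step in the same round:

```python
import numpy as np

def subgradient_consensus(subgrads, W, x0, alpha=0.01, iters=500):
    """Sketch of a combined consensus/subgradient iteration: in every round
    each node mixes its neighbours' iterates through the weight matrix W and
    then takes a local subgradient step on its own objective, all nodes
    simultaneously."""
    x = np.array(x0, dtype=float)                 # x[i] is node i's estimate
    for _ in range(iters):
        mixed = W @ x                             # consensus (averaging) step
        grads = np.array([g(xi) for g, xi in zip(subgrads, mixed)])
        x = mixed - alpha * grads                 # simultaneous local steps
    return x

if __name__ == "__main__":
    # Three nodes on a path graph minimising sum_i (x - b_i)^2; optimum is mean(b).
    b = [1.0, 2.0, 6.0]
    subgrads = [lambda x, bi=bi: 2.0 * (x - bi) for bi in b]
    W = np.array([[2/3, 1/3, 0.0],                # Metropolis weights: symmetric,
                  [1/3, 1/3, 1/3],                # doubly stochastic
                  [0.0, 1/3, 2/3]])
    print(subgradient_consensus(subgrads, W, x0=[0.0, 0.0, 0.0]))
```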
Consensus with Robustness to Outliers via Distributed Optimization
"... Abstract — Over the past few years, a number of distributed algorithms have been developed for integrating the measurements acquired by a wireless sensor network. Among them, average consensus algorithms have drawn significant attention due to a number of practical advantages, such as robustness to ..."
Abstract - Cited by 4 (0 self)
Euclidean (L2) loss function, which is known to be sensitive to outliers. In this paper, we propose a distributed optimization framework that can handle outliers in the measurements. The proposed framework generalizes consensus algorithms to robust loss functions that are strictly convex or convex
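A minimal way to see the robust-loss idea, assuming a Huber penalty as the robust surrogate (the paper treats a broader class of convex and strictly convex losses), is to replace the quadratic data term in a gradient-based averaging scheme:

```python
import numpy as np

def huber_grad(r, delta=1.0):
    """Gradient of the Huber loss: quadratic near zero, linear in the tails,
    so large residuals (outliers) have bounded influence on the update."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def robust_consensus(measurements, W, alpha=0.2, iters=300, delta=1.0):
    """Sketch of outlier-robust distributed averaging (illustrative, not the
    cited paper's algorithm): each node averages with its neighbours via W
    and is pulled toward its own measurement through a Huber penalty rather
    than the squared (L2) loss."""
    z = np.asarray(measurements, dtype=float)      # fixed local measurements
    x = z.copy()                                   # node states
    for _ in range(iters):
        x = W @ x - alpha * huber_grad(x - z, delta)
    return x

if __name__ == "__main__":
    z = [1.0, 1.1, 0.9, 10.0]                      # the last measurement is an outlier
    W = np.full((4, 4), 0.25)                      # complete graph, uniform weights
    print(robust_consensus(z, W))                  # hovers near 1.3, not the mean 3.25
```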
Universal Grammar and second language acquisition
, 1989
"... In this paper, I provide an overview of differing perspectives on the role of Universal Grammar (UG) in second language acquisition (SLA). I will suggest that we must not lose sight of the fact that UG is a theory which provides constraints on linguistic representation. At issue, then, is whether in ..."
Abstract - Cited by 152 (1 self)
consensus that certain properties of language are too abstract, subtle and complex to be learned without postulating innate and specifically linguistic constraints. Much of the work on UG in SLA has been conducted within the GB framework.
Optimal Polynomial Filtering for Accelerating Distributed Consensus
"... Abstract—In the past few years, the problem of distributed consensus has received a lot of attention, particularly in the framework of ad hoc sensor networks. Most methods proposed in the literature attack this problem by distributed linear iterative algorithms, with asymptotic convergence of the co ..."
Abstract
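One way to read "polynomial filtering" here (a sketch with illustrative coefficients, not the optimized filter design of the paper): replace the single mixing step x ← Wx with a fixed-degree polynomial p(W), chosen so that p(1) = 1 (the network average is preserved) while the other eigenvalues of W are shrunk faster.

```python
import numpy as np

def polynomial_consensus(x0, W, coeffs, iters=50):
    """Sketch of polynomial-filtered consensus: instead of one mixing step
    x <- W x per round, apply p(W) = sum_k coeffs[k] * W^k.  The coefficients
    below are illustrative and satisfy p(1) = -0.3 + 0.6 + 0.7 = 1, so the
    average of the node values is preserved."""
    x = np.asarray(x0, dtype=float)
    # In a distributed implementation each extra power of W is simply one
    # more round of neighbour exchanges per filtered step.
    pW = sum(c * np.linalg.matrix_power(W, k) for k, c in enumerate(coeffs))
    for _ in range(iters):
        x = pW @ x
    return x

if __name__ == "__main__":
    W = np.array([[2/3, 1/3, 0.0],
                  [1/3, 1/3, 1/3],
                  [0.0, 1/3, 2/3]])                # symmetric, doubly stochastic
    x0 = [0.0, 3.0, 9.0]                           # node measurements; average is 4
    print(polynomial_consensus(x0, W, coeffs=[-0.3, 0.6, 0.7]))
```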
Globally optimal consensus set maximization through rotation search
- ACCV 2012, Part II, LNCS, 2013
"... A popular approach to detect outliers in a data set is to find the largest consensus set, that is to say maximizing the number of inliers and estimating the underlying model. RANSAC is the most widely used method for this aim but is non-deterministic and does not guarantee to return the optimal so ..."
Abstract - Cited by 5 (1 self)
solution. In this paper, we consider a rotation model and we present a new approach that performs consensus set maximization in a mathematically guaranteed globally optimal way. We solve the problem by a branch-and-bound framework associated with a rotation space search. Our mathematical formulation can
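The overall structure of branch-and-bound over rotation space can be sketched as follows; the subdivision of the angle-axis cube and the bound below follow the common rotation-search recipe and are illustrative, not the cited paper's exact formulation:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def residual_angles(rotvec, U, V):
    """Angle between R(rotvec) @ u_i and v_i for each unit-vector pair."""
    RU = U @ Rotation.from_rotvec(rotvec).as_matrix().T
    return np.arccos(np.clip(np.sum(RU * V, axis=1), -1.0, 1.0))

def bnb_rotation_consensus(U, V, eps=0.05, min_half_width=0.01):
    """Sketch of consensus-set maximisation by branch and bound over the
    angle-axis cube.  A sub-cube of half-width s centred at c contains only
    rotations within s*sqrt(3) radians of R(c), which gives the upper bound
    on the number of inliers any rotation inside the cube can achieve."""
    best_count, best_rotvec = -1, np.zeros(3)
    stack = [(np.zeros(3), np.pi)]                 # (centre, half-width): whole ball
    while stack:
        c, s = stack.pop()
        a = residual_angles(c, U, V)
        lower = int(np.sum(a <= eps))               # inliers achieved at the centre
        upper = int(np.sum(a <= eps + s * np.sqrt(3)))
        if lower > best_count:
            best_count, best_rotvec = lower, c
        if upper > best_count and s > min_half_width:
            for d in np.ndindex(2, 2, 2):           # split into 8 child cubes
                child = c + (np.array(d) - 0.5) * s
                stack.append((child, s / 2))
    return Rotation.from_rotvec(best_rotvec), best_count
```

In practice a best-first queue and tighter, problem-specific bounds are what make such a search tractable; the depth-first version above is only meant to show the structure.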
A Hypergraph-Partitioned Vertex Programming Approach for Large-scale Consensus Optimization
"... In modern data science problems, techniques for extracting value from big data require performing large-scale optimization over heterogenous, irregularly structured data. Much of this data is best represented as multi-relational graphs, making vertex-programming abstractions such as those of Pregel ..."
Abstract - Cited by 1 (1 self)
of Pregel and GraphLab ideal fits for modern large-scale data analysis. In this paper, we describe a vertex-programming implementation of a popular consensus optimization technique known as the alternating direction method of multipliers (ADMM) [1]. ADMM consensus optimization allows the elegant solution
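For readers unfamiliar with it, the consensus form of ADMM that the snippet refers to alternates local minimizations, a global averaging step, and dual updates. A minimal single-machine sketch (quadratic local objectives chosen so the x-update has a closed form; this is the textbook global-variable consensus template, not the paper's vertex-programming implementation):

```python
import numpy as np

def admm_consensus(b, rho=1.0, iters=100):
    """Global-variable consensus ADMM sketch.  Local objectives
    f_i(x) = (x - b[i])^2 give a closed-form x-update; z is the shared
    consensus variable and u holds the scaled dual variables."""
    b = np.asarray(b, dtype=float)
    x = np.zeros_like(b)
    u = np.zeros_like(b)
    z = 0.0
    for _ in range(iters):
        x = (2.0 * b + rho * (z - u)) / (2.0 + rho)   # local x_i-updates (parallel)
        z = np.mean(x + u)                            # gather: global averaging
        u = u + x - z                                 # scaled dual updates
    return z

if __name__ == "__main__":
    print(admm_consensus([1.0, 2.0, 6.0]))            # converges to the mean, 3.0
```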
Framework for TCP Throughput Testing
, 2011
"... This framework describes a practical methodology for measuring endto-end TCP Throughput in a managed IP network. The goal is to provide a better indication in regard to user experience. In this framework, TCP and IP parameters are specified to optimize TCP Throughput. Status of This Memo This docume ..."
Abstract - Cited by 1 (0 self)
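As a rough illustration of the kind of arithmetic such a methodology rests on (standard bandwidth-delay-product reasoning; the numbers and names below are illustrative, not the RFC's formal metric definitions):

```python
def bdp_bytes(bottleneck_bps, rtt_s):
    """Bandwidth-delay product: bytes that must be in flight to fill the path."""
    return bottleneck_bps * rtt_s / 8.0

def max_throughput_bps(window_bytes, rtt_s):
    """Upper bound on TCP throughput for a given window size and RTT."""
    return window_bytes * 8.0 / rtt_s

if __name__ == "__main__":
    link_bps, rtt_s = 100e6, 0.040        # a 100 Mbit/s path with 40 ms RTT
    print(f"BDP: {bdp_bytes(link_bps, rtt_s) / 1024:.0f} KiB")
    print(f"Ceiling with a 64 KiB window: "
          f"{max_throughput_bps(64 * 1024, rtt_s) / 1e6:.1f} Mbit/s")
```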