Results 1-10 of 13,059
Optimal Separable Partitioning in the Plane, 1995
Abstract: "... Sets of points are called separable if their convex hulls are disjoint. We suggest a technique for optimally partitioning a set N into two separable subsets, N1, N2. We assume that a monotone measure μ is defined over the subsets of N, and the objective is to minimize max{μ(N1), μ(N2)} ..."
Cited by 1 (1 self)
Optimally Separating Sequences
Genome Informatics, 2001
Abstract: "... We consider the problem of separating two distinct classes of k similar sequences of length n over an alphabet of size s that have been optimally multialigned. An objective function based on minimizing the consensus score of the separated halves is introduced and we present an O(k³n) heuristic ..."
Cited by 2 (0 self)
Evolutionary Algorithms for Multiobjective Optimization, 2002
Abstract: "... Multiple, often conflicting objectives arise naturally in most real-world optimization scenarios. As evolutionary algorithms possess several characteristics due to which they are well suited to this type of problem, evolution-based methods have been used for multiobjective optimization for more than a decade. Meanwhile evolutionary multiobjective optimization has become established as a separate subdiscipline combining the fields of evolutionary computation and classical multiple criteria decision making. In this paper, the basic principles of evolutionary multiobjective optimization ..."
Cited by 450 (13 self)
Comparison of Multiobjective Evolutionary Algorithms: Empirical Results, 2000
Abstract: "... In this paper, we provide a systematic comparison of various evolutionary approaches to multiobjective optimization using six carefully chosen test functions. Each test function involves a particular feature that is known to cause difficulty in the evolutionary optimization process, mainly in converging to the Pareto-optimal front (e.g., multimodality and deception). By investigating these different problem features separately, it is possible to predict the kind of problems to which a certain technique is or is not well suited. However, in contrast to what was suspected beforehand ..."
Cited by 628 (41 self)
Theory of the Firm: Managerial Behavior, Agency Costs and Ownership Structure, 1976
Abstract: "... This paper integrates elements from the theory of agency, the theory of property rights and the theory of finance to develop a theory of the ownership structure of the firm. We define the concept of agency costs, show its relationship to the 'separation and control' issue, investigate the nature of ..."
Cited by 3043 (12 self)
High dimensional graphs and variable selection with the Lasso
Annals of Statistics, 2006
Abstract: "... The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs. Neighborhood selection estimates the conditional independence restrictions separately for each node in the graph and is hence equivalent to variable selection for Gaussian linear models. We ..."
Cited by 736 (22 self)
Distance metric learning for large margin nearest neighbor classification
In NIPS, 2006
Abstract: "... We show how to learn a Mahalanobis distance metric for k-nearest neighbor (kNN) classification by semidefinite programming. The metric is trained with the goal that the k nearest neighbors always belong to the same class while examples from different classes are separated by a large margin. On seven ..."
Cited by 695 (14 self)
The spandrels of San Marco and the Panglossian paradigm: a critique of the adaptationist programme.
Proceedings of the Royal Society of London Series B, Biological Sciences, 1979
Abstract: "... An adaptationist programme has dominated evolutionary thought in England and the United States during the past 40 years. It is based on faith in the power of natural selection as an optimizing agent. It proceeds by breaking an organism into unitary 'traits' and proposing an adaptive story for each considered separately. Trade-offs among competing selective demands exert the only brake upon perfection; non-optimality is thereby rendered as a result of adaptation as well. We criticize this approach and attempt to reassert a competing notion (long popular in continental Europe ..."
Cited by 538 (2 self)
Large margin methods for structured and interdependent output variables
Journal of Machine Learning Research, 2005
Abstract: "... Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary ... To accomplish this, we propose to appropriately generalize the well-known notion of a separation margin and derive a corresponding maximum-margin formulation. While this leads to a quadratic program with a potentially prohibitive, i.e. exponential, number of constraints, we present a cutting plane algorithm ..."
Cited by 624 (12 self)