Results 1–10 of 1,847,383
Wrappers for Feature Subset Selection
AIJ Special Issue on Relevance, 1997
"... In the feature subset selection problem, a learning algorithm is faced with the problem of selecting a relevant subset of features upon which to focus its attention, while ignoring the rest. To achieve the best possible performance with a particular learning algorithm on a particular training set, a ..."
Cited by 1522 (3 self)
... a feature subset selection method should consider how the algorithm and the training set interact. We explore the relation between optimal feature subset selection and relevance. Our wrapper method searches for an optimal feature subset tailored to a particular algorithm and a domain. We study ...
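The wrapper idea described above lends itself to a short illustration: score each candidate feature subset by cross-validating the learning algorithm itself, and grow the subset greedily. The sketch below is only one possible search strategy (forward selection); it assumes scikit-learn for the learner and cross-validation, and all function and parameter names are illustrative rather than taken from the paper.

```python
# Minimal sketch of wrapper-style feature subset selection: evaluate subsets by
# cross-validating the target learning algorithm, and add features greedily.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def wrapper_forward_selection(X, y, estimator=None, cv=5):
    estimator = estimator or KNeighborsClassifier(n_neighbors=3)
    remaining = list(range(X.shape[1]))
    selected, best_score = [], -np.inf
    while remaining:
        # Score every one-feature extension of the current subset.
        scores = {
            f: cross_val_score(estimator, X[:, selected + [f]], y, cv=cv).mean()
            for f in remaining
        }
        f_best = max(scores, key=scores.get)
        if scores[f_best] <= best_score:   # no extension improves the estimate: stop
            break
        selected.append(f_best)
        best_score = scores[f_best]
        remaining.remove(f_best)
    return selected, best_score
```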
A Niched Pareto Genetic Algorithm for Multiobjective Optimization
In Proceedings of the First IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence, 1994
"... Many, if not most, optimization problems have multiple objectives. Historically, multiple objectives have been combined ad hoc to form a scalar objective function, usually through a linear combination (weighted sum) of the multiple attributes, or by turning objectives into constraints. The genetic a ..."
Cited by 395 (6 self)
... algorithm (GA), however, is readily modified to deal with multiple objectives by incorporating the concept of Pareto domination in its selection operator, and applying a niching pressure to spread its population out along the Pareto optimal tradeoff surface. We introduce the Niched Pareto GA as an algorithm ...
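As a rough illustration of the selection scheme sketched in this abstract, the code below runs a Pareto-domination tournament against a random comparison set and breaks ties in favour of the less crowded individual. The parameter names sigma_share and t_dom and the niche-count form are illustrative assumptions, not the paper's exact operator.

```python
# Sketch of domination-tournament selection with a niche-count tie-break.
# All objectives are assumed to be minimized.
import random
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return np.all(a <= b) and np.any(a < b)

def niche_count(i, objs, sigma_share):
    d = np.linalg.norm(objs - objs[i], axis=1)
    return np.sum(np.maximum(0.0, 1.0 - d / sigma_share))

def pareto_tournament(objs, sigma_share=0.5, t_dom=10):
    objs = np.asarray(objs, dtype=float)
    n = len(objs)
    c1, c2 = random.sample(range(n), 2)
    comparison = random.sample(range(n), min(t_dom, n))
    c1_dominated = any(dominates(objs[j], objs[c1]) for j in comparison)
    c2_dominated = any(dominates(objs[j], objs[c2]) for j in comparison)
    if c1_dominated and not c2_dominated:
        return c2
    if c2_dominated and not c1_dominated:
        return c1
    # Tie: prefer the candidate in the less crowded region of objective space.
    if niche_count(c1, objs, sigma_share) < niche_count(c2, objs, sigma_share):
        return c1
    return c2
```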
Irrelevant Features and the Subset Selection Problem
Machine Learning: Proceedings of the Eleventh International Conference, 1994
"... We address the problem of finding a subset of features that allows a supervised induction algorithm to induce small highaccuracy concepts. We examine notions of relevance and irrelevance, and show that the definitions used in the machine learning literature do not adequately partition the features ..."
Cited by 741 (26 self)
... into useful categories of relevance. We present definitions for irrelevance and for two degrees of relevance. These definitions improve our understanding of the behavior of previous subset selection algorithms, and help define the subset of features that should be sought. The features selected should depend ...
Multiobjective Evolutionary Algorithms: A Comparative Case Study and the Strength Pareto Approach
IEEE Transactions on Evolutionary Computation, 1999
"... Evolutionary algorithms (EA’s) are often wellsuited for optimization problems involving several, often conflicting objectives. Since 1985, various evolutionary approaches to multiobjective optimization have been developed that are capable of searching for multiple solutions concurrently in a singl ..."
Cited by 781 (22 self)
... The proof-of-principle results obtained on two artificial problems as well as a larger problem, the synthesis of a digital hardware–software multiprocessor system, suggest that SPEA can be very effective in sampling from along the entire Pareto-optimal front and distributing the generated solutions over ...
Toward Optimal Feature Selection
In 13th International Conference on Machine Learning, 1995
"... In this paper, we examine a method for feature subset selection based on Information Theory. Initially, a framework for de ning the theoretically optimal, but computationally intractable, method for feature subset selection is presented. We show that our goal should be to eliminate a feature if it g ..."
Cited by 472 (9 self)
In this paper, we examine a method for feature subset selection based on Information Theory. Initially, a framework for defining the theoretically optimal, but computationally intractable, method for feature subset selection is presented. We show that our goal should be to eliminate a feature ...
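A crude way to illustrate the elimination criterion hinted at here: drop a feature whenever its plug-in conditional mutual information with the class, given the features that are kept, is near zero. The sketch below assumes discrete (integer-coded) features and is an illustration only, not the paper's exact cross-entropy / Markov-blanket procedure; the threshold value is an assumption.

```python
# Backward elimination of features that add ~no information about the class
# given the remaining features, using plug-in entropy estimates on discrete data.
import numpy as np
from collections import Counter

def entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def cond_mutual_info(c, f, S):
    """I(C; F | S), where S is a list of tuples (joint value of the kept features)."""
    joint_sf = list(zip(S, f))
    return (entropy(list(zip(S, c))) + entropy(joint_sf)
            - entropy(list(zip(joint_sf, c))) - entropy(S))

def backward_eliminate(X, y, threshold=0.01):
    kept = list(range(X.shape[1]))
    changed = True
    while changed and len(kept) > 1:
        changed = False
        for f in list(kept):
            rest = [g for g in kept if g != f]
            S = [tuple(row) for row in X[:, rest]]
            if cond_mutual_info(list(y), X[:, f].tolist(), S) < threshold:
                kept.remove(f)        # f adds ~no information given the rest
                changed = True
    return kept
```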
Regression Shrinkage and Selection Via the Lasso
Journal of the Royal Statistical Society, Series B, 1994
"... We propose a new method for estimation in linear models. The "lasso" minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant. Because of the nature of this constraint it tends to produce some coefficients that are exactl ..."
Cited by 4055 (51 self)
... that are exactly zero and hence gives interpretable models. Our simulation studies suggest that the lasso enjoys some of the favourable properties of both subset selection and ridge regression. It produces interpretable models like subset selection and exhibits the stability of ridge regression. There is also ...
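In standard notation (not copied from the paper), the constraint the abstract describes verbally, and its equivalent penalized form, are:

```latex
% Constrained lasso as described in the abstract, and the equivalent l1-penalized form
\hat{\beta} = \arg\min_{\beta_0,\,\beta}
  \sum_{i=1}^{N}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^{2}
  \quad \text{subject to} \quad \sum_{j=1}^{p}\lvert\beta_j\rvert \le t ,
\qquad \text{equivalently} \qquad
\hat{\beta} = \arg\min_{\beta_0,\,\beta}
  \sum_{i=1}^{N}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Bigr)^{2}
  + \lambda \sum_{j=1}^{p}\lvert\beta_j\rvert .
```

The absolute-value constraint is what drives some coefficients exactly to zero, which is the source of the interpretability mentioned in the snippet.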
Multiobjective Optimization Using Nondominated Sorting in Genetic Algorithms
Evolutionary Computation, 1994
"... In trying to solve multiobjective optimization problems, many traditional methods scalarize the objective vector into a single objective. In those cases, the obtained solution is highly sensitive to the weight vector used in the scalarization process and demands the user to have knowledge about t ..."
Cited by 524 (4 self)
... the underlying problem. Moreover, in solving multiobjective problems, designers may be interested in a set of Pareto-optimal points, instead of a single point. Since genetic algorithms (GAs) work with a population of points, it seems natural to use GAs in multiobjective optimization problems to capture a ...
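To make the Pareto-set idea concrete, a minimal nondominated sort can peel off successive fronts of a population's objective vectors. The sketch below assumes all objectives are minimized and is only an illustration of the device; the paper's full procedure also involves fitness sharing within fronts.

```python
# Peel off the currently nondominated vectors as front 1, remove them, repeat.
import numpy as np

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)   # minimization

def nondominated_sort(objs):
    objs = np.asarray(objs, dtype=float)
    remaining = list(range(len(objs)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objs[j], objs[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Example: nondominated_sort([[1, 2], [2, 1], [2, 2], [3, 3]]) -> [[0, 1], [2], [3]]
```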
Genetic Algorithms for Multiobjective Optimization: Formulation, Discussion and Generalization
1993
"... The paper describes a rankbased fitness assignment method for Multiple Objective Genetic Algorithms (MOGAs). Conventional niche formation methods are extended to this class of multimodal problems and theory for setting the niche size is presented. The fitness assignment method is then modified to a ..."
Cited by 610 (15 self)
... to allow direct intervention of an external decision maker (DM). Finally, the MOGA is generalised further: the genetic algorithm is seen as the optimizing element of a multiobjective optimization loop, which also comprises the DM. It is the interaction between the two that leads to the determination of a ...
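One common rank-based assignment of the kind this abstract describes gives each individual a rank of one plus the number of population members that dominate it (rather than a front index), and lets fitness decrease with rank. The sketch below omits the niche-size theory and fitness sharing discussed in the paper; the reciprocal fitness mapping is an arbitrary illustrative choice.

```python
# Dominance-count ranking: nondominated individuals get rank 1.
import numpy as np

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)   # minimization

def pareto_ranks(objs):
    objs = np.asarray(objs, dtype=float)
    n = len(objs)
    return np.array([1 + sum(dominates(objs[j], objs[i]) for j in range(n) if j != i)
                     for i in range(n)])

def rank_fitness(objs):
    ranks = pareto_ranks(objs)
    return 1.0 / ranks   # rank-1 (nondominated) individuals receive the highest fitness
```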
An Introduction to Variable and Feature Selection
Journal of Machine Learning Research, 2003
"... Variable and feature selection have become the focus of much research in areas of application for which datasets with tens or hundreds of thousands of variables are available. ..."
Cited by 1283 (16 self)
Optimization Flow Control, I: Basic Algorithm and Convergence
IEEE/ACM Transactions on Networking, 1999
"... We propose an optimization approach to flow control where the objective is to maximize the aggregate source utility over their transmission rates. We view network links and sources as processors of a distributed computation system to solve the dual problem using gradient projection algorithm. In thi ..."
Cited by 690 (64 self)
We propose an optimization approach to flow control where the objective is to maximize the aggregate source utility over their transmission rates. We view network links and sources as processors of a distributed computation system to solve the dual problem using a gradient projection algorithm ...
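The dual, price-based decomposition described here can be sketched as follows: each link adjusts a price in proportion to its excess demand (a projected gradient step on the dual), and each source picks the rate that maximizes its utility minus the total price along its path. The toy code below assumes a logarithmic utility purely to make the source update explicit in closed form; the step size, rate cap, and names are illustrative, not the paper's exact parameters.

```python
# Toy dual-decomposition flow control: link price updates + per-source rate choice.
import numpy as np

def flow_control(routes, capacity, weights, gamma=0.05, rate_max=10.0, iters=2000):
    """routes[l, s] = 1 if source s crosses link l (0/1 routing matrix)."""
    L, S = routes.shape
    prices = np.zeros(L)
    rates = np.zeros(S)
    for _ in range(iters):
        path_price = routes.T @ prices                  # total price along each source's path
        # Maximizer of w*log(x) - q*x is x = w/q, clipped to a finite rate range.
        rates = np.clip(weights / np.maximum(path_price, 1e-12), 0.0, rate_max)
        excess = routes @ rates - capacity              # per-link demand minus capacity
        prices = np.maximum(0.0, prices + gamma * excess)   # projected gradient step on the dual
    return rates, prices

# Example: two links in series shared by one two-link flow and two single-link flows.
routes = np.array([[1, 1, 0],
                   [1, 0, 1]], dtype=float)
rates, prices = flow_control(routes,
                             capacity=np.array([1.0, 1.0]),
                             weights=np.array([1.0, 1.0, 1.0]))
```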