Results 1 - 10 of 51,332

A tutorial on hidden Markov models and selected applications in speech recognition

by Lawrence R. Rabiner - PROCEEDINGS OF THE IEEE , 1989
"... Although initially introduced and studied in the late 1960s and early 1970s, statistical methods of Markov source or hidden Markov modeling have become increasingly popular in the last several years. There are two strong reasons why this has occurred. First the models are very rich in mathematical s ..."
Abstract - Cited by 5892 (1 self)
structure and hence can form the theoretical basis for use in a wide range of applications. Second, the models, when applied properly, work very well in practice for several important applications. In this paper we attempt to carefully and methodically review the theoretical aspects of this type
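
As an illustrative aside (not drawn from the paper), the forward algorithm is one of the basic HMM procedures the tutorial reviews; a minimal sketch, assuming a discrete-observation model with made-up toy probabilities:

```python
# Illustrative sketch: the forward algorithm for evaluating P(observations | model),
# one of the basic HMM problems the tutorial reviews. The two-state model and
# its probabilities are made up for the example.
import numpy as np

def forward(pi, A, B, obs):
    """pi: initial state probs (N,), A: transition matrix (N,N),
    B: emission matrix (N,M), obs: sequence of observation indices."""
    alpha = pi * B[:, obs[0]]             # initialization
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]     # induction step
    return alpha.sum()                    # termination: P(O | lambda)

# Toy two-state model with two observation symbols.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(forward(pi, A, B, [0, 1, 0]))
```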

An introduction to variable and feature selection

by Isabelle Guyon - Journal of Machine Learning Research , 2003
"... Variable and feature selection have become the focus of much research in areas of application for which datasets with tens or hundreds of thousands of variables are available. ..."
Abstract - Cited by 1352 (16 self)
Variable and feature selection have become the focus of much research in areas of application for which datasets with tens or hundreds of thousands of variables are available.

Feature selection: Evaluation, application, and small sample performance

by Anil Jain, Douglas Zongker - IEEE Transactions on Pattern Analysis and Machine Intelligence , 1997
"... Abstract—A large number of algorithms have been proposed for feature subset selection. Our experimental results show that the sequential forward floating selection (SFFS) algorithm, proposed by Pudil et al., dominates the other algorithms tested. We study the problem of choosing an optimal feature s ..."
Abstract - Cited by 474 (13 self)
Abstract—A large number of algorithms have been proposed for feature subset selection. Our experimental results show that the sequential forward floating selection (SFFS) algorithm, proposed by Pudil et al., dominates the other algorithms tested. We study the problem of choosing an optimal feature
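
A minimal sketch of the SFFS search the abstract refers to, assuming cross-validated nearest-neighbor accuracy as the selection criterion (the criterion and classifier are illustrative choices, not the paper's experimental setup):

```python
# Hedged sketch of sequential forward floating selection (SFFS): forward adds
# followed by conditional "floating" removals that are kept only if they beat
# the best subset previously recorded at the smaller size.
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def sffs(X, y, k):
    def J(subset):                              # selection criterion (assumed)
        clf = KNeighborsClassifier(n_neighbors=3)
        return cross_val_score(clf, X[:, list(subset)], y, cv=5).mean()

    selected, best = [], {}                     # best[m] = (score, subset of size m)
    while len(selected) < k:
        # Step 1: add the single feature that most improves J.
        remaining = [f for f in range(X.shape[1]) if f not in selected]
        f_add = max(remaining, key=lambda f: J(selected + [f]))
        selected.append(f_add)
        best[len(selected)] = (J(selected), list(selected))
        # Step 2: floating backward removals while they improve on the
        # best subset seen so far at the smaller size.
        while len(selected) > 2:
            f_rm = max(selected, key=lambda f: J([g for g in selected if g != f]))
            reduced = [g for g in selected if g != f_rm]
            if J(reduced) > best.get(len(reduced), (-1.0, None))[0]:
                selected = reduced
                best[len(selected)] = (J(selected), list(selected))
            else:
                break
    return best[k][1]
```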

Irrelevant Features and the Subset Selection Problem

by George H. John, Ron Kohavi, Karl Pfleger - MACHINE LEARNING: PROCEEDINGS OF THE ELEVENTH INTERNATIONAL , 1994
"... We address the problem of finding a subset of features that allows a supervised induction algorithm to induce small high-accuracy concepts. We examine notions of relevance and irrelevance, and show that the definitions used in the machine learning literature do not adequately partition the features ..."
Abstract - Cited by 757 (26 self)
not only on the features and the target concept, but also on the induction algorithm. We describe a method for feature subset selection using cross-validation that is applicable to any induction algorithm, and discuss experiments conducted with ID3 and C4.5 on artificial and real datasets.
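
A minimal sketch of the wrapper idea described here (subset selection by cross-validating the induction algorithm itself), with a decision tree standing in for ID3/C4.5; the substitution and the greedy forward search are illustrative assumptions, not the paper's exact procedure:

```python
# Hedged sketch of wrapper-style feature subset selection: score candidate
# subsets by cross-validating the induction algorithm, so the search adapts
# to whatever learner is plugged in.
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def wrapper_forward_select(X, y, inducer=None, cv=5):
    inducer = inducer or DecisionTreeClassifier(random_state=0)

    def score(cols):
        return cross_val_score(inducer, X[:, cols], y, cv=cv).mean()

    selected, best_score = [], 0.0
    improved = True
    while improved:
        improved = False
        for f in range(X.shape[1]):
            if f in selected:
                continue
            s = score(selected + [f])
            if s > best_score:           # remember the single best improving add
                best_score, best_f, improved = s, f, True
        if improved:
            selected.append(best_f)
    return selected, best_score
```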

Context-Aware Computing Applications

by Bill Schilit, Norman Adams, Roy Want , 1995
"... This paper describes systems that examine and react to an individual's changing context. Such systems can promote and mediate people's mleractlOns with devices, computers, and other people, and they can help navigate unfamiliar places. We bel1eve that a lunded amount of information coveTIn ..."
Abstract - Cited by 984 (6 self)
selection, automatic contextual reconfiguration, contextual information and commands, and context-triggered actions. Instances of these application types have been prototyped on the PARCTAB, a wireless, palm-sized computer.
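
An illustrative sketch of the context-triggered-action idea mentioned above (rules that fire when the current context matches a predicate); the rule format and context fields are assumptions, not the PARCTAB implementation:

```python
# Hedged sketch: context-triggered actions as simple predicate/action rules
# evaluated whenever the sensed context changes. Field names are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    matches: Callable[[dict], bool]   # predicate over the context dictionary
    action: Callable[[dict], None]    # side effect to run when it matches

rules = [
    Rule(lambda ctx: ctx.get("location") == "conference_room",
         lambda ctx: print("Muting notifications")),
    Rule(lambda ctx: ctx.get("nearby_device") == "printer",
         lambda ctx: print("Offering a print command")),
]

def on_context_change(ctx: dict) -> None:
    for rule in rules:
        if rule.matches(ctx):
            rule.action(ctx)

on_context_change({"location": "conference_room", "nearby_device": "printer"})
```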

A comparative analysis of selection schemes used in genetic algorithms

by David E. Goldberg, Kalyanmoy Deb - Foundations of Genetic Algorithms , 1991
"... This paper considers a number of selection schemes commonly used in modern genetic algorithms. Specifically, proportionate reproduction, ranking selection, tournament selection, and Genitor (or «steady state") selection are compared on the basis of solutions to deterministic difference or d ..."
Abstract - Cited by 531 (31 self)
This paper considers a number of selection schemes commonly used in modern genetic algorithms. Specifically, proportionate reproduction, ranking selection, tournament selection, and Genitor (or "steady state") selection are compared on the basis of solutions to deterministic difference
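
A minimal sketch contrasting two of the schemes compared in the paper, proportionate (roulette-wheel) selection and binary tournament selection; the toy population and fitness values are made up:

```python
# Illustrative sketch: two selection schemes over a toy population.
import random

def proportionate_select(pop, fitness):
    """Roulette-wheel selection: probability proportional to (non-negative) fitness."""
    total = sum(fitness)
    r, acc = random.uniform(0, total), 0.0
    for ind, f in zip(pop, fitness):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def tournament_select(pop, fitness, s=2):
    """Pick s random contestants and return the fittest."""
    contestants = random.sample(range(len(pop)), s)
    return pop[max(contestants, key=lambda i: fitness[i])]

pop = ["a", "b", "c", "d"]
fitness = [1.0, 2.0, 4.0, 8.0]
print(proportionate_select(pop, fitness), tournament_select(pop, fitness))
```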

Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties

by Jianqing Fan , Runze Li , 2001
"... Variable selection is fundamental to high-dimensional statistical modeling, including nonparametric regression. Many approaches in use are stepwise selection procedures, which can be computationally expensive and ignore stochastic errors in the variable selection process. In this article, penalized ..."
Abstract - Cited by 948 (62 self)
that the newly proposed methods compare favorably with other variable selection techniques. Furthermore, the standard error formulas are tested to be accurate enough for practical applications.
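
An illustrative sketch of the SCAD penalty, the best-known member of the nonconcave penalty family associated with this line of work, shown next to the L1 penalty for comparison; the default a = 3.7 is the value commonly quoted for SCAD:

```python
# Hedged sketch: the smoothly clipped absolute deviation (SCAD) penalty,
# which behaves like the L1 penalty near zero but levels off for large
# coefficients so that large effects are not over-shrunk.
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    t = np.abs(theta)
    return np.where(
        t <= lam,
        lam * t,                                                 # L1-like near zero
        np.where(
            t <= a * lam,
            -(t**2 - 2 * a * lam * t + lam**2) / (2 * (a - 1)),  # quadratic blend
            (a + 1) * lam**2 / 2,                                # constant: no extra shrinkage
        ),
    )

theta = np.linspace(-5, 5, 11)
print(scad_penalty(theta, lam=1.0))
print(np.abs(theta) * 1.0)   # L1 (lasso) penalty on the same grid, for comparison
```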

Support Vector Machine Active Learning with Applications to Text Classification

by Simon Tong , Daphne Koller - JOURNAL OF MACHINE LEARNING RESEARCH , 2001
"... Support vector machines have met with significant success in numerous real-world learning tasks. However, like most machine learning algorithms, they are generally applied using a randomly selected training set classified in advance. In many settings, we also have the option of using pool-based acti ..."
Abstract - Cited by 735 (5 self)
Support vector machines have met with significant success in numerous real-world learning tasks. However, like most machine learning algorithms, they are generally applied using a randomly selected training set classified in advance. In many settings, we also have the option of using pool
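
A minimal sketch of pool-based active learning with an SVM, querying the unlabeled example closest to the current decision boundary (a margin-based heuristic in the spirit of the paper); the synthetic data, oracle, and round count are made up:

```python
# Hedged sketch: pool-based active learning. Each round retrains a linear SVM
# on the labeled set and requests the label of the pool point nearest the
# decision boundary.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_pool = rng.normal(size=(200, 2))
y_pool = (X_pool[:, 0] + X_pool[:, 1] > 0).astype(int)      # stand-in "oracle"

# Seed the labeled set with one example from each class.
labeled = [int(np.argmax(y_pool)), int(np.argmin(y_pool))]
for _ in range(10):                                          # 10 query rounds
    clf = SVC(kernel="linear").fit(X_pool[labeled], y_pool[labeled])
    unlabeled = [i for i in range(len(X_pool)) if i not in labeled]
    dist = np.abs(clf.decision_function(X_pool[unlabeled]))
    labeled.append(unlabeled[int(np.argmin(dist))])          # query the closest point
print("labels requested:", len(labeled))
```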

Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems

by Mário A. T. Figueiredo, Robert D. Nowak, Stephen J. Wright - IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING , 2007
"... Many problems in signal processing and statistical inference involve finding sparse solutions to under-determined, or ill-conditioned, linear systems of equations. A standard approach consists in minimizing an objective function which includes a quadratic (squared ℓ2) error term combined with a spa ..."
Abstract - Cited by 539 (17 self)
sparseness-inducing (ℓ1) regularization term. Basis pursuit, the least absolute shrinkage and selection operator (LASSO), wavelet-based deconvolution, and compressed sensing are a few well-known examples of this approach. This paper proposes gradient projection (GP) algorithms for the bound
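
A hedged sketch of the gradient-projection idea for the ℓ1-regularized least-squares problem, using the standard split x = u - v with u, v ≥ 0 so it becomes a bound-constrained quadratic program; the fixed step size and iteration count are simplifications, not the paper's GPSR variants:

```python
# Hedged sketch: projected gradient on the split formulation of
#   min 0.5*||y - A x||^2 + tau*||x||_1,  with x = u - v, u >= 0, v >= 0.
import numpy as np

def gp_sparse_reconstruct(A, y, tau, iters=500):
    L = np.linalg.norm(A, 2) ** 2               # spectral norm of A^T A
    alpha = 1.0 / (2 * L)                       # conservative fixed step
    n = A.shape[1]
    u, v = np.zeros(n), np.zeros(n)
    for _ in range(iters):
        r = y - A @ (u - v)                     # residual
        grad_u = -A.T @ r + tau
        grad_v = A.T @ r + tau
        u = np.maximum(u - alpha * grad_u, 0)   # gradient step, then projection
        v = np.maximum(v - alpha * grad_v, 0)   # onto the nonnegative orthant
    return u - v

# Tiny compressed-sensing-style example: recover a 3-sparse vector.
rng = np.random.default_rng(1)
A = rng.normal(size=(40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.5, -2.0, 1.0]
x_hat = gp_sparse_reconstruct(A, A @ x_true, tau=0.05)
print(np.round(x_hat[[5, 37, 80]], 2))
```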

Evolutionary Computing

by A. E. Eiben , M. Schoenauer , 2005
"... Evolutionary computing (EC) is an exciting development in Computer Science. It amounts to building, applying and studying algorithms based on the Darwinian principles of natural selection. In this paper we briefly introduce the main concepts behind evolutionary computing. We present the main compone ..."
Abstract - Cited by 624 (35 self)
Evolutionary computing (EC) is an exciting development in Computer Science. It amounts to building, applying and studying algorithms based on the Darwinian principles of natural selection. In this paper we briefly introduce the main concepts behind evolutionary computing. We present the main
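
A minimal sketch of the generic evolutionary loop the paper describes (variation plus selection over a population); the bit-string representation, one-max fitness, and parameter values are illustrative choices, not from the paper:

```python
# Hedged sketch: a generational evolutionary algorithm with tournament
# parent selection, one-point crossover, and bit-flip mutation on a toy
# "one-max" problem (fitness = number of 1 bits).
import random

def evolve(pop_size=30, genome_len=20, generations=50, p_mut=0.05):
    def fitness(ind):
        return sum(ind)

    pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                                        # binary tournament selection
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b

        offspring = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            cut = random.randrange(1, genome_len)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ 1 if random.random() < p_mut else bit for bit in child]
            offspring.append(child)
        pop = offspring                                    # generational replacement
    return max(pop, key=fitness)

print(evolve())
```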