CiteSeerX

Results 1 - 10 of 1,456,170

Filter-Type Variable Selection Based on Information Measures for Regression Tasks

by Pedro Latorre Carmona, José Martínez Sotoca, Filiberto Pla , 2012
"... entropy ..."
Abstract - Add to MetaCart
Abstract not found

An introduction to variable and feature selection

by Isabelle Guyon - Journal of Machine Learning Research , 2003
"... Variable and feature selection have become the focus of much research in areas of application for which datasets with tens or hundreds of thousands of variables are available. ..."
Abstract - Cited by 1283 (16 self) - Add to MetaCart
Variable and feature selection have become the focus of much research in areas of application for which datasets with tens or hundreds of thousands of variables are available.

Regression quantiles

by Roger Koenker, Gilbert Bassett - Econometrica , 1978
"... ..."
Abstract - Cited by 870 (19 self) - Add to MetaCart
Abstract not found

Regression Shrinkage and Selection Via the Lasso

by Robert Tibshirani - Journal of the Royal Statistical Society, Series B , 1994
"... We propose a new method for estimation in linear models. The "lasso" minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant. Because of the nature of this constraint it tends to produce some coefficients that are exactl ..."
Abstract - Cited by 4055 (51 self) - Add to MetaCart
that are exactly zero and hence gives interpretable models. Our simulation studies suggest that the lasso enjoys some of the favourable properties of both subset selection and ridge regression. It produces interpretable models like subset selection and exhibits the stability of ridge regression. There is also
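The snippet above describes the lasso constraint (residual sum of squares minimized subject to a bound on the sum of absolute coefficients). As a hedged illustration, not taken from the paper, here is a minimal numpy sketch of the equivalent penalized form solved by coordinate descent with soft-thresholding; the function names and the unit-scaling assumption are my own:

```python
import numpy as np

def soft_threshold(z, t):
    # Scalar lasso solution: shrink toward zero and clip at zero.
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam * sum|b_j|.
    This is the penalized form; Tibshirani's paper states the
    equivalent constrained form sum|b_j| <= t."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n        # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]   # partial residual without x_j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b
```

With a large enough penalty, coefficients of irrelevant predictors come out exactly zero, which is the interpretability property the abstract emphasizes.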

Least angle regression

by Bradley Efron, Trevor Hastie, Iain Johnstone, Robert Tibshirani - Ann. Statist
"... The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to s ..."
Abstract - Cited by 1308 (43 self) - Add to MetaCart
to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived: (1) A simple modification of the LARS algorithm
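For contrast with the "less greedy" LARS procedure described above, here is a toy numpy sketch (my own construction, not from the paper) of the classic greedy forward selection that LARS relaxes:

```python
import numpy as np

def forward_selection(X, y, k):
    """Greedy forward selection: at each step add the predictor most
    correlated with the current residual and refit by least squares.
    LARS is the less greedy relaxation: instead of jumping all the way
    to the new least-squares fit, it moves along an equiangular
    direction until another predictor becomes equally correlated."""
    active = []
    r = y.copy()
    for _ in range(k):
        c = np.abs(X.T @ r)
        c[active] = -np.inf              # skip already-selected columns
        j = int(np.argmax(c))
        active.append(j)
        beta, *_ = np.linalg.lstsq(X[:, active], y, rcond=None)
        r = y - X[:, active] @ beta
    return active, beta
```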

Quantile Regression

by Roger Koenker, Kevin F. Hallock - Journal of Economic Perspectives, Volume 15, Number 4, Pages 143–156 , 2001
"... We say that a student scores at the τth quantile of a standardized exam if he performs better than the proportion τ of the reference group of students and worse than the proportion (1–τ). Thus, half of students perform better than the median student and half perform worse. Similarly, the quartiles ..."
Abstract - Cited by 937 (10 self) - Add to MetaCart
as introduced by Koenker and Bassett (1978) seeks to extend these ideas to the estimation of conditional quantile functions—models in which quantiles of the conditional distribution of the response variable are expressed as functions of observed covariates. In Figure 1, we illustrate one approach to this task
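The estimator described above rests on the Koenker-Bassett check (pinball) loss. As a small illustration in pure Python (my own sketch, not from the paper): minimizing the summed check loss over a constant recovers the τth sample quantile, which is the building block the conditional-quantile models generalize.

```python
def pinball_loss(u, tau):
    # Koenker-Bassett check function rho_tau(u): a tilted absolute value,
    # weighting negative residuals by (1 - tau) and positive ones by tau.
    return u * (tau - (1.0 if u < 0 else 0.0))

def best_constant(y, tau):
    # The minimizer of sum_i rho_tau(y_i - q) over q is the tau-th sample
    # quantile; it suffices to search among the observed data points.
    return min(y, key=lambda q: sum(pinball_loss(yi - q, tau) for yi in y))
```

For τ = 0.5 this returns the median; replacing the constant q with a linear function of covariates gives the regression-quantile estimator.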

Applied Nonparametric Regression

by Wolfgang Härdle , 1994
"... ..."
Abstract - Cited by 810 (10 self) - Add to MetaCart
Abstract not found

Regression Models for Categorical Dependent Variables Using Stata

by J. Scott Long, Jeremy Freese , 2001
"... . ..."
Abstract - Cited by 767 (4 self) - Add to MetaCart
Abstract not found

Projection Pursuit Regression

by Jerome H. Friedman, Werner Stuetzle - Journal of the American Statistical Association , 1981
"... A new method for nonparametric multiple regression is presented. The procedure models the regression surface as a sum of general smooth functions of linear combinations of the predictor variables in an iterative manner. It is more general than standard stepwise and stagewise regression procedures, ..."
Abstract - Cited by 555 (6 self) - Add to MetaCart
A new method for nonparametric multiple regression is presented. The procedure models the regression surface as a sum of general smooth functions of linear combinations of the predictor variables in an iterative manner. It is more general than standard stepwise and stagewise regression procedures
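To make the "smooth functions of linear combinations" idea concrete, here is a toy numpy sketch (my own construction, far simpler than the paper's algorithm) of a single projection-pursuit step: search candidate 2-D directions on a grid, smooth the response against each projection with binned means, and keep the direction with the smallest residual sum of squares.

```python
import numpy as np

def fit_ridge_function(X, y, n_angles=36, n_bins=10):
    """One toy projection-pursuit step for 2-D predictors. The full
    algorithm iterates this on residuals, accumulating a sum of such
    ridge functions, and uses a proper smoother rather than bin means."""
    best = None
    for theta in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        a = np.array([np.cos(theta), np.sin(theta)])  # candidate direction
        z = X @ a                                     # linear combination
        edges = np.linspace(z.min(), z.max(), n_bins)[1:-1]
        bins = np.digitize(z, edges)
        # Binned means as a crude smoother of y against the projection.
        means = np.array([y[bins == b].mean() if (bins == b).any() else 0.0
                          for b in range(n_bins - 1)])
        rss = ((y - means[bins]) ** 2).sum()
        if best is None or rss < best[0]:
            best = (rss, a)
    return best[1]  # estimated projection direction
```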

The unity and diversity of executive functions and their contributions to complex “Frontal Lobe” tasks: a latent variable analysis

by Akira Miyake, Naomi P. Friedman, Michael J. Emerson, Er H. Witzki, Amy Howerter, Tor D. Wager, John Duncan, Priti Shah - Cognit Psychol , 2000
"... This individual differences study examined the separability of three often postulated executive functions—mental set shifting (“Shifting”), information updating and monitoring (“Updating”), and inhibition of prepotent responses (“Inhibition”)—and their roles in complex “frontal lobe” or ..."
Abstract - Cited by 626 (9 self) - Add to MetaCart
This individual differences study examined the separability of three often postulated executive functions—mental set shifting (“Shifting”), information updating and monitoring (“Updating”), and inhibition of prepotent responses (“Inhibition”)—and their roles in complex “frontal lobe

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University