Results 1 - 10 of 31,592
with Increasingly Many Parameters
, 2013
"... Economics, University of Essex, for private circulation to interested readers. They often represent preliminary reports on work in progress and should therefore be neither quoted nor ..."
Text Categorization with Support Vector Machines: Learning with Many Relevant Features
, 1998
"... This paper explores the use of Support Vector Machines (SVMs) for learning text classifiers from examples. It analyzes the particular properties of learning with text data and identifies why SVMs are appropriate for this task. Empirical results support the theoretical findings. SVMs achieve substantial improvements over the currently best performing methods, and they behave robustly over a variety of different learning tasks. Furthermore, they are fully automatic, eliminating the need for manual parameter tuning. ..."
Cited by 2303 (9 self)
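As an illustrative sketch of the approach this entry describes: documents become high-dimensional sparse term-weight vectors and a linear SVM separates the categories. This uses scikit-learn as a modern stand-in for Joachims' original SVM-light setup; the toy corpus and labels are invented for the example.

```python
# Text categorization with a linear SVM over TF-IDF features
# (scikit-learn stand-in for the SVM-light setup in Joachims 1998).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

train_docs = [
    "stocks fell sharply on wall street today",
    "the central bank raised interest rates",
    "the home team won the championship game",
    "the striker scored twice in the final",
]
train_labels = ["finance", "finance", "sports", "sports"]

vectorizer = TfidfVectorizer()          # many relevant, sparse features
X = vectorizer.fit_transform(train_docs)
clf = LinearSVC().fit(X, train_labels)  # linear SVM copes well with this

test = vectorizer.transform(["interest rates and stocks"])
print(clf.predict(test)[0])
```

No manual feature selection or parameter tuning is needed here, which mirrors the paper's point about SVMs being fully automatic on text.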
Optimal prediction for linear regression with infinitely many parameters
, 1999
"... The problem of optimal prediction in the stochastic linear regression model with infinitely many parameters is considered. We suggest a prediction method that outperforms asymptotically the ordinary least squares predictor. Moreover, if the random errors are Gaussian, the method is asymptotically mi ..."
CHaMP: Creating Heuristics via Many Parameters
"... The online bin packing problem is a well-known bin packing variant which requires immediate decisions to be made for the placement of a sequence of arriving items of various sizes one at a time into fixed-capacity bins without any overflow. The overall goal is maximising the average bin fullness. We investigate a 'policy matrix' representation which assigns a score to each decision option independently, and the option with the highest value is chosen for one-dimensional online bin packing. A policy matrix might also be considered as a heuristic with many parameters, where each parameter ..."
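A minimal sketch of the policy-matrix idea this entry describes, assuming integer item sizes and a fixed bin capacity: each (item size, resulting fullness) option gets a score from the matrix, and the highest-scoring open bin is chosen. The scores below are arbitrary illustrative parameters, not values from the CHaMP paper.

```python
# One-dimensional online bin packing driven by a 'policy matrix':
# policy[(item_size, new_fullness)] -> score; higher is better.
CAPACITY = 10

def pack(items, policy):
    bins = []
    for size in items:
        # Score every feasible placement into an existing bin.
        options = [(policy.get((size, b + size), 0), i)
                   for i, b in enumerate(bins) if b + size <= CAPACITY]
        if options:
            _, best = max(options)   # pick the highest-scoring option
            bins[best] += size
        else:
            bins.append(size)        # no feasible bin: open a new one
    return bins

# An illustrative policy: strongly reward filling a bin exactly,
# otherwise prefer fuller bins (a best-fit-like behaviour).
policy = {(s, f): (100 if f == CAPACITY else f)
          for s in range(1, 11) for f in range(1, 11)}
print(pack([6, 4, 5, 5, 3], policy))  # -> [10, 10, 3]
```

Each matrix entry is an independent parameter, which is exactly what makes the representation a "heuristic with many parameters" amenable to automated tuning.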
HOW MANY PARAMETERS TO MODEL STATES OF MIND?
"... A series of examples of computational models is provided, where the model aim is to interpret numerical results in terms of internal states of agents' minds. Two opposite strategies of research can be distinguished in the literature. First is to reproduce the richness and complexity o ..."
of the rich set of underlying assumptions remain unchecked. Here we argue that for computational reasons, complex models with many parameters are less suitable.
Optimal Prediction for Linear Regression With Infinitely Many Parameters
- Prépublication no. 541 du Laboratoire de Probabilités & Modèles Aléatoires, Université Paris VI, http://www.proba.jussieu.fr
, 1999
"... The problem of optimal prediction in the stochastic linear regression model with infinitely many parameters is considered. We suggest a prediction method that is asymptotically minimax over ellipsoids in ℓ₂. The method is based on a regularized least squares estimator with weights of the Pinsker f ..."
Cited by 6 (1 self)
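For context on the estimator this entry names, a standard form of Pinsker-type shrinkage weights is sketched below; the paper's exact construction may differ, so this should be read only as the textbook shape of such weights.

```latex
% Pinsker-type weighted least squares over an ellipsoid
% \Theta = \{\theta : \sum_j a_j^2 \theta_j^2 \le Q\} in \ell_2:
% the j-th least-squares coefficient is shrunk by a weight
\hat\theta_j = w_j \, \hat\theta_j^{\mathrm{LS}},
\qquad
w_j = \bigl(1 - \kappa\, a_j\bigr)_+ ,
% where (x)_+ = \max(x, 0) and \kappa > 0 is fixed by a normalization
% equation involving Q and the noise level.
```

Coefficients with small $a_j$ (the "easy" directions of the ellipsoid) are kept almost untouched, while those with large $a_j$ are shrunk to zero, which is what yields the minimax behaviour over the ellipsoid.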
Power-law distributions in empirical data
- ISSN 0036-1445. doi:10.1137/070710111. URL http://dx.doi.org/10.1137/070710111
, 2009
"... Power-law distributions occur in many situations of scientific interest and have significant consequences for our understanding of natural and man-made phenomena. Unfortunately, the empirical detection and characterization of power laws is made difficult by the large fluctuations that occur in the t ..."
Cited by 607 (7 self)
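The estimator at the core of this paper is easy to state: for a continuous power law p(x) ∝ x^(−α) with x ≥ x_min, the maximum-likelihood exponent is α̂ = 1 + n / Σ ln(x_i / x_min). A short sketch, with synthetic data generated by inverse-transform sampling:

```python
# MLE for the exponent of a continuous power law (Clauset, Shalizi & Newman):
#   alpha_hat = 1 + n / sum(ln(x_i / xmin))
import math, random

def fit_alpha(xs, xmin):
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Sample from a pure power law with alpha = 2.5 via inverse transform:
#   x = xmin * (1 - u)^(-1/(alpha - 1)),  u uniform on [0, 1)
random.seed(0)
xmin, alpha = 1.0, 2.5
xs = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha - 1.0))
      for _ in range(100_000)]
print(fit_alpha(xs, xmin))  # expect a value close to 2.5
```

On real data the paper's full recipe also estimates x_min and runs a goodness-of-fit test; the MLE alone, as the abstract warns, is only reliable once the tail is correctly identified.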
Where the REALLY Hard Problems Are
- In J. Mylopoulos and R. Reiter (Eds.), Proceedings of the 12th International Joint Conference on AI (IJCAI-91), Volume 1
, 1991
"... It is well known that for many NP-complete problems, such as K-Sat, etc., typical cases are easy to solve; so that computationally hard cases must be rare (assuming P != NP). This paper shows that NP-complete problems can be summarized by at least one "order parameter", and that the hard p ..."
Cited by 683 (1 self)
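The "order parameter" idea can be reproduced in a few lines: for random 3-SAT the clause-to-variable ratio m/n controls hardness, with a satisfiability phase transition near m/n ≈ 4.27 where instances are hardest. A brute-force sketch (small n only; the constants and instance generator are illustrative):

```python
# Random 3-SAT around the satisfiability phase transition.
import itertools, random

def random_3sat(n_vars, n_clauses, rng):
    clauses = []
    for _ in range(n_clauses):
        vs = rng.sample(range(n_vars), 3)                   # 3 distinct vars
        clauses.append([(v, rng.random() < 0.5) for v in vs])  # (var, negated)
    return clauses

def satisfiable(n_vars, clauses):
    # Exhaustive search: feasible only for small n_vars.
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[v] != neg for v, neg in c) for c in clauses):
            return True
    return False

rng = random.Random(1)
n = 12
for ratio in (2.0, 4.3, 6.0):   # below, near, and above the transition
    m = int(ratio * n)
    sat = sum(satisfiable(n, random_3sat(n, m, rng)) for _ in range(20))
    print(f"m/n = {ratio}: {sat}/20 instances satisfiable")
```

Below the transition almost every instance is satisfiable (and easy); above it almost none are; the hard instances concentrate near the crossover, which is the paper's central observation.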
Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
, 2001
"... Variable selection is fundamental to high-dimensional statistical modeling, including nonparametric regression. Many approaches in use are stepwise selection procedures, which can be computationally expensive and ignore stochastic errors in the variable selection process. In this article, penalized ..."
Cited by 948 (62 self)
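The nonconcave penalty the paper proposes is SCAD (smoothly clipped absolute deviation): lasso-like near zero, then tapering off so that large coefficients are left nearly unpenalized, which is what gives the oracle property. Its closed form, with the paper's suggested a = 3.7:

```python
# SCAD penalty of Fan & Li (2001), as a function of a coefficient theta.
def scad_penalty(theta, lam, a=3.7):   # a = 3.7 is the paper's default
    t = abs(theta)
    if t <= lam:                                   # lasso-like region
        return lam * t
    if t <= a * lam:                               # quadratic taper
        return (2 * a * lam * t - t * t - lam * lam) / (2 * (a - 1))
    return (a + 1) * lam * lam / 2                 # constant: no extra shrinkage

lam = 1.0
for t in (0.5, 1.0, 2.0, 10.0):
    print(t, scad_penalty(t, lam))
```

Note the penalty is continuous at the two breakpoints t = λ and t = aλ, and flat beyond aλ, so a large true coefficient incurs a bounded penalty and is estimated essentially without bias, unlike under the lasso's linear penalty.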
Bayesian Interpolation
- Neural Computation
, 1991
"... Although Bayesian analysis has been in use since Laplace, the Bayesian method of model comparison has only recently been developed in depth. In this paper, the Bayesian approach to regularisation and model comparison is demonstrated by studying the inference problem of interpolating noisy data. The concepts and methods described are quite general and can be applied to many other problems. Regularising constants are set by examining their posterior probability distribution. Alternative regularisers (priors) and alternative basis sets are objectively compared by evaluating the evidence for them ..."
Cited by 728 (17 self)
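A sketch of the evidence machinery this entry describes, for the simplest case: a linear model y = Φw + noise with Gaussian prior w ~ N(0, I/α) and known noise precision β. The log evidence has a closed form, and maximizing it over α sets the regularising constant; the data and basis below are illustrative, not MacKay's.

```python
# Evidence-based choice of a regularisation constant (MacKay-style),
# for Bayesian linear regression with a Gaussian prior.
import numpy as np

def log_evidence(Phi, y, alpha, beta):
    N, M = Phi.shape
    A = alpha * np.eye(M) + beta * Phi.T @ Phi     # posterior precision
    m = beta * np.linalg.solve(A, Phi.T @ y)       # posterior mean
    E = beta / 2 * np.sum((y - Phi @ m) ** 2) + alpha / 2 * m @ m
    return (M / 2 * np.log(alpha) + N / 2 * np.log(beta) - E
            - np.linalg.slogdet(A)[1] / 2 - N / 2 * np.log(2 * np.pi))

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)
Phi = np.vander(x, 9, increasing=True)             # degree-8 polynomial basis
beta = 1 / 0.2**2                                  # assume known noise level
alphas = np.logspace(-4, 2, 25)
best = max(alphas, key=lambda a: log_evidence(Phi, y, a, beta))
print(f"evidence-preferred alpha: {best:.3g}")
```

The evidence automatically penalizes both underfitting (large residual term E) and overfitting (the log-determinant "Occam factor"), so no held-out validation set is needed to pick α.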