Results 1–10 of 64
Constructive Incremental Learning from Only Local Information
, 1998
Abstract
Cited by 175 (37 self)
... This article illustrates the potential learning capabilities of purely local learning and offers an interesting and powerful approach to learning with receptive fields.
Learning Machines
, 1965
Abstract
Cited by 155 (0 self)
This book is about machines that learn to discover hidden relationships in data. A constant stream of data bombards our senses and millions of sensory channels carry information into our brains. Brains are also learning machines that condition,
Hoeffding Races: Accelerating Model Selection Search for Classification and Function Approximation
 In Advances in neural information processing systems 6
, 1994
Abstract
Cited by 107 (9 self)
Selecting a good model of a set of input points by cross validation is a computationally intensive process, especially if the number of possible models or the number of training points is high. Techniques such as gradient descent are helpful in searching through the space of models, but problems such as local minima and, more importantly, the lack of a distance metric between various models reduce the applicability of these search methods. Hoeffding Races is a technique for finding a good model for the data by quickly discarding bad models and concentrating the computational effort on differentiating between the better ones. This paper focuses on the special case of leave-one-out cross validation applied to memory-based learning algorithms, but we also argue that it is applicable to any class of model selection problems.
1 Introduction
Model selection addresses "high level" decisions about how best to tune learning algorithm architectures for particular tasks. Such decisions include which...
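The racing scheme this abstract describes can be sketched in a few lines. This is a minimal illustration of the idea (sequential elimination using a Hoeffding confidence interval), not the paper's implementation: the toy "models" here are just loss functions with a known range, standing in for the per-point leave-one-out errors of real learners, and all names are hypothetical.

```python
import math
import random

def hoeffding_race(models, points, B=1.0, delta=0.05):
    """Race models: after each per-point loss, drop any model whose lower
    confidence bound on mean loss exceeds the best model's upper bound.
    B is the (assumed known) range of a single loss; delta a failure prob."""
    alive = set(range(len(models)))
    sums = [0.0] * len(models)
    for n, (x, y) in enumerate(points, start=1):
        for i in list(alive):
            sums[i] += models[i](x, y)          # per-point loss in [0, B]
        # Hoeffding half-width for a mean of n bounded observations
        eps = B * math.sqrt(math.log(2.0 / delta) / (2.0 * n))
        best_upper = min(sums[i] / n + eps for i in alive)
        alive = {i for i in alive if sums[i] / n - eps <= best_upper}
    return alive

# Toy example: three "models" whose losses have clearly separated means.
random.seed(0)
models = [lambda x, y, m=m: min(1.0, max(0.0, random.gauss(m, 0.05)))
          for m in (0.1, 0.5, 0.9)]
data = [(0, 0)] * 2000                          # placeholder points
survivors = hoeffding_race(models, data)
print(sorted(survivors))  # only the low-loss model (index 0) survives
```

Because the intervals shrink like 1/sqrt(n), clearly bad models are discarded after a handful of points, and almost all evaluation effort goes to the close contenders, which is the source of the speedup over exhaustive cross validation.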
Improving Regression Estimation: Averaging Methods for Variance Reduction with Extensions to General Convex Measure Optimization
, 1993
Prediction risk and architecture selection for neural networks
, 1994
Abstract
Cited by 77 (2 self)
Abstract. We describe two important sets of tools for neural network modeling: prediction risk estimation and network architecture selection. Prediction risk is defined as the expected performance of an estimator in predicting new observations. Estimated prediction risk can be used both for estimating the quality of model predictions and for model selection. Prediction risk estimation and model selection are especially important for problems with limited data. Techniques for estimating prediction risk include data resampling algorithms such as nonlinear cross-validation (NCV) and algebraic formulae such as the predicted squared error (PSE) and generalized prediction error (GPE). We show that exhaustive search over the space of network architectures is computationally infeasible even for networks of modest size. This motivates the use of heuristic strategies that dramatically reduce the search complexity. These strategies employ directed search algorithms, such as selecting the number of nodes via sequential network construction (SNC) and pruning inputs and weights via sensitivity-based pruning (SBP) and optimal brain damage (OBD), respectively.
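As a concrete illustration of an algebraic prediction-risk estimate like the PSE mentioned above, the sketch below scores polynomial models of increasing size with the commonly quoted penalty form, training MSE plus 2·σ²·p/n. This is hypothetical illustration code, not the authors' method: the data are synthetic and the noise variance is assumed known rather than estimated.

```python
import numpy as np

def predicted_squared_error(y, y_hat, p, sigma2):
    """One common algebraic form of PSE: training MSE + 2*sigma2*p/n,
    where p is the number of parameters and sigma2 the noise variance."""
    n = len(y)
    return float(np.mean((y - y_hat) ** 2) + 2.0 * sigma2 * p / n)

# Hypothetical model-selection problem: noisy samples of a smooth function,
# candidate models are polynomials of increasing degree.
rng = np.random.default_rng(0)
n = 60
x = np.linspace(-1.0, 1.0, n)
y = np.sin(2.5 * x) + rng.normal(0.0, 0.2, n)

scores = {}
for degree in range(9):
    coeffs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coeffs, x)
    scores[degree] = predicted_squared_error(y, y_hat, degree + 1,
                                             sigma2=0.2 ** 2)

best = min(scores, key=scores.get)
print(best, scores[best])
```

The penalty grows linearly in p, so once extra parameters stop reducing training error faster than 2·σ²/n per parameter, the estimated risk turns back up; this is what lets an algebraic formula substitute for a resampling loop when data are limited.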
Bias and Variance of Validation Methods for Function Approximation Neural Networks Under Conditions of Sparse Data
 IEEE Transactions on Systems, Man, and Cybernetics, Part C
, 1998
Abstract
Cited by 15 (8 self)
Neural networks must be constructed and validated with strong empirical dependence, which is difficult under conditions of sparse data. This paper examines the most common methods of neural network validation, along with several general validation methods from the statistical resampling literature, as applied to function approximation networks with small sample sizes. It is shown that an increase in computation, necessary for the statistical resampling methods, produces networks that perform better than those constructed in the traditional manner. The statistical resampling methods also result in lower variance of validation; however, some of the methods are biased in estimating network error.
1. INTRODUCTION
To be beneficial, system models must be validated to assure the users that the model emulates the actual system in the desired manner. This is especially true of empirical models, such as neural network and statistical models, which rely primarily on observed data rather th...
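The resampling validators the abstract compares can be illustrated on a toy sparse-data problem. A hedged sketch, not the paper's procedure: leave-one-out cross-validation and a simple out-of-bag bootstrap estimate of prediction MSE, applied to a least-squares line fit rather than a network; the model, data, and function names are all hypothetical.

```python
import numpy as np

def loo_mse(x, y, fit, predict):
    """Leave-one-out cross-validation estimate of prediction MSE."""
    n = len(x)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        model = fit(x[mask], y[mask])
        errs.append((y[i] - predict(model, x[i:i + 1])[0]) ** 2)
    return float(np.mean(errs))

def bootstrap_mse(x, y, fit, predict, reps=200, rng=None):
    """Bootstrap estimate: train on a resample, test on the left-out points."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(x)
    errs = []
    for _ in range(reps):
        idx = rng.integers(0, n, n)               # resample with replacement
        out = np.setdiff1d(np.arange(n), idx)     # out-of-bag test points
        if out.size == 0:
            continue
        model = fit(x[idx], y[idx])
        errs.append(float(np.mean((y[out] - predict(model, x[out])) ** 2)))
    return float(np.mean(errs))

# Sparse-data toy problem: 20 points from a noisy line.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + rng.normal(0.0, 0.5, 20)
fit = lambda xs, ys: np.polyfit(xs, ys, 1)
predict = lambda m, xs: np.polyval(m, xs)

loo = loo_mse(x, y, fit, predict)
boot = bootstrap_mse(x, y, fit, predict)
print(loo, boot)
```

The extra computation is visible even here: the holdout approach fits one model, leave-one-out fits n, and the bootstrap fits reps; the abstract's claim is that the averaging this buys lowers the variance of the validation estimate, at the cost of bias for some resampling schemes.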
A Survey on Spherical Spline Approximation
 Surveys Math. Indust
, 1997
Abstract
Cited by 15 (1 self)
Spline functions that approximate data given on the sphere are developed in a weighted Sobolev space setting. The flexibility of the weights makes possible the choice of the approximating function in a way which emphasizes attributes desirable for the particular application area. Examples show that certain choices of the weight sequences yield known methods. A convergence theorem containing explicit constants yields a usable error bound. Our survey ends with the discussion of spherical splines in geodetically relevant pseudodifferential equations. (submitted to "Surveys on Mathematics for Industry")
AMS classification: 41A05, 43A90, 65D07, 86A30
Keywords: spherical splines, scattered data interpolation, smoothing, geoid determination
Contents
1 Introduction
2 Preliminaries
3 Sobolev Spaces and Pseudodifferential Operators
4 Sobolev Lemma and Reproducing Kernel Sobolev Spaces
5 Examples of Radial Basis Functions
5.1 Green's Kernels Corresponding to Iterated Beltrami Oper...
Adaptive spectral methods for simulation output analysis
 IBM Journal of Research and Development
, 1981
Abstract
Cited by 15 (0 self)
This paper addresses two central problems in simulation methodology: the generation of confidence intervals for the steady state means of the output sequences and the sequential use of these confidence intervals to control the run length. The variance of the sample mean of a covariance stationary process is given approximately by p(0)/N, where p(f) is the spectral density at frequency f and N is the sample size. In an earlier paper we developed a method of confidence interval generation based on the estimation of p(0) through the least squares fit of a quadratic to the logarithm of the periodogram. This method was applied in a run length control procedure to a sequence of batched means. As the run length increased, the batch means were rebatched into larger batch sizes so as to limit storage requirements. In this rebatching the shape of the spectral density changes, gradually becoming flat as N increases. Quadratics were chosen as a compromise between small sample bias and large sample stability. In this paper we consider smoothing techniques which adapt to the changing spectral shape in an attempt to improve both the small and large sample behavior of the method. The techniques considered are polynomial smoothing with the degree selected sequentially using standard regression statistics, polynomial smoothing with the degree selected by cross validation, and smoothing splines with the amount of smoothing determined by cross validation. These techniques were empirically evaluated both for fixed sample sizes and when incorporated into the sequential run length control procedure. For fixed sample sizes they did not improve the small sample behavior and only marginally improved the large sample behavior when compared with the quadratic method. Their performance in the sequential procedure was unsatisfactory.
Hence, the straightforward quadratic technique recommended in the earlier paper is still recommended as an effective, practical technique for simulation confidence interval generation and run length control.
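The core of the quadratic technique can be sketched as follows. This is a simplified, hypothetical reconstruction, not the paper's procedure: it fits a quadratic to the log periodogram of the raw sequence and extrapolates to frequency zero with only an Euler-gamma bias correction, whereas the method above works on batched means, averages adjacent ordinates, and embeds the estimate in a run length control loop.

```python
import numpy as np

# E[log I_j] = log p(f_j) - gamma for (asymptotically) exponential
# periodogram ordinates, so adding Euler's gamma corrects the intercept.
EULER_GAMMA = 0.5772156649015329

def spectral_var_of_mean(x, n_ords=128):
    """Estimate Var(sample mean) ~= p(0)/N by fitting a quadratic to the
    log periodogram at the n_ords lowest nonzero frequencies."""
    n = len(x)
    fft = np.fft.fft(x - np.mean(x))
    j = np.arange(1, n_ords + 1)
    periodogram = np.abs(fft[j]) ** 2 / n       # I_j at f_j = j/n
    u = j / n_ords                               # rescaled frequency in (0, 1]
    coeffs = np.polyfit(u, np.log(periodogram), 2)
    log_p0 = np.polyval(coeffs, 0.0) + EULER_GAMMA  # extrapolate to f = 0
    return float(np.exp(log_p0) / n)             # p_hat(0) / N

# Test process: AR(1) x_t = 0.6 x_{t-1} + e_t, for which p(0) = 1/(1-0.6)^2.
rng = np.random.default_rng(0)
n = 4096
e = rng.normal(0.0, 1.0, n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + e[t]

var_mean = spectral_var_of_mean(x)
print(var_mean, 6.25 / n)  # estimate vs theoretical p(0)/N
```

Because positively correlated output makes p(0) much larger than the marginal variance (6.25 versus 1.5625 for this AR(1)), a confidence interval built from p(0)/N is correspondingly wider than the naive iid interval, which is the point of estimating the spectrum at zero.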
Object-based 3D reconstruction of arterial trees from magnetic resonance angiograms
 IEEE Transactions on Medical Imaging
, 1991
Abstract
Cited by 15 (3 self)
This paper describes an object-based approach to the problem of reconstructing three-dimensional descriptions of arterial trees from a few angiographic projections. The method incorporates a priori knowledge of the structure of branching arteries into a natural optimality criterion that encompasses the entire arterial tree. This global approach enables reconstruction from a few noisy projection images. We present an efficient optimization algorithm for object estimation, and demonstrate its performance on simulated, phantom, and in vivo magnetic resonance angiograms.