Results 1–10 of 49
Regularization Theory and Neural Networks Architectures
 Neural Computation
, 1995
Cited by 398 (33 self)
We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Basis Functions approximation schemes. This paper shows that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular, we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same generalization that extends Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions, som...
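The regularization-network scheme described above, in its Radial Basis Functions form, amounts to one hidden unit per training example and a regularized linear solve for the output weights. A minimal sketch under illustrative assumptions (a Gaussian kernel with width `sigma` and regularization strength `lam`; neither value is from the paper):

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # Pairwise Gaussian kernel matrix between point sets a (n, d) and b (m, d).
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_rbf(x, y, sigma=1.0, lam=1e-3):
    # Regularization network with one Gaussian unit centred at each input:
    # f(x) = sum_i c_i G(||x - x_i||), coefficients from (G + lam*I) c = y.
    G = gaussian_kernel(x, x, sigma)
    c = np.linalg.solve(G + lam * np.eye(len(x)), y)
    return lambda xq: gaussian_kernel(xq, x, sigma) @ c

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(x[:, 0]) + 0.05 * rng.normal(size=40)
f = fit_rbf(x, y, sigma=0.7, lam=1e-3)
resid = np.abs(f(x) - y).max()  # small training residual for mild lam
```

Larger `lam` trades fidelity to the data for smoothness, which is exactly the role of the smoothness functional in the regularization view.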
Generalized Partially Linear Single-Index Models
 Journal of the American Statistical Association
, 1998
Cited by 122 (30 self)
The typical generalized linear model for a regression of a response Y on predictors (X, Z) has conditional mean function based upon a linear combination of (X, Z). We generalize these models to have a nonparametric component, replacing the linear combination β₀ᵀX + θ₀ᵀZ by η₀(β₀ᵀX) + θ₀ᵀZ, where η₀(·) is an unknown function. We call these generalized partially linear single-index models (GPLSIM). The models include the "single-index" models, which have θ₀ = 0. Using local linear methods, estimates of the unknown parameters (β₀, θ₀) and the unknown function η₀(·) are proposed, and their asymptotic distributions are obtained. Examples illustrate the models and the proposed estimation methodology.
Linear smoothers and additive models
 The Annals of Statistics
, 1989
Cited by 99 (2 self)
We study linear smoothers and their use in building nonparametric regression models. In the first part of this paper we examine certain aspects of linear smoothers for scatterplots; examples of these are the running mean and running line, kernel, and cubic spline smoothers. The eigenvalue and singular value decompositions of the corresponding smoother matrix are used to describe a smoother qualitatively, and several other topics, such as the number of degrees of freedom of a smoother, are discussed. In the second part of the paper we describe how linear smoothers can be used to estimate the additive model, a powerful nonparametric regression model, using the "backfitting algorithm". We study the convergence of the backfitting algorithm and prove its convergence for a class of smoothers that includes cubic spline smoothers. Key words: Nonparametric, semiparametric, regression, Gauss-Seidel algorithm.
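The backfitting algorithm named above cycles through the predictors, smoothing the partial residuals of each one in turn. A minimal sketch (not code from the paper) using a Nadaraya–Watson kernel smoother as the linear smoother; the bandwidth `h` and iteration count are illustrative choices:

```python
import numpy as np

def kernel_smooth(x, r, h=0.3):
    # Nadaraya-Watson smooth of residuals r against predictor x (Gaussian kernel).
    w = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * h ** 2))
    return (w @ r) / w.sum(axis=1)

def backfit(X, y, h=0.3, n_iter=20):
    # Fit the additive model y = alpha + f1(x1) + ... + fp(xp) by backfitting.
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((p, n))
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove every component except the j-th.
            partial = y - alpha - f.sum(axis=0) + f[j]
            f[j] = kernel_smooth(X[:, j], partial, h)
            f[j] -= f[j].mean()  # centre each component for identifiability
    return alpha, f

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(200, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=200)
alpha, f = backfit(X, y)
fitted = alpha + f.sum(axis=0)
```

The inner loop is a Gauss–Seidel sweep over the component functions, which is why the keywords above mention the Gauss–Seidel algorithm.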
Priors, Stabilizers and Basis Functions: from regularization to radial, tensor and additive splines
, 1993
Cited by 86 (15 self)
We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular we had discussed how standard smoothness functionals lead to a subclass of regularization networks, the well-known Radial Basis Functions approximation schemes. In this paper we show that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same extension that leads from Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions and some forms of Projection Pursuit Regression. We propose to use the term Generalized Regularization Networks for this broad class of approximation schemes that follow from an extension of regularization. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In the final part of the paper, we show the relation between activation functions of the Gaussian and sigmoidal type by considering the simple case of the kernel G(x) = x. In summary, ...
Uncertain Reasoning and Forecasting
 International Journal of Forecasting
, 1995
Cited by 23 (3 self)
We develop a probability forecasting model through a synthesis of Bayesian belief-network models and classical time-series analysis. By casting Bayesian time-series analyses as temporal belief-network problems, we introduce dependency models that capture richer and more realistic models of dynamic dependencies. With richer models and associated computational methods, we can move beyond the rigid classical assumptions of linearity in the relationships among variables and of normality of their probability distributions.
A Bayesian Approach to Robust Binary Nonparametric Regression
, 1997
Cited by 16 (1 self)
This paper presents a Bayesian approach to binary nonparametric regression which assumes that the argument of the link is an additive function of the explanatory variables and their multiplicative interactions. The paper makes the following contributions. First, a comprehensive approach is presented in which the function estimates are smoothing splines with the smoothing parameters integrated out, and the estimates are made robust to outliers. Second, the approach can handle a wide range of link functions. Third, efficient state-space based algorithms are used to carry out the computations. Fourth, an extensive set of simulations is carried out which shows that the Bayesian estimator works well and compares favorably to two estimators which are widely used in practice.
A Local Polynomial Jump Detection Algorithm In Nonparametric Regression
 Technometrics
, 1998
Cited by 16 (7 self)
We suggest a one-dimensional jump detection algorithm based on local polynomial fitting for jumps in regression functions (zero-order jumps) or jumps in derivatives (first-order or higher-order jumps). If jumps exist in the mth-order derivative of the underlying regression function, then an (m + 1)th-order polynomial is fitted in a neighborhood of each design point. We then characterize the jump information in the coefficients of the highest-order terms of the fitted polynomials and suggest an algorithm for jump detection. This method is introduced briefly for the general setup and then presented in detail for zero-order and first-order jumps. Several simulation examples are discussed. We apply this method to the Bombay (India) sea-level pressure data. Key Words: Nonparametric jump regression model, Jump detection algorithm, Least squares line, Threshold value, Modification procedure, Image processing, Edge detection. 1 Introduction Stock market prices often jump up or down under the in...
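For the zero-order case, the idea described above can be sketched as comparing one-sided local least-squares line fits at each design point and flagging points where the two fits disagree by more than a threshold. This is a simplified illustration, not the paper's exact procedure; the window size `k` and `threshold` are illustrative choices:

```python
import numpy as np

def detect_jumps(x, y, k=10, threshold=0.5):
    # Zero-order jump detection: fit local least-squares lines on one-sided
    # windows to the left and right of each design point, evaluate both at
    # x[i], and flag a jump when the gap exceeds the threshold.
    n = len(x)
    stats = np.zeros(n)
    for i in range(k, n - k):
        left = np.polyfit(x[i - k:i], y[i - k:i], 1)
        right = np.polyfit(x[i:i + k], y[i:i + k], 1)
        stats[i] = np.polyval(right, x[i]) - np.polyval(left, x[i])
    return np.flatnonzero(np.abs(stats) > threshold), stats

rng = np.random.default_rng(2)
x = np.linspace(0, 1, 200)
y = np.sin(2 * x) + (x > 0.5) * 1.0 + 0.05 * rng.normal(size=200)  # jump at 0.5
jumps, stats = detect_jumps(x, y, k=10, threshold=0.5)
```

In a smooth region the two one-sided fits agree up to noise, so the statistic stays small; at a discontinuity it is roughly the jump size, which is what makes thresholding work.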
A Nonparametric Multiplicative Hazard Model for Event History Analysis  Revised
, 1995
Cited by 15 (2 self)
In this paper we propose and develop a nonparametric multiplicative hazard model that takes these aspects into account. Embedded in the counting process framework, estimation is based on penalized likelihoods and splines. We illustrate our approach by an application to sleep electroencephalography data with multiple recurrent states of human sleep.
New approaches to regression by Generalized Additive Models and continuous optimization for modern applications in finance, science and technology
 the special issue of Optimization at the occasion of the 5th Ballarat Workshop on Global and Non-Smooth Optimization: Theory, Methods and Applications
Cited by 11 (6 self)
Generalized additive models belong to modern techniques from statistical learning, and are applicable in many areas of prediction, e.g., in financial mathematics, computational biology, medicine, chemistry and environmental protection. These models have the form G(μ(X)) = ψ(X) = β₀ + ∑ⱼ fⱼ(Xⱼ), with one smooth function fⱼ per predictor, where ψ is a function of the predictors. These models are fitted through the local scoring algorithm, using a scatterplot smoother as building block, as proposed by Hastie and Tibshirani (1987). In this paper, we first give a short introduction and review. Then, we present a mathematical modeling by splines based on a new clustering approach for the input data x, their density, and the variation of the output data y. We contribute to regression with generalized additive models by bounding (penalizing) the second-order terms (curvature) of the splines, leading to a more robust approximation. Previously, we proposed a refining modification and investigation of the backfitting algorithm applied to additive models. Then, because of drawbacks of the modified backfitting algorithm, we solve this problem using continuous optimization techniques, which will become an important complementary technology and alternative to the concept of the modified backfitting algorithm [24]. In particular, we model and treat the constrained residual sum of squares within the elegant framework of conic quadratic programming.