Results 1–10 of 15
Penalized Regression with Model-Based Penalties
, 2000
Abstract

Cited by 31 (2 self)
Nonparametric regression techniques such as spline smoothing and local fitting depend implicitly on a parametric model. For instance, the cubic smoothing spline estimate of a regression function $\mu$ based on observations $(t_i, Y_i)$ is the minimizer of $\sum_i \{Y_i - \mu(t_i)\}^2 + \lambda \int (\mu'')^2$. Since $\int (\mu'')^2$ is zero when $\mu$ is a line, the cubic smoothing spline estimate favors the parametric model $\mu(t) = \beta_0 + \beta_1 t$. Here the authors consider replacing $\int (\mu'')^2$ with the more general expression $\int (L\mu)^2$, where $L$ is a linear differential operator with possibly nonconstant coefficients. The resulting estimate of $\mu$ performs well, particularly if $L\mu$ is small. They present an $O(n)$ algorithm for the computation of $\mu$, applicable to a wide class of operators $L$. They also suggest a method for the estimation of $L$. They study their estimates via simulation and apply them to several data sets.
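The penalized objective described above has a simple discrete analogue that can be sketched in a few lines. This is only an illustration of the ridge structure of the problem, not the authors' $O(n)$ algorithm; the function name and the finite-difference penalty are assumptions for the sketch:

```python
import numpy as np

def smooth_spline_discrete(y, lam):
    """Discrete analogue of the smoothing-spline objective:
    minimize sum_i (y_i - mu_i)^2 + lam * sum (second differences of mu)^2.
    Closed-form ridge solution: mu = (I + lam * D2'D2)^{-1} y."""
    n = len(y)
    # Second-difference matrix D2, shape (n-2, n), rows [1, -2, 1].
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    A = np.eye(n) + lam * D2.T @ D2
    return np.linalg.solve(A, y)

# Exactly linear data lie in the null space of the penalty, so the
# smoother leaves them unchanged -- mirroring how the cubic spline
# penalty favors the parametric model mu(t) = b0 + b1*t.
t = np.linspace(0, 1, 50)
y = 2.0 + 3.0 * t
mu = smooth_spline_discrete(y, lam=100.0)
```

Replacing `D2` with a discretization of a general operator $L$ changes which functions pass through untouched, which is the core idea of the model-based penalty.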
A numerical procedure for filtering and efficient high-order signal differentiation
 Int. J. Appl. Math. Comput. Sci.
, 2004
Abstract

Cited by 16 (0 self)
In this paper, we propose a numerical algorithm for filtering and robust signal differentiation. The numerical procedure is based on the solution of a simplified linear optimization problem. A compromise between smoothing and fidelity to the measured data is achieved by computing an optimal regularization parameter that minimizes the Generalized Cross-Validation (GCV) criterion. Simulation results are given to highlight the effectiveness of the proposed procedure.
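GCV-based selection of a regularization parameter can be sketched as follows. This uses a generic second-difference smoother as a stand-in for the paper's optimization problem, so the smoother matrix and grid of candidate parameters are assumptions of the sketch:

```python
import numpy as np

def gcv_choose_lambda(y, lambdas):
    """Pick the regularization parameter minimizing the Generalized
    Cross-Validation score GCV(lam) = (||(I-H)y||^2 / n) / (1 - tr(H)/n)^2
    for the smoother H(lam) = (I + lam * D2'D2)^{-1}."""
    n = len(y)
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    P = D2.T @ D2
    best_lam, best_score = None, np.inf
    for lam in lambdas:
        H = np.linalg.inv(np.eye(n) + lam * P)
        resid = y - H @ y
        score = (resid @ resid / n) / (1.0 - np.trace(H) / n) ** 2
        if score < best_score:
            best_lam, best_score = lam, score
    return best_lam

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 80)
y = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(80)
lam = gcv_choose_lambda(y, [1e-3, 1e-1, 1e1, 1e3])
```

The trace term penalizes effective degrees of freedom, so GCV automatically balances fidelity against smoothness without a held-out set.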
Nonparametric Fixed-Interval Smoothing with Vector Splines
, 1991
Abstract

Cited by 12 (5 self)
Spline smoothing has become a popular method for nonparametric exploration and estimation of scalar-valued functions, but its generalizations to vector-valued functions have been underutilized. This paper presents a computationally efficient algorithm for nonparametric smoothing of vector signals with general measurement covariances. This new algorithm provides an alternative to the prevalent "optimal" smoothing algorithms that hinge on (possibly inaccurate) parametric state-space models. We develop and compare automatic procedures that use the measurements to determine how much to smooth; this adaptation allows the data to "speak for itself" without imposing a Gauss-Markov model structure. We present a nonparametric approach to covariance estimation for the case of i.i.d. measurement errors. Monte Carlo simulations demonstrate the performance of the algorithm.
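A minimal sketch of vector-signal smoothing with a roughness penalty, assuming diagonal measurement weights as a simplified stand-in for the general covariances the paper handles (all names hypothetical):

```python
import numpy as np

def smooth_vector_signal(Y, weights, lam):
    """Weighted roughness-penalty smoother applied to all channels of a
    vector signal Y (shape n x c) at once: for each channel, minimize
    sum_i w_i (Y[i,k] - mu_i)^2 + lam * ||second differences of mu||^2.
    Solution: mu = (W + lam * D2'D2)^{-1} W Y."""
    n, c = Y.shape
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    W = np.diag(weights)
    A = W + lam * D2.T @ D2
    return np.linalg.solve(A, W @ Y)   # matrix RHS: all channels at once

# Two exactly linear channels pass through unchanged, since linear
# trends lie in the null space of the second-difference penalty.
t = np.linspace(0, 1, 40)
Y = np.column_stack([1.0 + 2.0 * t, 5.0 - 1.0 * t])
M = smooth_vector_signal(Y, np.ones(40), lam=50.0)
```

Unlike a state-space smoother, nothing here requires a Gauss-Markov model; only the weights and the penalty strength are chosen from the data.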
The Theory and Application of Penalized Least Squares Methods, or Reproducing Kernel Hilbert Spaces Made Easy
, 1997
Abstract

Cited by 3 (0 self)
The popular cubic smoothing spline estimate of a regression function is the minimizer of $\sum_j d_j (Y_j - \mu(t_j))^2 + \int_a^b [\mu''(t)]^2\,dt$, where $(Y_j, t_j)$ are the data and the $d_j$'s are positive weights. However, sometimes the data are related to the function of interest $\mu$ in another way, i.e., $E(Y_i) = F_i(\mu)$ for some known $F_i$'s. And sometimes one may wish to replace $\int (\mu'')^2$ with another expression. This paper discusses the solution for these generalizations, that is, the minimization of $\sum_j d_j (Y_j - L_j(\mu))^2 + \int_a^b [(L\mu)(t)]^2\,dt$. Here, $L$ is a linear differential operator of order $m \ge 1$: $(L\mu)(t) = \mu^{(m)}(t) + \sum_{j=0}^{m-1} w_j(t)\,\mu^{(j)}(t)$. This paper outlines basic theory for this general minimization problem, and provides explicit directions for calculating the minimizer. The minimizer depends on the easily calculated reproducing kernel associated with $L$.
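The effect of a general operator penalty can be illustrated with a discrete sketch: replacing the second-difference matrix with any matrix $L$ changes which signals the smoother preserves. The operator below is a hypothetical discrete first-order example, not the paper's reproducing-kernel construction:

```python
import numpy as np

def penalized_fit(y, L, lam):
    """Minimize ||y - mu||^2 + lam * ||L mu||^2 for a generic linear
    operator L supplied as a matrix (a discrete analogue of the
    differential-operator penalty). Solution: (I + lam L'L)^{-1} y."""
    n = len(y)
    A = np.eye(n) + lam * L.T @ L
    return np.linalg.solve(A, y)

# A discrete first-order operator whose null space is the geometric
# sequence mu_i = r**i: (L mu)_i = mu_{i+1} - r * mu_i.  Data in the
# null space of L pass through unchanged, just as the cubic spline
# penalty leaves straight lines alone.
n, r = 30, 0.9
L = np.zeros((n - 1, n))
for i in range(n - 1):
    L[i, i], L[i, i + 1] = -r, 1.0
y = r ** np.arange(n)
mu = penalized_fit(y, L, lam=1e4)
```

Choosing $L$ thus amounts to choosing the parametric family the estimate shrinks toward, which is the sense in which the penalty is "model-based."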
A Spline Approximation Of A Large Set Of Points
, 2000
Abstract

Cited by 1 (0 self)
This paper presents a spline approximation method for the representation of a large set of points. The representation should be smooth while preserving important shape characteristics of the points. Because of the large size of the set, standard spline interpolation cannot be used. The proposed method is based on a least-squares minimization of the distances of the points from the spline function, subject to smoothness conditions on the representation. The spline approximation produces an accurate and suitable representation of the points. The proposed approach has been verified on both synthetic and real data sets of points. Keywords: spline approximation, fitting, least squares
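Least-squares spline approximation of many points (as opposed to interpolation through every one) can be sketched with a regression spline: far fewer basis functions than data points, fitted by ordinary least squares. The truncated-power basis and knot placement below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def spline_lsq_fit(x, y, knots):
    """Least-squares cubic regression spline using a truncated-power
    basis: 1, x, x^2, x^3, and (x - k)_+^3 for each interior knot k.
    With few knots this smooths a large point set instead of
    interpolating every sample."""
    cols = [np.ones_like(x), x, x**2, x**3]
    cols += [np.clip(x - k, 0.0, None) ** 3 for k in knots]
    B = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return B @ coef

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 2000))              # "large" point set
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(x.size)
fit = spline_lsq_fit(x, y, knots=np.linspace(0.1, 0.9, 9))
err = np.sqrt(np.mean((fit - np.sin(2 * np.pi * x)) ** 2))
```

Only 13 coefficients summarize the 2000 samples, which is why interpolation-style splines are impractical here while least-squares approximation remains cheap.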
Newton's Method
Abstract
ABSTRACT Newton's method plays a central role in the development of numerical techniques for optimization. In fact, most of the current practical methods for optimization can be viewed as variations on Newton's method. It is therefore important to understand Newton's method as an algorithm in its own right and as a key introduction to the most recent ideas in this area. One of the aims of this expository paper is to present and analyze two main approaches to Newton's method for unconstrained minimization: the line search approach and the trust region approach. The other aim is to present some of the recent developments in the optimization field which are related to Newton's method. In particular, we explore several variations on Newton's method which are appropriate for large scale problems, and we also show how quasi-Newton methods can ...
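The line search approach mentioned above can be sketched in a few lines: take the Newton step, and backtrack if it does not make progress. The backtracking criterion (shrinking the step until the gradient norm decreases) is a simplification chosen for this sketch, not the paper's analysis:

```python
import numpy as np

def newton_line_search(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method for unconstrained minimization, safeguarded by a
    simple backtracking line search on the gradient norm."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        step = np.linalg.solve(hess(x), -g)      # Newton direction
        t = 1.0
        while np.linalg.norm(grad(x + t * step)) > np.linalg.norm(g) and t > 1e-8:
            t *= 0.5                             # backtrack until progress
        x = x + t * step
    return x

# On a strictly convex quadratic f(x) = 0.5 x'Ax - b'x, a single full
# Newton step lands on the minimizer, which solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
xmin = newton_line_search(lambda x: A @ x - b, lambda x: A, np.zeros(2))
```

The trust region approach differs in that it bounds the step length first and then minimizes the quadratic model within that bound, rather than fixing the direction and shrinking along it.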
of Philosophy.
, 1999
Abstract
that I have read this thesis and that in my opinion it is fully adequate,
Multiresolution Analysis
Abstract
Summary. Multiresolution analysis has received considerable attention in recent years from researchers in the fields of computer graphics, geometric modeling and visualization. It is now considered a powerful tool for efficiently representing functions at multiple levels of detail, with many inherent advantages, including compression, level-of-detail (LOD) display, progressive transmission and LOD editing. This survey chapter attempts to provide an overview of recent results on the topic of multiresolution, with special focus on the work of researchers ...
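The level-of-detail idea can be illustrated with the simplest multiresolution transform, the Haar decomposition; this is a generic textbook example, not drawn from the survey itself:

```python
import numpy as np

def haar_decompose(signal):
    """One level of the Haar multiresolution transform: split a signal
    of even length into a coarse half (pairwise averages) and a detail
    half (pairwise differences).  Recursing on the coarse half yields a
    full level-of-detail hierarchy; dropping small details gives
    compression, and sending coarse-to-fine gives progressive
    transmission."""
    s = np.asarray(signal, dtype=float).reshape(-1, 2)
    coarse = (s[:, 0] + s[:, 1]) / 2.0
    detail = (s[:, 0] - s[:, 1]) / 2.0
    return coarse, detail

def haar_reconstruct(coarse, detail):
    """Invert one level: even samples = coarse + detail,
    odd samples = coarse - detail."""
    out = np.empty(2 * len(coarse))
    out[0::2] = coarse + detail
    out[1::2] = coarse - detail
    return out

x = np.array([4.0, 2.0, 5.0, 5.0, 1.0, 3.0, 8.0, 6.0])
c, d = haar_decompose(x)
x_back = haar_reconstruct(c, d)
```

The coarse half is a valid low-resolution stand-in for the signal on its own, which is exactly the property LOD display and editing exploit.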