Results 1–10 of 182
Locally weighted learning
 Artificial Intelligence Review, 1997
Abstract

Cited by 594 (53 self)
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
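The locally weighted linear regression that the survey focuses on can be sketched in a few lines: weight the training points by their distance to the query, then fit an ordinary linear model by weighted least squares. The Gaussian weighting function, the fixed bandwidth, and the toy sine data below are illustrative assumptions, not choices made by the paper.

```python
import numpy as np

def locally_weighted_regression(x_query, X, y, bandwidth=0.05):
    """Predict y at x_query by fitting a linear model whose points are
    weighted by a Gaussian kernel on distance to x_query."""
    # Gaussian weighting function of distance (one of many surveyed choices)
    w = np.exp(-0.5 * ((X - x_query) / bandwidth) ** 2)
    # Local linear model: design matrix with an intercept column
    A = np.column_stack([np.ones_like(X), X])
    W = np.diag(w)
    # Weighted least squares: solve (A^T W A) beta = A^T W y
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return beta[0] + beta[1] * x_query

# Noisy samples from a smooth nonlinear function (illustrative data)
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(100)
pred = locally_weighted_regression(0.25, X, y)
```

Because a fresh local model is fit at each query, no global parametric form is ever committed to, which is what makes the approach "lazy".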
Normalization of cDNA microarray data
 Methods, 2003
Abstract

Cited by 241 (8 self)
Normalization means to adjust microarray data for effects which arise from variation in the technology rather than from biological differences between the RNA samples or between the printed probes. This article describes normalization methods based on the fact that dye balance typically varies with spot intensity and with spatial position on the array. Print-tip loess normalization provides a well-tested general-purpose normalization method which has given good results on a wide range of arrays. The method may be refined by using quality weights for individual spots. The method is best combined with diagnostic plots of the data which display the spatial and intensity trends. When diagnostic plots show that biases still remain in the data after normalization, further normalization steps such as plate-order normalization or scale-normalization between the arrays may be undertaken. Composite normalization may be used when control spots are available which are known to be not differentially expressed. Variations on loess normalization include global loess normalization and 2D normalization. Detailed commands are given to implement the normalization techniques using freely available software.
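Loess normalization of this kind fits a smooth intensity-dependent trend to the log-ratios M as a function of average log-intensity A, then subtracts it. The minimal tricube-weighted loess smoother and the simulated dye bias below are illustrative assumptions, not the article's implementation (which is given as commands for freely available software).

```python
import numpy as np

def loess_fit(x, y, frac=0.4):
    """Simple loess: for each point, fit a tricube-weighted local linear
    model using the nearest frac*n neighbours, and return fitted values."""
    n = len(x)
    k = max(int(frac * n), 2)
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                 # k nearest neighbours
        dmax = d[idx].max()
        w = (1 - (d[idx] / dmax) ** 3) ** 3     # tricube weights
        A = np.column_stack([np.ones(k), x[idx]])
        W = np.diag(w)
        beta = np.linalg.lstsq(A.T @ W @ A, A.T @ W @ y[idx], rcond=None)[0]
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted

# Simulated spots with an intensity-dependent dye bias (illustrative data)
rng = np.random.default_rng(1)
A = rng.uniform(6.0, 14.0, 300)                           # average log-intensity
M = 0.4 * np.sin(A / 2) + 0.05 * rng.standard_normal(300)  # biased log-ratio
M_normalized = M - loess_fit(A, M)                         # subtract the trend
```

After subtraction, the log-ratios of non-differentially-expressed spots should scatter around zero at every intensity, which is exactly what the diagnostic MA plots mentioned in the abstract are used to check.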
New Insights Into Smile, Mispricing and Value At Risk: The Hyperbolic Model
 Journal of Business, 1998
Abstract

Cited by 139 (7 self)
We investigate a new basic model for asset pricing, the hyperbolic model, which allows an almost perfect statistical fit of stock return data. After a brief introduction into the theory, supported by an appendix, we also use secondary market data to compare the hyperbolic model to the classical Black-Scholes model. We study implicit volatilities, the smile effect and the pricing performance. Exploiting the full power of the hyperbolic model, we construct an option value process from a statistical point of view by estimating the implicit risk-neutral density function from option data. Finally we present some new value-at-risk calculations leading to new perspectives to cope with model risk. I Introduction There is little doubt that the Black-Scholes model has become the standard in the finance industry and is applied on a large scale in everyday trading operations. On the other side its deficiencies have become a standard topic in research. Given the vast literature where refinements a...
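The smile study rests on inverting the Black-Scholes formula to obtain implied volatilities from observed option prices. A hedged sketch of that inversion, using bisection on the call price (which is monotone in volatility); all parameter values here are made up for illustration and are not from the paper:

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=3.0):
    """Implied volatility by bisection: the BS price is monotone in sigma."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round-trip check: recover the volatility used to generate a price
true_sigma = 0.25
price = bs_call(100.0, 95.0, 0.5, 0.02, true_sigma)
iv = implied_vol(price, 100.0, 95.0, 0.5, 0.02)
```

Plotting such implied volatilities across strikes K for market prices produces the "smile" that the Black-Scholes model, with its single constant volatility, cannot reproduce.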
Generalized Likelihood Ratio Statistics And Wilks Phenomenon
2000
Abstract

Cited by 138 (25 self)
this paper. We introduce the generalized likelihood statistics to overcome the drawbacks of nonparametric maximum likelihood ratio statistics. A new Wilks phenomenon is unveiled. We demonstrate that a class of the generalized likelihood statistics based on some appropriate nonparametric estimators are asymptotically distribution free and follow ...
A new nonlinear normalization method for reducing variability in DNA microarray experiments
2002
Abstract

Cited by 116 (4 self)
Background: Microarray data are subject to multiple sources of variation, of which biological sources are of interest whereas most others are only confounding. Recent work has identified systematic sources of variation that are intensity-dependent and nonlinear in nature. Systematic sources of variation are not limited to the differing properties of the cyanine dyes Cy5 and Cy3 as observed in cDNA arrays, but are the general case for both oligonucleotide microarray (Affymetrix GeneChips) and cDNA microarray data. Current normalization techniques are most often linear and therefore not capable of fully correcting for these effects.
Covariance tapering for interpolation of large spatial datasets
 Journal of Computational and Graphical Statistics, 2006
Abstract

Cited by 97 (9 self)
Interpolation of a spatially correlated random process is used in many areas. The best unbiased linear predictor, often called the kriging predictor in geostatistical science, requires the solution of a large linear system based on the covariance matrix of the observations. In this article, we show that tapering the correct covariance matrix with an appropriate compactly supported covariance function reduces the computational burden significantly and still has an asymptotically optimal mean squared error. The effect of tapering is to create a sparse approximate linear system that can then be solved using sparse matrix algorithms. Extensive Monte Carlo simulations support the theoretical results. An application to a large climatological precipitation dataset is presented as a concrete practical illustration.
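The tapering idea multiplies the covariance matrix elementwise by a compactly supported correlation, zeroing out entries for distant pairs and producing the sparse approximate kriging system. A small sketch, assuming a 1-D exponential covariance and a Wendland-type taper; the specific taper, ranges, and data are illustrative, not the paper's choices:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def wendland_taper(d, theta):
    """Wendland-type compactly supported correlation: exactly zero beyond theta."""
    r = np.clip(d / theta, 0.0, 1.0)
    return (1 - r) ** 4 * (4 * r + 1)

# 1-D observation locations with an exponential covariance (illustrative)
rng = np.random.default_rng(2)
s = np.sort(rng.uniform(0.0, 10.0, 400))
d = np.abs(s[:, None] - s[None, :])
C = np.exp(-d / 2.0)                      # dense "correct" covariance matrix
C_tap = C * wendland_taper(d, 1.0)        # elementwise taper: zeros beyond 1

# Sparse approximate kriging system for a prediction location s0 = 5.0
C_sparse = sparse.csc_matrix(C_tap)
d0 = np.abs(s - 5.0)
c0 = np.exp(-d0 / 2.0) * wendland_taper(d0, 1.0)
weights = spsolve(C_sparse, c0)           # kriging weights via a sparse solver
sparsity = C_sparse.nnz / C_tap.size      # fraction of nonzero entries
```

The Schur product of two positive definite matrices is positive definite, so the tapered system stays solvable while most of its entries are exactly zero, which is what lets sparse Cholesky or iterative solvers replace the dense factorization.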
Functional-coefficient Regression Models for Nonlinear Time Series
 Journal of the American Statistical Association, 1998
Abstract

Cited by 81 (15 self)
We apply the local linear regression technique for estimation of functional-coefficient regression models for time series data. The models include threshold autoregressive models (Tong 1990) and functional-coefficient autoregressive models (Chen and Tsay 1993) as special cases, but with added advantages such as depicting finer structure of the underlying dynamics and better post-sample forecasting performance. We have also proposed a new bootstrap test for the goodness of fit of models and a bandwidth selector based on newly defined cross-validatory estimation for the expected forecasting errors. The proposed methodology is data-analytic and is of appreciable flexibility to analyze complex and multivariate nonlinear structures without suffering from the "curse of dimensionality". The asymptotic properties of the proposed estimators are investigated under the α-mixing condition. Both simulated and real data examples are used for illustration. Key Words: α-mixing; Asymptotic normali...
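The local linear estimation of the coefficient functions amounts to a weighted least-squares fit at each evaluation point u0: each coefficient gets both a level and a slope term, weighted by a kernel in u. The Epanechnikov kernel, bandwidth, and simulated coefficient functions below are illustrative assumptions, not the paper's data:

```python
import numpy as np

def fc_regression(u0, u, X, y, h=0.2):
    """Local linear estimate of the coefficient functions a_j(u) at u0 in
    the model y = sum_j a_j(u) * x_j + noise, with an Epanechnikov kernel."""
    t = (u - u0) / h
    w = np.where(np.abs(t) < 1, 0.75 * (1 - t ** 2), 0.0)   # kernel weights
    # Columns: x_j for the levels a_j(u0), x_j*(u - u0) for the local slopes
    D = np.hstack([X, X * (u - u0)[:, None]])
    W = np.diag(w)
    beta = np.linalg.lstsq(D.T @ W @ D, D.T @ W @ y, rcond=None)[0]
    return beta[: X.shape[1]]          # the estimated levels a_j(u0)

# Simulated data with known coefficient functions (illustrative)
rng = np.random.default_rng(3)
n = 500
u = rng.uniform(0.0, 1.0, n)
X = rng.standard_normal((n, 2))
# True coefficient functions: a1(u) = sin(pi*u), a2(u) = u^2
y = np.sin(np.pi * u) * X[:, 0] + u ** 2 * X[:, 1] + 0.1 * rng.standard_normal(n)
a_hat = fc_regression(0.5, u, X, y)
```

Because only the one-dimensional index variable u is smoothed over, the fit avoids the curse of dimensionality that a fully nonparametric multivariate regression would face.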
Two-Step Estimation of Functional Linear Models with Applications to Longitudinal Data
 Journal of the Royal Statistical Society, Series B, 2000
Abstract

Cited by 71 (5 self)
Functional linear models are useful in longitudinal data analysis. They include many classical and recently proposed statistical models for longitudinal data and other functional data. Recently, smoothing spline and kernel methods have been proposed for estimating their coefficient functions nonparametrically, but these methods are either intensive in computation or inefficient in performance. To overcome these drawbacks, in this paper, a simple and powerful two-step alternative is proposed. In particular, the implementation of the proposed approach via local polynomial smoothing is discussed. Methods for estimating standard deviations of estimated coefficient functions are also proposed. Some asymptotic results for the local polynomial estimators are established. Two longitudinal data sets, one of which involves time-dependent covariates, are used to demonstrate the proposed approach. Simulation studies show that our two-step approach improves the kernel method proposed in Hoover, et al...
Efficient Estimation and Inferences for Varying-Coefficient Models
 Journal of the American Statistical Association, 1999
Abstract

Cited by 61 (19 self)
This paper deals with statistical inferences based on the varying-coefficient models proposed by Hastie and Tibshirani (1993). Local polynomial regression techniques are used to estimate coefficient functions and the asymptotic normality of the resulting estimators is established. The standard error formulas for estimated coefficients are derived and are empirically tested. A goodness-of-fit test technique, based on a nonparametric maximum likelihood ratio type of test, is also proposed to detect whether certain coefficient functions in a varying-coefficient model are constant or whether any covariates are statistically significant in the model. The null distribution of the test is estimated by a conditional bootstrap method. Our estimation techniques involve solving hundreds of local likelihood equations. To reduce computational burden, a one-step Newton-Raphson estimator is proposed and implemented. We show that the resulting one-step procedure can save computational cost in an order ...
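The computational trick in the last step, replacing a fully iterated likelihood solve by a single Newton-Raphson step from a good starting value, can be illustrated on an ordinary logistic likelihood; this is a stand-in for the paper's local likelihood equations, and the data and starting value below are made up:

```python
import numpy as np

def newton_step(beta, X, y):
    """One Newton-Raphson step on the logistic-regression log-likelihood."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)                   # score vector
    H = -(X.T * (p * (1 - p))) @ X         # Hessian of the log-likelihood
    return beta - np.linalg.solve(H, grad)

# Simulated logistic data (illustrative)
rng = np.random.default_rng(4)
n = 2000
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
beta_true = np.array([0.5, -1.0])
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)

# Fully iterated MLE (many Newton steps)...
beta_full = np.zeros(2)
for _ in range(25):
    beta_full = newton_step(beta_full, X, y)

# ...versus a single Newton step from a nearby consistent starting value:
# the two estimates nearly coincide, at a fraction of the iteration cost.
beta_start = beta_full + 0.05     # stand-in for a cheap consistent initial fit
beta_one = newton_step(beta_start, X, y)
```

When such a solve must be repeated at hundreds of local grid points, cutting each one to a single step is where the order-of-magnitude savings claimed above comes from.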