Results 1–10 of 18
Nonparametric Kernel Regression Subject To Monotonicity Constraints
 Annals of Statistics
, 1999
Abstract

Cited by 29 (2 self)
We suggest a biased-bootstrap method for monotonising general linear, kernel-type estimators, for example local linear estimators and Nadaraya-Watson estimators. Attributes of our approach include the fact that it produces smooth estimates, that it is applicable to a particularly wide range of estimator types, and that it can be employed after the smoothing step has been implemented. Therefore, an experimenter may use their favourite kernel estimator, and their favourite bandwidth selector, to construct the basic nonparametric smoother, and then use our technique to render it monotone in a smooth way. Since our method is based on maximising fidelity to the conventional empirical approach, subject to monotonicity, if the original kernel smoother were monotone we would not modify it. More generally, we would adjust it by adjoining weights to data values so as to make the least possible change, in the sense of a distance measure, subject to imposing the constraint of monotonicity. KEY...
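The post-hoc monotonisation idea can be sketched in a few lines. Note this is not the paper's biased-bootstrap reweighting of the data; as a simpler stand-in it projects the fitted values of a Nadaraya-Watson smoother onto the set of nondecreasing sequences with the pool-adjacent-violators algorithm:

```python
import numpy as np

def pava(y):
    """Pool Adjacent Violators: least-squares projection of y onto
    the set of nondecreasing sequences."""
    means, counts = [], []
    for v in np.asarray(y, dtype=float):
        means.append(v); counts.append(1)
        # merge the last two blocks while they violate monotonicity
        while len(means) > 1 and means[-2] > means[-1]:
            m2, c2 = means.pop(), counts.pop()
            m1, c1 = means.pop(), counts.pop()
            means.append((c1 * m1 + c2 * m2) / (c1 + c2))
            counts.append(c1 + c2)
    return np.repeat(means, counts)

def nw_smoother(x, y, grid, h):
    """Nadaraya-Watson estimate with a Gaussian kernel of bandwidth h."""
    k = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (k @ y) / k.sum(axis=1)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 100))
y = np.sin(np.pi * x / 2) + rng.normal(0.0, 0.3, 100)  # monotone truth + noise
grid = np.linspace(0.0, 1.0, 50)
fit = nw_smoother(x, y, grid, h=0.05)  # small bandwidth: may wiggle
mono = pava(fit)                       # monotone version of the fit
```

The smoothing step and the monotonisation step are separate, mirroring the paper's point that any favourite estimator and bandwidth selector can be used first.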
Comparing the Shapes of Regression Functions
, 2000
Abstract

Cited by 10 (1 self)
Introduction The shape of a function is often of key interest in regression analysis. For instance, local maxima, or 'bumps', are scientifically important features in the shape of a regression function. In evolutionary biology, an individual's chance of survival typically depends on the value of a physical trait. If the chance of survival attains a global maximum, then the trait will evolve toward the optimal value. If there are two local maxima, then the trait will evolve differently, with two commonly occurring values emerging. Thus the shape of the curve determines the type of evolution. The presence of bumps can raise scientific questions. Growth curves of U.S. children clearly show two local maxima in the rate of growth. In Swiss children, however, these growth spurts seem to be either missing or less prominent; see Ramsay, Bock & Gasser (1995). Are the Swiss and U.S. growth rates similar in shape? If not, how and why do their shapes differ? Can we group countri
Nonparametric state price density estimation using constrained least squares and the bootstrap, Journal of Econometrics, in print.
, 2005
Abstract

Cited by 9 (1 self)
The economic theory of option pricing imposes constraints on the structure of call functions and state price densities. Except in a few polar cases, it does not prescribe functional forms. This paper proposes a nonparametric estimator of option pricing models which incorporates various restrictions within a single least squares procedure, thus permitting investigation of a wide variety of model specifications and constraints. Among these we consider monotonicity and convexity of the call function and integration to one of the state price density. The procedure easily accommodates heteroskedasticity of the residuals. The bootstrap is used to produce confidence intervals for the call function and its first two derivatives. We apply the techniques to option pricing data on the DAX. Keywords: option pricing, state price density estimation, nonparametric least squares, bootstrap inference, monotonicity, convexity
Testing Monotonicity Of Regression
, 1998
Abstract

Cited by 9 (0 self)
this article, we study this problem and construct asymptotically valid tests. Our test statistics are suitable functionals of a stochastic process which may be viewed as a local version of Kendall's tau statistic and have simple natural interpretations. The process involved is a degree-two U-process, as in Nolan and Pollard (1987). The asymptotic behaviour of the test statistics is studied in three major steps: approximation of the U-process by the empirical process defined by the Hájek projection, strong approximation of the empirical process by a Gaussian process, and finally the extreme value theory for stationary Gaussian processes. The paper is organized as follows. In Section 2, we introduce two different types of test statistics. We also formally describe the model and the hypothesis and explain the notation and regularity conditions in this section. In Section 3, we investigate the asymptotic behaviour of the U-process and establish the Gaussian process approximation. Section 4 is devoted to the study of the limiting distribution of the first test statistic using the extreme value theory for stationary Gaussian processes and the results of Section 3. In Section 5, we show that this test is consistent against all alternatives and also determine the minimal rate so that alternatives further apart than this rate can be effectively tested. The second test statistic is studied in Section 6. Technical proofs are presented in Section 7 and the appendix.
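The "local version of Kendall's tau" can be illustrated with a minimal sketch (the window width and the scan over consecutive observations are illustrative choices, not the paper's exact functional):

```python
import numpy as np

def local_kendall_tau(x, y, window):
    """Kendall's tau over sliding windows of consecutive observations
    (after sorting by x); values near -1 flag local decreases."""
    order = np.argsort(x)
    ys = np.asarray(y, dtype=float)[order]
    taus = []
    for i in range(len(ys) - window + 1):
        w = ys[i:i + window]
        # concordant minus discordant pairs within the window
        s = sum(np.sign(w[j] - w[k])
                for k in range(window) for j in range(k + 1, window))
        taus.append(2.0 * s / (window * (window - 1)))
    return np.array(taus)
```

For a strictly increasing response every window gives tau = 1; a test statistic in the spirit of the paper would take an extreme-value functional (for example the minimum) of this process.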
Testing For Monotonicity Of A Regression Mean Without Selecting A Bandwidth
, 1998
Abstract

Cited by 4 (3 self)
A new approach to testing for monotonicity of a regression mean, not requiring computation of a curve estimator or a bandwidth, is suggested. It is based on the notion of 'running gradients' over short intervals, although from some viewpoints it may be regarded as an analogue, for monotonicity testing, of the dip/excess mass approach for testing modality hypotheses about densities. Like the latter methods, the new technique does not suffer difficulties caused by almost-flat parts of the target function. In fact, it is calibrated so as to work well for flat response curves, and as a result it has relatively good power properties in boundary cases where the curve exhibits shoulders. In this respect, as well as in its construction, the 'running gradients' approach differs from alternative techniques based on the notion of a critical bandwidth. KEYWORDS. Bootstrap, calibration, curve estimation, Monte Carlo, response curve, running gradient. SHORT TITLE. Testing for monotonicity.
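The running-gradients ingredient is simple to compute: a least-squares slope over each short interval of design points. A minimal sketch (interval length and the scan over consecutive points are illustrative assumptions, not the paper's calibration scheme):

```python
import numpy as np

def running_gradients(x, y, span):
    """Least-squares slope of y on x over every window of `span`
    consecutive design points (sorted by x)."""
    order = np.argsort(x)
    xs = np.asarray(x, dtype=float)[order]
    ys = np.asarray(y, dtype=float)[order]
    slopes = []
    for i in range(len(xs) - span + 1):
        xw, yw = xs[i:i + span], ys[i:i + span]
        xc = xw - xw.mean()
        slopes.append(xc @ (yw - yw.mean()) / (xc @ xc))
    return np.array(slopes)
```

A strongly negative running gradient is evidence against a nondecreasing mean; the calibration the abstract mentions would bootstrap the null distribution of such a statistic under a flat response curve.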
Testing of Monotonicity in Regression Models
 Mimeograph Series, Operations Research, Statistics
, 1990
Abstract

Cited by 1 (0 self)
In data analysis concerning the investigation of the relationship between a dependent variable Y and an independent variable X, one may wish to determine whether this relationship is monotone or not. This determination may be of interest in itself, or it may form part of a (nonparametric) regression analysis which relies on monotonicity of the true regression function. In this paper we generalize the test of positive correlation by proposing a test statistic for monotonicity based on fitting a parametric model, say a higher order polynomial, to the data with and without the monotonicity constraint. The statistic has an asymptotic chi-bar-squared distribution under the null hypothesis that the true regression function is on the boundary of the space of monotone functions. Based on the theoretical results, an algorithm is developed for testing the significance of the statistic, and it is shown to perform well in several null and nonnull settings. Extensions to fitting regression splines ...
CriSP - a Tool for Bump Hunting
, 1999
Abstract

Cited by 1 (0 self)
We propose a test of multimodality of regression functions and their derivatives. The test statistic is a critical smoothing parameter (CriSP), giving the minimum amount of smoothing necessary to force the regression function to satisfy the null hypothesis. The p-values are computed via bootstrapping. Our idea is motivated by Silverman's test concerning the number of modes in the density function. Simulation studies indicate that the test works well, even when testing for bumps in the derivative. We apply CriSP to children's growth data, to study the number of growth spurts.
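The critical-smoothing idea transposes directly to the monotonicity setting of the other entries on this page: find the smallest bandwidth at which the smoothed fit satisfies the shape hypothesis. A minimal sketch (CriSP itself targets modality and adds a bootstrap calibration; the candidate grid of bandwidths here is an illustrative assumption):

```python
import numpy as np

def nw_fit(x, y, grid, h):
    """Nadaraya-Watson fit with a Gaussian kernel of bandwidth h."""
    k = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (k @ y) / k.sum(axis=1)

def critical_bandwidth(x, y, grid, hs):
    """Smallest bandwidth in hs (sorted increasing) whose fit is
    nondecreasing on the grid; None if no candidate qualifies."""
    for h in hs:
        if np.all(np.diff(nw_fit(x, y, grid, h)) >= -1e-12):
            return h
    return None
```

A large critical value means heavy smoothing is needed to enforce the null shape, which is evidence against it; p-values then come from bootstrapping this statistic.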
Testing and Estimating Shape-Constrained Nonparametric Density and Regression in the Presence of Measurement Error
Abstract

Cited by 1 (0 self)
In many applications we can expect that, or are interested to know whether, a density function or a regression curve satisfies some specific shape constraints. For example, when the explanatory variable, X, represents the value taken by a treatment or dosage, the conditional mean of the response, Y, is often anticipated to be a monotone function of X. Indeed, if this regression mean is not monotone (in the appropriate direction) then the medical or commercial value of the treatment is likely to be significantly curtailed, at least for values of X that lie beyond the point at which monotonicity fails. In the case of a density, common shape constraints include log-concavity and unimodality. If we can correctly guess the shape of a curve, then nonparametric estimators can be improved by taking this information into account. Addressing such problems requires a method for testing the hypothesis that the curve of interest satisfies a shape constraint, and, if the conclusion of the test is positive, a technique for estimating the curve subject to the constraint. Nonparametric methodology for solving these problems already exists, but only in cases where the covariates are observed precisely. However, in many problems, data can only be observed with measurement errors,
LASSO ISOtone for High-Dimensional Additive Isotonic Regression
, 2010
Abstract
Additive isotonic regression attempts to determine the relationship between a multidimensional observation variable and a response, under the constraint that the estimate is the additive sum of univariate component effects that are monotonically increasing. In this article, we present a new method for such regression called LASSO Isotone (LISO). LISO adapts ideas from sparse linear modelling to additive isotonic regression. Thus, it is viable in many situations with high dimensional predictor variables, where selection of significant versus insignificant variables are required. We suggest an algorithm involving a modification of the backfitting algorithm CPAV. We give a numerical convergence result, and finally examine some of its properties through simulations. We also suggest some possible extensions that improve performance, and allow calculation to be carried out when the direction of the monotonicity is unknown.
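The backfitting backbone of such methods can be sketched compactly. This is plain cyclic backfitting with a pool-adjacent-violators step per coordinate; the lasso shrinkage that distinguishes LISO from ordinary CPAV is omitted here:

```python
import numpy as np

def pava(y):
    """Pool Adjacent Violators: project y onto nondecreasing sequences."""
    means, counts = [], []
    for v in np.asarray(y, dtype=float):
        means.append(v); counts.append(1)
        while len(means) > 1 and means[-2] > means[-1]:
            m2, c2 = means.pop(), counts.pop()
            m1, c1 = means.pop(), counts.pop()
            means.append((c1 * m1 + c2 * m2) / (c1 + c2))
            counts.append(c1 + c2)
    return np.repeat(means, counts)

def additive_isotonic(X, y, n_iter=50):
    """Cyclic backfitting: each component f_j is the isotonic regression
    of the current partial residual on X[:, j].  (The lasso shrinkage
    that LISO adds on top of this step is omitted in this sketch.)"""
    n, p = X.shape
    f = np.zeros((n, p))
    intercept = float(np.mean(y))
    for _ in range(n_iter):
        for j in range(p):
            r = y - intercept - f.sum(axis=1) + f[:, j]  # partial residual
            order = np.argsort(X[:, j])
            fit = pava(r[order])
            f[order, j] = fit - fit.mean()  # centre each component
    return intercept, f
```

The fitted values are `intercept + f.sum(axis=1)`; each column of `f` is nondecreasing along its own coordinate, which is the additive isotonic constraint.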