Results 1–10 of 11
Semiparametric binary regression models under shape constraints with an application to Indian schooling data
 Journal of Econometrics
, 2009
On Projection-Type Estimators of Multivariate Isotonic Functions
, 2011
Abstract

Cited by 2 (2 self)
Let M be an isotonic real-valued function on a compact subset of R^d and let M̂_n be an unconstrained estimator of M. A feasible monotonizing technique is to take the largest (smallest) monotone function that lies below (above) the estimator M̂_n, or any convex combination of these two envelope estimators. When the process r_n(M̂_n − M) is asymptotically equicontinuous for some sequence r_n > 0, we show that these projected estimators are r_n-equivalent in probability to the original unrestricted estimator. Our first motivating application involves a monotone estimator of the conditional distribution function that has the distributional properties of the local linear regression estimator. Applications also include the estimation of econometric (probability-weighted moment, quantile-based) and biometric (mean remaining lifetime) functions.
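The envelope construction this abstract describes is easy to sketch in one dimension: for a nondecreasing target, the largest monotone function lying below the fitted values is their running minimum taken from the right, and the smallest monotone function lying above them is their running maximum taken from the left. A minimal sketch (function names are illustrative, not the paper's):

```python
def lower_envelope(m_hat):
    """Largest nondecreasing sequence lying everywhere below m_hat:
    the running minimum taken from the right."""
    out = list(m_hat)
    for i in range(len(out) - 2, -1, -1):
        out[i] = min(out[i], out[i + 1])
    return out

def upper_envelope(m_hat):
    """Smallest nondecreasing sequence lying everywhere above m_hat:
    the running maximum taken from the left."""
    out = list(m_hat)
    for i in range(1, len(out)):
        out[i] = max(out[i], out[i - 1])
    return out

def monotonize(m_hat, alpha=0.5):
    """Convex combination of the two envelope estimators; the result is
    nondecreasing for any alpha in [0, 1]."""
    lo, hi = lower_envelope(m_hat), upper_envelope(m_hat)
    return [alpha * a + (1 - alpha) * b for a, b in zip(lo, hi)]
```

For example, `monotonize([1, 3, 2, 4])` averages the lower envelope `[1, 2, 2, 4]` and the upper envelope `[1, 3, 3, 4]`; the paper's result is that any such combination inherits the unrestricted estimator's rate.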
Degrees of freedom and model selection in semiparametric additive monotone regression
 Journal of Multivariate Analysis
, 2013
Abstract

Cited by 1 (0 self)
The degrees of freedom of semiparametric additive monotone models are derived using results about projections onto sums of order cones. Two important related questions are also studied, namely the definition of estimators for the parameter of the error term and the formulation of specific Akaike Information Criterion statistics. Several alternatives are proposed to solve both problems, and simulation experiments are conducted to compare the behavior of the different candidates. A new selection criterion is proposed that combines the ability to identify the correct model with efficient estimation of the variance parameter. Finally, the criterion is used to select the model in a regression problem from a well-known data set.
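To illustrate how a degrees-of-freedom estimate enters model selection, a generic Gaussian AIC statistic can be computed from the residuals once an effective-df value is available. This is a textbook form, not the specific criteria proposed in the paper:

```python
import math

def gaussian_aic(residuals, df):
    """Generic Gaussian AIC: n * log(RSS / n) + 2 * df, where df is a
    supplied estimate of the model's effective degrees of freedom."""
    n = len(residuals)
    rss = sum(r * r for r in residuals)
    return n * math.log(rss / n) + 2.0 * df
```

Among candidate monotone models, the one minimizing this statistic would be selected; the paper's contribution is how to obtain a valid `df` for projections onto sums of order cones.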
LASSO ISOtone for High Dimensional Additive Isotonic Regression
, 2010
Abstract

Cited by 1 (0 self)
Additive isotonic regression attempts to determine the relationship between a multidimensional observation variable and a response, under the constraint that the estimate is the additive sum of univariate component effects that are monotonically increasing. In this article, we present a new method for such regression called LASSO Isotone (LISO). LISO adapts ideas from sparse linear modelling to additive isotonic regression, and is therefore viable in many situations with high-dimensional predictor variables, where selection of significant versus insignificant variables is required. We suggest an algorithm involving a modification of the backfitting algorithm CPAV. We give a numerical convergence result, and examine some of the method's properties through simulations. We also suggest some possible extensions that improve performance and allow calculation to be carried out when the direction of the monotonicity is unknown.
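The backfitting-plus-isotonic structure underlying this family of estimators can be sketched as follows. This is plain cyclic backfitting with pool-adjacent-violators fits, not the LISO/CPAV algorithm itself: the sparsity (soft-thresholding) step is omitted and the names are illustrative.

```python
def pava(y):
    """Least-squares nondecreasing fit via Pool Adjacent Violators."""
    blocks = []  # each block: [mean, count]
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, c2 = blocks.pop()
            m1, c1 = blocks.pop()
            c = c1 + c2
            blocks.append([(m1 * c1 + m2 * c2) / c, c])
    return [m for m, c in blocks for _ in range(c)]

def backfit_additive_isotonic(X, y, n_iter=20):
    """Cyclic backfitting: each additive component is refit by isotonic
    regression (PAVA) to its partial residuals."""
    n, d = len(y), len(X[0])
    fits = [[0.0] * n for _ in range(d)]
    for _ in range(n_iter):
        for j in range(d):
            # residuals after removing all other components
            resid = [y[i] - sum(fits[k][i] for k in range(d) if k != j)
                     for i in range(n)]
            # fit component j monotonically in the j-th covariate
            order = sorted(range(n), key=lambda i: X[i][j])
            fitted = pava([resid[i] for i in order])
            for pos, i in enumerate(order):
                fits[j][i] = fitted[pos]
    return fits
```

LISO's modification would shrink each fitted component toward zero inside this loop, so that irrelevant covariates are selected out entirely.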
Empirical likelihood inferences for the semiparametric additive isotonic regression
, 2012
Abstract
We consider (profile) empirical likelihood inferences for the regression parameter (and any of its subcomponents) in the semiparametric additive isotonic regression model, where each additive nonparametric component is assumed to be a monotone function. In theory, we show that the empirical log-likelihood ratio for the regression parameters weakly converges to a standard chi-squared distribution. In addition, our simulation studies demonstrate the empirical advantages of the proposed empirical likelihood method over the normal approximation method.
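For intuition on the chi-squared calibration, here is a minimal empirical-likelihood computation in the simplest possible setting, the mean of a univariate sample, solved by bisection on the Lagrange multiplier. This illustrates the general EL mechanics only; it is not the profile EL for the semiparametric model studied in the paper.

```python
import math

def el_log_ratio(x, mu):
    """-2 times the empirical log-likelihood ratio for the mean of x at
    the candidate value mu, via bisection on the Lagrange multiplier."""
    d = [xi - mu for xi in x]
    if min(d) >= 0 or max(d) <= 0:
        return float("inf")  # mu outside the convex hull of the data
    # the multiplier must keep every weight 1 + lam * d_i positive
    lo = -1.0 / max(d) + 1e-12
    hi = -1.0 / min(d) - 1e-12

    def score(lam):  # strictly decreasing in lam
        return sum(di / (1.0 + lam * di) for di in d)

    for _ in range(200):  # bisection on score(lam) = 0
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * sum(math.log(1.0 + lam * di) for di in d)
```

The statistic is zero at the sample mean and, under the kind of result stated in the abstract, is compared with chi-squared(1) quantiles to form confidence intervals for mu.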
Ecole Doctorale: MATISSE, presented by
, 2013
Abstract
First of all, I would like to thank my supervisors. After contributing greatly to bringing me to teach at the University, Eric Matzner-Løber placed his trust in me for this research work; I wish to express my full gratitude to him, and my thanks extend well beyond the professional sphere. Arnaud Guyader agreed without hesitation to help me at the very moment when great difficulties lay ahead. I thank him immensely for his attentiveness and his great availability; this manuscript owes him a great deal. Finally, to give credit where credit is due: the idea of the method presented here came straight from the prodigious mind of Nicolas Hengartner. By inviting me twice to Los Alamos in recent years, he gave me the privilege of working at his side. I am now sure of at least one thing: genius and enthusiasm go together! I warmly thank Cécile Durot and Sylvain Sardy for the interest they showed in my work by agreeing to serve as reviewers of this thesis. I am also very grateful to Christophe Abraham and Gérard Biau for kindly taking time from their busy schedules to sit on the jury. I would also like to thank Marie de Tayrac for providing the medical data enabling …
Iterative Isotonic Regression
Abstract
This article introduces a new nonparametric method for estimating a univariate regression function of bounded variation. The method exploits the Jordan decomposition, which states that a function of bounded variation can be decomposed as the sum of a nondecreasing function and a nonincreasing function. This suggests combining the backfitting algorithm for estimating additive functions with isotonic regression for estimating monotone functions. The resulting iterative algorithm is called Iterative Isotonic Regression (I.I.R.). The main technical result in this paper is the consistency of the proposed estimator when the number of iterations k_n grows appropriately with the sample size n. The proof requires two auxiliary results that are of interest in their own right: first, we generalize the well-known consistency property of isotonic regression to the framework of a non-monotone regression function, and second, we relate the backfitting algorithm to von Neumann's algorithm in convex analysis. Index Terms: Nonparametric statistics, isotonic regression, additive models, metric projection onto convex cones.
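The algorithm this abstract describes can be sketched directly: alternately fit a nondecreasing component and a nonincreasing component to the current residuals, each by pool-adjacent-violators, and return their sum. A toy version under our own naming (note that the paper stops after k_n iterations chosen against the sample size to avoid overfitting, which this sketch does not attempt):

```python
def pava(y):
    """Least-squares nondecreasing fit via Pool Adjacent Violators."""
    blocks = []  # each block: [mean, count]
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, c2 = blocks.pop()
            m1, c1 = blocks.pop()
            c = c1 + c2
            blocks.append([(m1 * c1 + m2 * c2) / c, c])
    return [m for m, c in blocks for _ in range(c)]

def antitonic(z):
    """Least-squares nonincreasing fit: PAVA on the reversed sequence."""
    return list(reversed(pava(list(reversed(z)))))

def iterative_isotonic_regression(y, n_iter=20):
    """Backfit the Jordan decomposition f = f_up + f_down: alternate an
    isotonic fit and an antitonic fit on the residuals."""
    n = len(y)
    up, down = [0.0] * n, [0.0] * n
    for _ in range(n_iter):
        up = pava([y[i] - down[i] for i in range(n)])
        down = antitonic([y[i] - up[i] for i in range(n)])
    return [up[i] + down[i] for i in range(n)]
```

The alternation is exactly the von Neumann-style projection scheme mentioned in the abstract: each step is a metric projection onto one of the two monotone cones.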