Results 11-20 of 97
Life, Death and Lawfulness on the Electronic Frontier
1997
"... To facilitate users' ability to make sense of large collections of hypertext we present two new techniques for inducing clusters of related documents on the World Wide Web. Users' ability to find relevant information might also be enhanced by finding lawful properties of document behavior and use. W ..."
Abstract

Cited by 22 (3 self)
To facilitate users' ability to make sense of large collections of hypertext, we present two new techniques for inducing clusters of related documents on the World Wide Web. Users' ability to find relevant information might also be enhanced by finding lawful properties of document behavior and use. We present models and analyses of document use and change for the World Wide Web.

Keywords: Clustering, categorization, co-citation analysis, World Wide Web, hypertext, survival analysis, usage models

INTRODUCTION. The ever-increasing universe of electronic information competes for the effectively fixed and limited attention of people. Both consumers and producers of information want to understand what kinds of information are out there, how desirable it is, and how its content and use change through time. Our work aims to discover empirical regularities of hypertext content, use, and structure, and ways of exploiting these regularities to provide new ways of helping people to find and make...
Local Likelihood and Local Partial Likelihood in Hazard Regression
Ann. Statist., 1996
"... In survival analysis, the relationship between a survival time and a covariate is conveniently modeled with the proportional hazards regression model. This model usually assumes that the covariate has a loglinear eect on the hazard function. In this paper we consider the proportional hazards regres ..."
Abstract

Cited by 17 (4 self)
In survival analysis, the relationship between a survival time and a covariate is conveniently modeled with the proportional hazards regression model. This model usually assumes that the covariate has a log-linear effect on the hazard function. In this paper we consider the proportional hazards regression model with a nonparametric risk effect. We discuss estimation of the risk function and its derivatives in two cases: when the baseline hazard function is parametrized and when it is not. In the case of a parametric baseline hazard function, inference is based on a local version of the likelihood function, while in the case of a nonparametric baseline hazard, we use a local version of the partial likelihood. This results in maximum local likelihood estimators and maximum local partial likelihood estimators, respectively. We establish the asymptotic normality of the estimators. It turns out that both methods have the same asymptotic bias and variance in a common situation, eve...
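For orientation, the model this abstract describes can be written in standard notation (the rendering below is ours, not quoted from the paper): the usual log-linear effect exp(beta x) is replaced by an unspecified smooth function of the covariate.

```latex
% Proportional hazards model with a nonparametric risk effect \psi:
% the log-linear assumption \psi(x) = \beta x is dropped, and \psi is
% estimated by local likelihood (parametric baseline) or by local
% partial likelihood (unspecified baseline).
\lambda(t \mid x) = \lambda_0(t)\,\exp\{\psi(x)\}
```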
Efficient estimation of the partly linear additive Cox model
1999
"... Abstract: The partly linear additive Cox model is an extention of the (linear) Cox model and allows flexible modeling of covariate effects semiparametrically. We study asymptotic properties of the maximum partial likelihood estimator of this model with rightcensored data using polynomial splines. W ..."
Abstract

Cited by 17 (4 self)
The partly linear additive Cox model is an extension of the (linear) Cox model and allows flexible modeling of covariate effects semiparametrically. We study asymptotic properties of the maximum partial likelihood estimator of this model with right-censored data using polynomial splines. We show that, with a range of choices of the smoothing parameter (the number of spline basis functions) required for estimation of the nonparametric components, the estimator of the finite-dimensional regression parameter is root-n consistent, asymptotically normal, and achieves the semiparametric information bound. Rates of convergence for the estimators of the nonparametric components are obtained; they are comparable to the rates in nonparametric regression. Implementation of the estimation approach is straightforward and is illustrated with a simulated example. Abbreviated title: Partly linear additive Cox model
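A standard way to write the model the abstract describes (our notation; the number of nonparametric components q is an illustrative choice):

```latex
% Partly linear additive Cox model: a linear term \beta^\top x plus
% additive smooth effects \phi_j, each approximated in estimation by a
% polynomial spline expansion whose dimension grows with the sample size.
\lambda(t \mid x, z) = \lambda_0(t)\,
  \exp\Bigl\{ \beta^\top x + \sum_{j=1}^{q} \phi_j(z_j) \Bigr\}
```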
Bayesian information criterion for censored survival models
Biometrics
"... We investigate the Bayesian Information Criterion (BIC) for variable selection in models for censored survival data. Kass and Wasserman (1995) showed that BIC provides a close approximation to the Bayes factor when a unitinformation prior on the parameter space is used. We propose a revision of the ..."
Abstract

Cited by 15 (2 self)
We investigate the Bayesian Information Criterion (BIC) for variable selection in models for censored survival data. Kass and Wasserman (1995) showed that BIC provides a close approximation to the Bayes factor when a unit-information prior on the parameter space is used. We propose a revision of the penalty term in BIC so that it is defined in terms of the number of uncensored events instead of the number of observations. For the simplest censored data model, that of exponential distributions of survival times (i.e., a constant hazard rate), this revision results in a better approximation to the exact Bayes factor based on a conjugate unit-information prior. In the Cox proportional hazards regression model, we propose defining BIC in terms of the maximized partial likelihood. Using the number of deaths rather than the number of individuals in the BIC penalty term corresponds to a more realistic prior on the parameter space, and is shown to improve predictive performance for assessing stroke risk in the Cardiovascular Health Study.
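In symbols, the proposed revision replaces the sample size n in the BIC penalty by the number of uncensored events d and, for the Cox model, the full likelihood by the maximized partial likelihood (notation ours, consistent with the abstract):

```latex
% Classical BIC with p parameters and n observations:
%   \mathrm{BIC} = -2 \log \hat{L} + p \log n
% Revised criterion for censored data: the penalty uses the number of
% uncensored events d, and for Cox models \hat{L} is replaced by the
% maximized partial likelihood.
\mathrm{BIC}^{*} = -2 \log \widehat{\mathrm{PL}} + p \log d
```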
RANDOM SURVIVAL FORESTS
2008
"... We introduce random survival forests, a random forests method for the analysis of rightcensored survival data. New survival splitting rules for growing survival trees are introduced, as is a new missing data algorithm for imputing missing data. A conservationofevents principle for survival forest ..."
Abstract

Cited by 12 (6 self)
We introduce random survival forests, a random forests method for the analysis of right-censored survival data. New survival splitting rules for growing survival trees are introduced, as is a new missing data algorithm for imputing missing data. A conservation-of-events principle for survival forests is introduced and used to define ensemble mortality, a simple interpretable measure of mortality that can be used as a predicted outcome. Several illustrative examples are given, including a case study of the prognostic implications of body mass for individuals with coronary artery disease. Computations for all examples were implemented using the freely available R software package, randomSurvivalForest.
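The package named in the abstract is the authors' R package randomSurvivalForest. As a minimal sketch of the same idea in Python, the following uses scikit-survival's RandomSurvivalForest; the synthetic data, hyperparameters, and choice of library are our assumptions, not the authors' code.

```python
import numpy as np
from sksurv.ensemble import RandomSurvivalForest
from sksurv.util import Surv

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 5))                    # five synthetic covariates
time = rng.exponential(scale=np.exp(X[:, 0]))  # survival times tied to X[:, 0]
event = rng.random(n) < 0.7                    # roughly 70% uncensored

y = Surv.from_arrays(event=event, time=time)   # structured (event, time) array
rsf = RandomSurvivalForest(n_estimators=100, min_samples_leaf=10, random_state=0)
rsf.fit(X, y)

# Aggregated risk score per subject, analogous in spirit to the paper's
# ensemble mortality: higher values indicate higher predicted risk.
print(rsf.predict(X[:5]))
```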
Penalized likelihood regression: General formulation and efficient approximation
2002
"... The authors consider a general formulation of penalized likelihood regression, which covers canonical and noncanonical links for exponential families as well as accelerated life models with censored survival data. They present an asymptotic analysis of convergence rates to justify a simple approach ..."
Abstract

Cited by 10 (6 self)
The authors consider a general formulation of penalized likelihood regression, which covers canonical and non-canonical links for exponential families as well as accelerated life models with censored survival data. They present an asymptotic analysis of convergence rates to justify a simple approach to the lower-dimensional approximation of the estimates. The lower-dimensional approximation allows for much faster numerical calculation, paving the way to the development of algorithms that scale well with large data sets.
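The general form being analyzed can be sketched as follows (our notation): a negative log-likelihood term plus a quadratic roughness penalty, with the estimate then approximated in a lower-dimensional subspace.

```latex
% Penalized likelihood functional: \ell is the (possibly censored)
% log-likelihood contribution, J a quadratic roughness penalty, and
% \lambda the smoothing parameter. The efficient approximation restricts
% the minimization to a subspace spanned by a subset of the basis
% (representer) functions, which yields the computational savings.
\hat{\eta} = \operatorname*{arg\,min}_{\eta}
  \Bigl\{ -\frac{1}{n} \sum_{i=1}^{n} \ell(\eta;\, \text{data}_i)
          + \frac{\lambda}{2} J(\eta) \Bigr\}
```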
Rate of Convergence for Hazard Regression
Scand. J. Statist., 1994
"... The logarithm of the conditional hazard function of a survival time given one or more covariates is approximated by a function having the form of a specified sum of functions of at most d of the variables. Subject to this form, the approximation is chosen to maximize the expected conditional logli ..."
Abstract

Cited by 10 (1 self)
The logarithm of the conditional hazard function of a survival time given one or more covariates is approximated by a function having the form of a specified sum of functions of at most d of the variables. Subject to this form, the approximation is chosen to maximize the expected conditional log-likelihood. Maximum likelihood and sums of tensor products of polynomial splines are used to construct an estimate of this approximation based on a random sample. The components of this estimate possess a rate of convergence that depends only on d and a suitably defined smoothness parameter. KEY WORDS: Conditional hazard function; Maximum likelihood; Tensor product splines.
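The approximation the abstract specifies can be written schematically (our notation), with each component depending on at most d of the variables:

```latex
% ANOVA-type approximation of the log conditional hazard: S ranges over
% the selected subsets of the variables (time and covariates) of size at
% most d; each component h_S is built from tensor products of polynomial
% splines, and the rate of convergence depends only on d and the
% smoothness of the components.
\log \lambda(t \mid x) \;\approx\; \sum_{S:\,|S| \le d} h_S(x_S)
```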
Regularized estimation in the accelerated failure time model with high-dimensional covariates
Biometrics, 2005
"... Summary The need for analyzing failure time data with highdimensional covariates arises in investigating the relationship between a censored survival outcome and microarray gene expression profiles. We consider two regularization approaches, the LASSO and the threshold gradient directed regularizat ..."
Abstract

Cited by 10 (4 self)
Summary: The need for analyzing failure time data with high-dimensional covariates arises in investigating the relationship between a censored survival outcome and microarray gene expression profiles. We consider two regularization approaches, the LASSO and threshold gradient directed regularization, for variable selection and estimation in the accelerated failure time model with high-dimensional covariates, based on Stute’s weighted least squares method. The Stute estimator uses Kaplan-Meier weights to account for censoring in the least squares criterion. The weighted least squares objective function makes the adaptation of this approach to high-dimensional covariate settings computationally feasible. We use V-fold cross-validation and a modified Akaike’s Information Criterion for tuning parameter selection, and a bootstrap approach for variance estimation. The proposed method is evaluated using simulations and demonstrated with a real data example.
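As a minimal sketch of the weighted least squares idea: the Kaplan-Meier (Stute) weights below follow the standard formula, and the L1 fit uses scikit-learn's Lasso; the synthetic data, penalty level, and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso

def stute_weights(time, event):
    """Kaplan-Meier jump weights (Stute) attached to the ordered times.

    w_(i) = d_(i)/(n-i+1) * prod_{j<i} ((n-j)/(n-j+1))^{d_(j)},
    where d_(j) indicates an uncensored j-th ordered observation.
    """
    n = len(time)
    order = np.argsort(time)
    d = event[order].astype(float)
    i = np.arange(1, n + 1)
    log_prod = np.concatenate(
        ([0.0], np.cumsum(d[:-1] * np.log((n - i[:-1]) / (n - i[:-1] + 1))))
    )
    w = d / (n - i + 1) * np.exp(log_prod)
    weights = np.empty(n)
    weights[order] = w           # map back to the original ordering
    return weights

# Synthetic AFT data: log T is linear in the first two covariates.
rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.normal(size=(n, p))
true_log_t = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n)
censor = rng.exponential(scale=3.0, size=n)
time = np.minimum(np.exp(true_log_t), censor)
event = np.exp(true_log_t) <= censor

# LASSO on the Stute-weighted least squares criterion; censored
# observations receive zero weight and drop out of the fit.
w = stute_weights(time, event)
fit = Lasso(alpha=0.01).fit(X, np.log(time), sample_weight=w)
print(np.round(fit.coef_, 3))
```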
FUNCTIONAL ANOVA MODELING FOR PROPORTIONAL HAZARDS REGRESSION
"... The logarithm of the relative risk function in a proportional hazards model involving one or more possibly timedependent covariates is treated as a specified sum of a constant term, main effects, and selected interaction terms. Maximum partial likelihood estimation is used, where the maximization i ..."
Abstract

Cited by 6 (0 self)
The logarithm of the relative risk function in a proportional hazards model involving one or more possibly time-dependent covariates is treated as a specified sum of a constant term, main effects, and selected interaction terms. Maximum partial likelihood estimation is used, where the maximization is taken over a suitably chosen finite-dimensional estimation space, whose dimension increases with the sample size and which is constructed from linear spaces of functions of one covariate and their tensor products. The L2 rate of convergence for the estimate and its ANOVA components is obtained. An adaptive numerical implementation is discussed, whose performance is compared to (full likelihood) hazard regression both with and without the restriction to proportional hazards.
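In the notation commonly used for such functional ANOVA decompositions (ours, not quoted from the paper), the log relative risk is modeled as:

```latex
% Functional ANOVA form of the log relative risk \eta: a constant, main
% effects, and selected two-way interactions built from tensor products;
% the hazard is \lambda(t \mid x) = \lambda_0(t) \exp\{\eta(x(t))\}, and
% \eta is fit by maximum partial likelihood over a finite-dimensional
% estimation space that grows with the sample size.
\eta(x) = c + \sum_{j} h_j(x_j) + \sum_{(j,k) \in \mathcal{I}} h_{jk}(x_j, x_k)
```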