Results 1-10 of 18
Discrete-time survival mixture analysis for single and recurrent events using latent variables. Unpublished doctoral dissertation
, 2003
Abstract

Cited by 15 (8 self)
This article proposes a general latent variable approach to discrete-time survival analysis of nonrepeatable events such as onset of drug use. It is shown how the survival analysis can be formulated as a generalized latent class analysis of event history indicators. The latent class analysis can use covariates and can be combined with the joint modeling of other outcomes such as repeated measures for a related process. It is shown that conventional discrete-time survival analysis corresponds to a single-class latent class analysis. Multiple-class extensions are proposed, including the special cases of a class of long-term survivors and classes defined by outcomes related to survival. The estimation uses a general latent variable framework, including both categorical and continuous latent variables and incorporated in the Mplus program. Estimation is carried out using maximum likelihood via the EM algorithm. Two examples serve as illustrations. The first example concerns recidivism after incarceration in a randomized field experiment. The second example concerns school removal related to the development of aggressive behavior in the classroom.
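The single-class case mentioned in the abstract, conventional discrete-time survival analysis, can be sketched as a likelihood built from period-specific logistic hazards. The parameterization and names below are illustrative, not taken from the dissertation.

```python
import math

# Sketch (assumed parameterization): conventional single-class discrete-time
# survival. The hazard in period j is logistic in a period intercept a[j]
# plus a covariate effect b*x; a subject observed through period t
# contributes the product of (1 - h_j) over survived periods, times h_t if
# the event occurred at t.

def hazard(a_j, b, x):
    """Discrete-time hazard for one period: logit(h) = a_j + b*x."""
    return 1.0 / (1.0 + math.exp(-(a_j + b * x)))

def likelihood(a, b, x, t, event):
    """Likelihood of surviving periods 1..t-1 and (if event) failing at t."""
    lik = 1.0
    for j in range(t - 1):
        lik *= 1.0 - hazard(a[j], b, x)
    if event:
        lik *= hazard(a[t - 1], b, x)
    return lik

# Example: three periods, no covariate effect, so each period's hazard is 0.5
a = [0.0, 0.0, 0.0]
print(likelihood(a, b=0.0, x=0.0, t=2, event=True))  # 0.5 * 0.5 = 0.25
```

The multiple-class extensions in the dissertation mix several such hazard profiles, with class membership treated as a latent categorical variable estimated via EM.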
A TEST OF SEVERAL PARAMETRIC STATISTICAL MODELS FOR ESTIMATING SUCCESS RATE IN THE TREATMENT OF CARCINOMA CERVIX UTERI
, 1975
Abstract

Cited by 4 (0 self)
Summary. The parametric statistical models discussed include all those which have previously been described in the literature (Boag, 1948: lognormal; Berkson and Gage, 1952: negative exponential; Haybittle, 1959: extrapolated actuarial) and the basic data used to test the models comprised some 3000 case histories of patients treated between 1945 and 1962. The histories were followed up during the period 1969-71 and thus provided adequate information to validate long-term survival fractions predicted using short-term follow-up data. The results with the lognormal model showed that for series of staged carcinoma cervix patients treated during a 5-year period, satisfactory estimates of long-term survival fractions could be predicted after a minimum waiting period of 3 years for stages I and II, and 2 years for stage III. The model should be used with a value assumed for the lognormal parameter S in the range S = 0.35 to S = 0.40. Although alternative models often gave adequate predictions, the lognormal proved to be the most consistent model. This model may therefore now be used with more confidence for prospective studies on carcinoma cervix series and can provide good estimates of long-term ...
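The lognormal model of Boag's type can be sketched as a mixture in which a cured fraction never fails and the remainder have lognormal survival times. The function name and parameter values below are illustrative; Boag's original parameterization uses log10 time, while natural logs are used here for simplicity.

```python
import math

# Sketch of a lognormal cure-rate mixture: a cured fraction c never fails,
# the rest have lognormal failure times with log-scale mu and shape sigma
# (sigma plays the role of the abstract's parameter S, up to the log base).

def lognormal_cure_survival(t, c, mu, sigma):
    """P(alive at t) = c + (1 - c) * P(lognormal(mu, sigma) > t)."""
    z = (math.log(t) - mu) / sigma
    surv_uncured = 0.5 * math.erfc(z / math.sqrt(2.0))  # 1 - Phi(z)
    return c + (1.0 - c) * surv_uncured

# As follow-up lengthens, survival approaches the cure fraction c, which is
# why long-term survival fractions can be predicted from short-term data:
print(lognormal_cure_survival(1000.0, c=0.3, mu=1.0, sigma=0.4))
```

At the median of the uncured distribution (t = e^mu) exactly half of the uncured group remains, so survival there equals c + (1 - c)/2.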
Statistical analysis of the bioassay of continuous carcinogens
 Brit. J. Cancer
, 1972
Abstract

Cited by 3 (1 self)
Summary. In an experiment consisting of the continuous constant application of various carcinogenic regimens to a pure strain of experimental animals for a long period, the cancer incidence rates so caused may be studied and compared by the fit of an appropriate class of statistical distributions. In this paper we show that a Weibull distribution in which the age-specific cancer incidence rate rises as a power of time since first risk is more appropriate than a lognormal distribution. If the Weibull family of distributions is used, more information can be extracted from the data, and differences of toxicity between various regimens will not bias the comparison of their carcinogenic forces. CARCINOGENESIS produced by continued application of a carcinogen to mouse skin is becoming an increasingly common technique of assay of the carcinogenic forces of various substances. It has been pointed out (Pike and Roe, 1963) that a simple count of the number of cancers induced in a particular group is not a satisfactory measure of carcinogenic force, since cancers commonly occur late in life and a toxic treatment, although highly carcinogenic, may produce only a small number of cancers if it kills off a substantial fraction of the animals before the main cancer-susceptible age range is reached. Allowance for the effects of intercurrent deaths on the numbers of cancers produced is therefore necessary before treatments can be compared. The method of Pike and Roe allows the unbiased estimation of the proportion of animals which would still be alive at a particular time if all causes of death other than cancer were eliminated, and the authors suggest that a plot of this estimated proportion against time gives the best description possible of the carcinogenic effects of a treatment. Although ...
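The Weibull form described above, an incidence rate rising as a power of time since first risk, can be sketched directly. Parameter values are illustrative, not taken from the paper.

```python
import math

# Sketch: Weibull model with hazard h(t) = b*k*t**(k-1), rising as a power
# of time since first risk when k > 1, and the implied cumulative incidence
# F(t) = 1 - exp(-b*t**k).

def weibull_hazard(t, b, k):
    return b * k * t ** (k - 1)

def weibull_cum_incidence(t, b, k):
    return 1.0 - math.exp(-b * t ** k)

# With k > 1 the age-specific incidence rate increases with time, the shape
# the paper argues fits carcinogenesis data better than the lognormal:
print(weibull_hazard(2.0, b=0.01, k=3.0) > weibull_hazard(1.0, b=0.01, k=3.0))  # True
```

The exponent k is what is compared across regimens; differences in toxicity shift the time scale but, under this model, do not bias the comparison of carcinogenic force.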
A generalized F mixture model for cure rate estimation
 Statistics in Medicine
, 1998
Abstract

Cited by 1 (0 self)
Cure rate estimation is an important issue in clinical trials for diseases such as lymphoma and breast cancer, and mixture models are the main statistical methods. In the last decade, mixture models under different distributions, such as exponential, Weibull, lognormal and Gompertz, have been discussed and used. However, these models involve stronger distributional assumptions than is desirable, and inferences may not be robust to departures from these assumptions. In this paper, a mixture model is proposed using the generalized F distribution family. Although this family is seldom used because of computational difficulties, it has the advantage of being very flexible and including many commonly used distributions as special cases. The generalized F mixture model can relax the usual stronger distributional assumptions and allow the analyst to uncover structure in the data that might otherwise have been missed. This is illustrated by fitting the model to data from large-scale clinical trials with long follow-up of lymphoma patients. Computational problems with the model and model selection methods are discussed. A comparison of maximum likelihood estimates with those obtained from mixture models under other distributions is included. © 1998 John Wiley & Sons, Ltd.
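The mixture cure structure shared by all the distributions named above can be sketched generically: survival is S(t) = p + (1 - p) * S0(t), with p the cured fraction and S0 the latency distribution. Here S0 is a Weibull survival function, one of the special cases of the generalized F family; the scale and shape values are illustrative only.

```python
import math

# Sketch of the mixture cure model S(t) = p + (1 - p) * S0(t). Any latency
# survival function S0 can be plugged in; the generalized F family of the
# paper nests Weibull, lognormal and others, and a Weibull is used here
# purely as an illustrative special case.

def mixture_cure_survival(t, p, s0):
    return p + (1.0 - p) * s0(t)

weibull_s0 = lambda t: math.exp(-(t / 2.0) ** 1.5)  # illustrative scale/shape

print(mixture_cure_survival(0.0, 0.4, weibull_s0))   # 1.0 at t = 0
print(mixture_cure_survival(50.0, 0.4, weibull_s0))  # plateaus near p = 0.4
```

The long plateau is what identifies the cure rate p, which is why long follow-up, as in the lymphoma trials, matters for these models.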
Mixture Models in Econometric Duration Analysis
, 2002
Abstract

Cited by 1 (0 self)
Econometric duration analysis has become an important part of methodology in econometrics, bringing forth a wealth of applications. The probability distribution of the duration of a time span is modeled through its conditional hazard rate given the covariates. When some of the covariates are unobservable, the duration, given the observable covariates, has a mixture distribution. The paper surveys and discusses...
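A standard instance of the mixture described above is the mixed proportional hazards model with gamma frailty, where the unobserved covariate multiplies the hazard. Under a mean-1 gamma frailty with variance theta, the mixture has a closed-form survival function; the names and numbers below are illustrative.

```python
import math

# Sketch: gamma-frailty mixed proportional hazards. With an unobserved
# multiplicative frailty ~ Gamma(mean 1, variance theta), the duration given
# the observed covariate x has survival
#   S(t | x) = (1 + theta * Lambda0(t) * exp(x*beta)) ** (-1/theta),
# where Lambda0 is the integrated baseline hazard.

def gamma_frailty_survival(t, x, beta, theta, lambda0):
    return (1.0 + theta * lambda0(t) * math.exp(x * beta)) ** (-1.0 / theta)

constant_baseline = lambda t: 0.5 * t  # integrated hazard of a rate-0.5 baseline

# As theta -> 0 the mixture collapses to the ordinary proportional-hazards
# survival exp(-Lambda0(t) * exp(x*beta)):
print(gamma_frailty_survival(1.0, 0.0, 0.0, 1e-8, constant_baseline))
print(math.exp(-0.5))
```

The observable implication is a hazard that appears to decline with time even when every individual hazard is constant, a classic identification issue in this literature.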
Assessing Placebo Response Using Bayesian Hierarchical Survival Models
, 1995
Abstract

Cited by 1 (0 self)
The National Institute of Mental Health (NIMH) Collaborative Study of Long-Term Maintenance Drug Therapy in Recurrent Affective Illness was a multicenter randomized controlled clinical trial designed to determine the efficacy of a pharmacotherapy for the prevention of the recurrence of unipolar affective disorders. The outcome of interest in this study was the time until the recurrence of a depressive episode. The data show much heterogeneity between centers for the placebo group. The aim of this paper is to use Bayesian hierarchical survival models to investigate the heterogeneity of placebo effects among centers in the NIMH study. This heterogeneity is explored in terms of the marginal posterior distributions of parameters of interest and predictive distributions of future observations. The Gibbs sampling algorithm is used to approximate posterior and predictive distributions. Sensitivity of results to the assumption of a constant hazard survival distribution at the first stage of th...
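The constant-hazard first stage with center-level heterogeneity can be sketched as exponential recurrence times with a center-specific rate drawn from a gamma distribution, for which the Gibbs update is conjugate. This is a minimal sketch under those assumptions; the hierarchical step of sampling the hyperparameters, and all numbers used here, are illustrative.

```python
import random

# Sketch: each center's recurrence times are exponential with rate lam[c],
# and lam[c] ~ Gamma(alpha, beta) across centers. With fixed hyperparameters
# the Gibbs full conditional for each center's rate is conjugate:
#   lam[c] | data ~ Gamma(alpha + events_c, beta + exposure_c).

def gibbs_center_rates(events, exposure, alpha, beta, n_iter, rng):
    """Repeatedly draw each center's rate from its full conditional and
    return the final draws (one Gibbs block; hyperparameter step omitted)."""
    rates = [1.0] * len(events)
    for _ in range(n_iter):
        for c in range(len(events)):
            rates[c] = rng.gammavariate(alpha + events[c],
                                        1.0 / (beta + exposure[c]))
    return rates

rng = random.Random(0)
rates = gibbs_center_rates(events=[5, 12], exposure=[40.0, 60.0],
                           alpha=2.0, beta=1.0, n_iter=200, rng=rng)
# Posterior means are (alpha + d_c) / (beta + T_c) for each center
print(rates)
```

Spread in the posterior rate distributions across centers is exactly the placebo-group heterogeneity the paper examines.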
Parametric Model Discrimination for Heavily Censored Survival Data
, 2008
Abstract

Cited by 1 (0 self)
Simultaneous discrimination among various parametric lifetime models is an important step in the parametric analysis of survival data. We consider a plot of the skewness versus the coefficient of variation for the purpose of discriminating among parametric survival models. We extend the method of Cox & Oakes from complete to censored data by developing an algorithm based on a competing risks model and kernel function estimation. A byproduct of this algorithm is a nonparametric survival function estimate.
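The Cox & Oakes idea extended here can be sketched for complete data: place the sample on the (coefficient of variation, skewness) plane and compare it with the theoretical curves of candidate families; for the exponential distribution the theoretical point is (1, 2). The moment formulas below are standard; the censoring and kernel-estimation extension of the paper is not reproduced.

```python
import math

# Sketch: sample coefficient of variation and skewness, the coordinates of
# the Cox & Oakes model-discrimination plot (complete-data case only).

def cv_and_skewness(data):
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n   # second central moment
    m3 = sum((x - mean) ** 3 for x in data) / n   # third central moment
    sd = math.sqrt(m2)
    return sd / mean, m3 / sd ** 3

# A right-skewed sample lands away from symmetric families on the plot:
cv, skew = cv_and_skewness([1, 1, 2, 3, 5, 8, 13, 21])
print(cv, skew)
```

Under censoring, the paper replaces these raw moments with estimates from a competing-risks model and kernel smoothing, which is what makes the plot usable for heavily censored survival data.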