Results 1–9 of 9
Akaike’s information criterion and recent developments in information complexity
 Journal of Mathematical Psychology
Cited by 112 (9 self)

Abstract
criterion (AIC). Then, we present some recent developments on a new entropic or information complexity (ICOMP) criterion of Bozdogan (1988a, 1988b, 1990, 1994d, 1996, 1998a, 1998b) for model selection. A rationale for ICOMP as a model selection criterion is that it combines a badness-of-fit term (such as minus twice the maximum log-likelihood) with a measure of model complexity differently than AIC or its variants, by taking into account the interdependencies of the parameter estimates as well as the dependencies of the model residuals. We operationalize the general form of ICOMP based on the quantification of overall model complexity in terms of the estimated inverse-Fisher information matrix. This approach results in an approximation to the sum of two Kullback-Leibler distances. Using the correlational form of the complexity, we further provide yet another form of ICOMP to take into account the interdependencies (i.e., correlations) among the parameter estimates of the model. Later, we illustrate the practical utility and the importance of this new model selection criterion by providing several
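The badness-of-fit-plus-penalty structure described in this abstract can be sketched numerically. The following minimal illustration covers only the classical AIC term (not Bozdogan's ICOMP, whose complexity penalty additionally requires the estimated inverse-Fisher information matrix); the Gaussian model and all data are hypothetical.

```python
import numpy as np

def aic(log_likelihood, k):
    # AIC = badness-of-fit (-2 log L) plus a fixed complexity penalty (2k parameters)
    return -2.0 * log_likelihood + 2.0 * k

def gaussian_loglik(x, mu, sigma):
    # Log-likelihood of an i.i.d. Gaussian sample under (mu, sigma)
    return np.sum(-0.5 * np.log(2.0 * np.pi * sigma**2)
                  - (x - mu) ** 2 / (2.0 * sigma**2))

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.0, size=200)  # illustrative data

# Candidate 1: free mean and variance (k = 2), fitted by maximum likelihood
ll_full = gaussian_loglik(x, x.mean(), x.std())
# Candidate 2: mean forced to 0 (k = 1) -- a deliberately misspecified model
ll_zero = gaussian_loglik(x, 0.0, np.sqrt(np.mean(x**2)))

print(aic(ll_full, 2), aic(ll_zero, 1))  # the model with the smaller AIC is preferred
```

Here the gain in fit from the free mean far outweighs the extra parameter, so the two-parameter model wins; criteria such as ICOMP replace the fixed 2k term with a data-dependent complexity measure.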
Dynamic Bayesian Network and Nonparametric Regression for Nonlinear Modeling of Gene Networks from Time Series Gene Expression Data
Biosystems, 2003
Cited by 77 (12 self)

Abstract
Abstract. We propose a dynamic Bayesian network and nonparametric regression model for constructing a gene network from time series microarray gene expression data. The proposed method overcomes a shortcoming of the Bayesian network model, namely its inability to represent cyclic regulations. It treats the microarray data as continuous and can capture even nonlinear relations among genes, and can therefore be expected to give deeper insight into complicated biological systems. We also derive a new criterion for evaluating an estimated network from a Bayesian approach. We demonstrate the effectiveness of our method by analyzing Saccharomyces cerevisiae gene expression data.
Bayesian network and nonparametric heteroscedastic regression for nonlinear modeling of genetic network
Proc. 1st IEEE Computer Society Bioinformatics Conference, 2002
Cited by 48 (19 self)

Abstract
We propose a new statistical method for constructing a genetic network from microarray gene expression data by using a Bayesian network. An essential point of Bayesian network construction is the estimation of the conditional distribution of each random variable. We consider fitting nonparametric regression models with heterogeneous error variances to the microarray gene expression data to capture the nonlinear structures between genes. A problem still remains in selecting an optimal graph, which gives the best representation of the system among genes. We theoretically derive a new graph selection criterion from a Bayesian approach in general situations. The proposed method includes previous methods based on Bayesian networks. We demonstrate the effectiveness of the proposed method through the analysis of Saccharomyces cerevisiae gene expression data newly obtained by disrupting 100 genes.
Bayesian Statistics
in WWW, Computing Science and Statistics, 1989
Cited by 32 (1 self)

Abstract
This dissertation presents two topics from opposite disciplines: one from a parametric realm, the other based on nonparametric methods. The first topic is a jackknife maximum likelihood approach to statistical model selection; the second is a convex hull peeling depth approach to nonparametric massive multivariate data analysis, including simulations and applications on massive astronomical data. First, we present a model selection criterion that minimizes the Kullback-Leibler distance by using the jackknife method. Various model selection methods have been developed to choose a model of minimum Kullback-Leibler distance to the true model, such as the Akaike information criterion (AIC), Bayesian information criterion (BIC), minimum description length (MDL), and bootstrap information criterion. Likewise, the jackknife method chooses a model of minimum Kullback-Leibler distance through bias reduction. This bias, which is inevitable in model
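The bias-reduction mechanism the abstract alludes to can be sketched with the standard jackknife bias estimate. This is a minimal illustration on a hypothetical statistic (the plug-in variance), not the dissertation's actual likelihood-based criterion: the jackknife recovers the plug-in variance's downward bias exactly, so the corrected estimate coincides with the unbiased sample variance.

```python
import numpy as np

def jackknife_bias(data, estimator):
    # Jackknife bias estimate: (n - 1) * (mean of leave-one-out estimates - full-sample estimate)
    n = len(data)
    theta_full = estimator(data)
    loo = np.array([estimator(np.delete(data, i)) for i in range(n)])
    return (n - 1) * (loo.mean() - theta_full)

data = np.array([1.0, 2.0, 4.0, 7.0, 11.0])  # illustrative sample

# Bias-corrected estimate: subtract the estimated bias from the plug-in variance.
# For the variance this correction is exact and matches np.var(data, ddof=1).
corrected = np.var(data) - jackknife_bias(data, np.var)
print(corrected, np.var(data, ddof=1))
```

In the model-selection setting, the same leave-one-out device is applied to the estimated Kullback-Leibler discrepancy rather than to a simple moment statistic.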
Bias Correction of Cross-Validation Criterion Based on Kullback-Leibler Information under a General Condition
Cited by 3 (1 self)

Abstract
This paper deals with the bias correction of the cross-validation (CV) criterion for model choice. The bias correction is based on the predictive Kullback-Leibler information, which measures the discrepancy between the distributions of an observation under a candidate model and under the true model. By replacing the ordinary maximum likelihood estimator with an estimator obtained by maximizing a weighted log-likelihood function, a bias-corrected CV criterion is proposed. This criterion always corrects the bias to O(n⁻²) under a general condition. We verify by numerical experiments that our criterion has smaller bias than the AIC, TIC, EIC and CV criteria.
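The uncorrected CV criterion that this paper improves on can be sketched as follows: each observation is scored under the model fitted by ordinary maximum likelihood to the remaining n−1 points. The Gaussian model and data below are illustrative assumptions; the paper's weighted-likelihood correction is not reproduced.

```python
import numpy as np

def cv_criterion(x):
    # Plain leave-one-out CV criterion on the -2 log-likelihood scale
    # (the same scale as AIC/TIC/EIC, so values are directly comparable).
    n = len(x)
    total = 0.0
    for i in range(n):
        rest = np.delete(x, i)              # fit to the other n-1 observations
        mu, sigma = rest.mean(), rest.std()  # ordinary ML estimates
        total += (-0.5 * np.log(2.0 * np.pi * sigma**2)
                  - (x[i] - mu) ** 2 / (2.0 * sigma**2))
    return -2.0 * total

rng = np.random.default_rng(2)
x = rng.normal(size=100)  # illustrative data
print(cv_criterion(x))
```

The bias the paper targets arises because each held-out point is scored under estimates computed from a slightly smaller sample; reweighting the log-likelihood used in fitting is what pushes the residual bias down to O(n⁻²).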
VARIABLE SELECTION IN LOGISTIC DISCRIMINATION BASED ON LOCAL LIKELIHOOD
Abstract
We consider the variable selection problem in the nonlinear discriminant procedure using local likelihood. The local likelihood method is an effective technique for analyzing data with complex structure, and various bandwidth selection methods have been suggested in recent years. Variable selection in a nonlinear model, however, is more complex than bandwidth selection, since the optimal bandwidth depends on the combination of variables. We propose a technique for variable selection using generalized information criteria in logistic discrimination based on local likelihood. We derive the logistic discrimination method with a sample covariance matrix to account for the correlation of the variables. Real data examples are given to examine the effectiveness of our technique. Key words and phrases: Discriminant analysis, local likelihood, model selection, prediction error estimation.
Bayesian information criteria and smoothing parameter
Abstract
By extending Schwarz’s (1978) basic idea we derive a Bayesian information criterion which enables us to evaluate models estimated by the maximum penalised likelihood method or the method of regularisation. The proposed criterion is applied to the choice of smoothing parameters and the number of basis functions in radial basis function network models. Monte Carlo experiments were conducted to examine the performance of the nonlinear modelling strategy of estimating the weight parameters by regularisation and then determining the adjusted parameters by the Bayesian information criterion. The simulation results show that our modelling procedure performs well in various situations.
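Schwarz's basic idea, which this paper extends, can be sketched in a few lines. This shows only the classical BIC; the paper's extension to penalised likelihood and regularisation replaces the raw parameter count with an effective number of parameters, which is not reproduced here, and the numbers below are purely illustrative.

```python
import numpy as np

def bic(log_likelihood, k, n):
    # Schwarz's criterion: unlike AIC's fixed penalty 2k,
    # the penalty k * log(n) grows with the sample size n.
    return -2.0 * log_likelihood + k * np.log(n)

# Illustrative comparison at n = 100: a 5-parameter model must improve the
# log-likelihood by (5 - 2) * log(100) / 2 ≈ 6.9 over a 2-parameter model to win.
print(bic(-120.0, 2, 100), bic(-115.0, 5, 100))
```

In the smoothing-parameter setting, the same trade-off selects the amount of regularisation and the number of basis functions rather than a discrete subset of parameters.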
APPROVAL
Abstract
this work may be reproduced, without authorization, under the conditions for Fair Dealing. Therefore, limited reproduction of this work for the purposes of private study, research, criticism, review and news reporting is likely to be in accordance with the law, particularly if cited appropriately.
Model Selection Criteria in Generalized Linear Models
Abstract
2. Published papers: (1) Consistent selection of working correlation structure in GEE analysis based on Stein's loss function, S. Imori, Hiroshima Mathematical Journal (2014), to appear.