Results 1–9 of 9
Bayesian Statistics in WWW, Computing Science and Statistics, 1989
Abstract

Cited by 20 (0 self)
∗ Signatures are on file in the Graduate School. This dissertation presents two topics from opposite disciplines: one is from a parametric realm and the other is based on nonparametric methods. The first topic is a jackknife maximum likelihood approach to statistical model selection and the second one is a convex hull peeling depth approach to nonparametric massive multivariate data analysis. The second topic includes simulations and applications on massive astronomical data. First, we present a model selection criterion that minimizes the Kullback-Leibler distance by using the jackknife method. Various model selection methods have been developed to choose a model of minimum Kullback-Leibler distance to the true model, such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), minimum description length (MDL), and the bootstrap information criterion. Likewise, the jackknife method chooses a model of minimum Kullback-Leibler distance through bias reduction. This bias, which is inevitable in model …
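The jackknife idea behind the first topic can be illustrated with a minimal sketch: estimate each candidate model's Kullback-Leibler risk by refitting with one observation deleted at a time and scoring the held-out point. The polynomial-regression setting, Gaussian errors, and the leave-one-out form here are illustrative assumptions, not the dissertation's exact estimator.

```python
import numpy as np

def loo_neg_loglik(x, t, order):
    """Leave-one-out negative predictive log-likelihood of a polynomial
    regression of the given order: a jackknife-style estimate of the
    Kullback-Leibler risk (illustrative sketch only)."""
    n = len(x)
    total = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        coef = np.polyfit(t[mask], x[mask], order)   # refit without point i
        resid = x[mask] - np.polyval(coef, t[mask])
        sigma2 = resid.var()                          # MLE of error variance
        e = x[i] - np.polyval(coef, t[i])             # score the held-out point
        total += 0.5 * (np.log(2 * np.pi * sigma2) + e * e / sigma2)
    return total

# Synthetic check: the true model is a line (order 1).
rng = np.random.default_rng(0)
t = np.linspace(-1, 1, 60)
x = 1.0 + 2.0 * t + rng.normal(scale=0.3, size=t.size)
scores = {k: loo_neg_loglik(x, t, k) for k in range(0, 5)}
best = min(scores, key=scores.get)
print(best)
```

The criterion heavily penalizes the underspecified order-0 model, while overfitted orders gain little held-out likelihood.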
ON ICA OF COMPLEX-VALUED FMRI: ADVANTAGES AND ORDER SELECTION
Abstract

Cited by 2 (2 self)
Functional magnetic resonance imaging (fMRI) data are originally acquired as complex-valued images, while virtually all fMRI studies use only the magnitude of the data in the analysis. Since little is known for devising models for the phase, independent component analysis (ICA) emerges as a promising technique for data-driven analysis of fMRI data in its native complex form. In this paper, we compare the performance of ICA on real-valued and complex-valued fMRI data and show the advantages of the complex approach. We also develop a complex-valued order selection scheme to improve the estimation of the number of independent components in complex-valued fMRI data using information-theoretic criteria. Comparisons on order selection using real-valued and complex-valued fMRI data demonstrate the more informative nature of complex data.
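The information-theoretic order selection referenced here is classically done by applying MDL to the eigenvalues of the sample covariance (the Wax–Kailath approach). The sketch below is the real-valued baseline only; the paper's complex-valued criterion adjusts the free-parameter count and is not reproduced here.

```python
import numpy as np

def mdl_order(eigvals, n):
    """Wax-Kailath MDL order selection from sample-covariance eigenvalues
    (real-valued baseline sketch): for each candidate order k, compare the
    arithmetic and geometric means of the smallest p - k eigenvalues."""
    p = len(eigvals)
    lam = np.sort(eigvals)[::-1]                 # descending
    scores = []
    for k in range(p):
        tail = lam[k:]
        m = p - k
        geo = np.exp(np.mean(np.log(tail)))      # geometric mean
        ari = np.mean(tail)                      # arithmetic mean
        ll = n * m * np.log(ari / geo)           # sphericity log-likelihood term
        penalty = 0.5 * k * (2 * p - k) * np.log(n)
        scores.append(ll + penalty)
    return int(np.argmin(scores))

# Synthetic check: 3 sources in white noise across 8 channels.
rng = np.random.default_rng(1)
n, p, q = 2000, 8, 3
A = rng.normal(size=(p, q))
S = rng.normal(size=(q, n))
X = A @ S + 0.1 * rng.normal(size=(p, n))
eigvals = np.linalg.eigvalsh(X @ X.T / n)
print(mdl_order(eigvals, n))  # → 3
```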
Noncircular principal component analysis and its application to model selection
 IEEE Trans. Signal Process
, 2011
Abstract

Cited by 2 (1 self)
Abstract—One of the most commonly used data analysis tools, principal component analysis (PCA), since it is based on variance maximization, assumes a circular model and hence cannot account for the potential noncircularity of complex data. In this paper, we introduce noncircular PCA (ncPCA), which extends traditional PCA to the case where both circular and noncircular Gaussian signals may be present in the subspace. We study the properties of ncPCA, introduce an efficient algorithm for its computation, and demonstrate its application to model selection, i.e., the detection of both the signal subspace order and the number of circular and noncircular signals. We present numerical results to demonstrate the advantages of ncPCA over regular PCA when there are noncircular signals in the subspace. At the same time, we note that since a noncircular model has more degrees of freedom than a circular one, there are cases where a circular model might be preferred even though the underlying problem is noncircular. In particular, we show that a circular model is preferred when the signal-to-noise ratio (SNR) is low, the number of samples is small, or the degree of noncircularity of the signals is low. Hence, ncPCA inherently provides guidance as to when to take noncircularity into account. Index Terms—Circularity, noncircularity, order selection, principal component analysis, propriety, signal subspace estimation.
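Noncircularity of a zero-mean complex signal is commonly quantified by the circularity coefficient |E[z²]| / E[|z|²], which is 0 for a proper (circular) signal and 1 for a fully noncircular (rectilinear) one. The helper below is an illustrative diagnostic only, not the ncPCA algorithm itself.

```python
import numpy as np

def circularity_coefficient(z):
    """Sample circularity coefficient |E[z^2]| / E[|z|^2] of a complex
    signal after mean removal: 0 = circular, 1 = rectilinear."""
    z = z - z.mean()
    return abs(np.mean(z * z)) / np.mean(np.abs(z) ** 2)

rng = np.random.default_rng(2)
n = 100_000
circular = rng.normal(size=n) + 1j * rng.normal(size=n)   # proper Gaussian
rectilinear = rng.normal(size=n) * np.exp(1j * 0.7)       # fully noncircular
print(round(circularity_coefficient(circular), 2))     # → 0.0
print(round(circularity_coefficient(rectilinear), 2))  # → 1.0
```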
Criteria for Linear Model Selection Based on Kullback's Symmetric Divergence
Abstract

Cited by 1 (0 self)
…simulation study. Our results indicate that the new criteria perform favorably against their AIC analogues. Key words: AIC, Akaike information criterion, I-divergence, J-divergence, Kullback-Leibler information, regression, relative entropy. Acknowledgments. The author is indebted to the editor, the associate editor, and two referees whose thoughtful suggestions helped to improve the original version of this manuscript. This research was supported by the National Science Foundation, grant DMS-9704436. 1. Introduction. An important component of any linear modeling problem consists of determining an appropriate size and form for the design matrix. Improper specification may substantially impact both estimators of the model parameters and predictors of the response variable: underspecification may lead to results which are severely biased, whereas overspecification may lead to results with unnecessarily high variability. Model selection criteria, such as the Akaike (1973) information …
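For a model with k free parameters, AIC penalizes the maximized log-likelihood by 2k, while the criterion based on Kullback's symmetric divergence (KIC) uses the heavier penalty 3k; the exact small-sample corrections studied in the paper differ. A sketch for Gaussian polynomial regression, with the 3k form taken as an assumption:

```python
import numpy as np

def gaussian_loglik(resid):
    """Maximized Gaussian log-likelihood of a least-squares fit."""
    n = resid.size
    sigma2 = np.mean(resid ** 2)              # MLE of the error variance
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)

def aic_kic(x, t, order):
    """AIC (penalty 2k) and a KIC-style criterion (penalty 3k) for a
    polynomial regression; k counts the coefficients plus the variance.
    Sketch only; the paper derives exact small-sample analogues."""
    coef = np.polyfit(t, x, order)
    ll = gaussian_loglik(x - np.polyval(coef, t))
    k = order + 2
    return -2 * ll + 2 * k, -2 * ll + 3 * k

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 40)
x = 2.0 - t + rng.normal(scale=0.2, size=t.size)  # true model: order 1
for order in range(4):
    print(order, aic_kic(x, t, order))
```

The heavier KIC penalty shifts selection toward more parsimonious models, which is the behavior compared against AIC in the simulation study above.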
Estimation of AR and ARMA models by …
, 2007
Abstract
Abstract: In this paper the stochastic complexity criterion is applied to estimation of the order in AR and ARMA models. The power of the criterion for short strings is illustrated by simulations. It requires an integral of the square root of the Fisher information, which is evaluated by a Monte Carlo technique. The stochastic complexity, which is the negative logarithm of the Normalized Maximum Likelihood universal density function, is given. Also, exact asymptotic formulas for the Fisher information matrix are derived.
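As a point of comparison for the stochastic complexity criterion, the standard BIC baseline for AR order selection can be computed from least-squares fits. This sketch omits the Monte Carlo integral of the square root of the Fisher information that the NML criterion additionally requires.

```python
import numpy as np

def ar_bic(x, p):
    """BIC for a least-squares AR(p) fit: n*log(sigma^2) + p*log(n).
    Baseline only; the stochastic-complexity (NML) criterion additionally
    involves the integral of the square root of the Fisher information."""
    n = len(x)
    if p == 0:
        sigma2 = np.mean(x ** 2)
    else:
        # Regress x[t] on its p previous samples.
        X = np.column_stack([x[p - j - 1:n - j - 1] for j in range(p)])
        y = x[p:]
        a, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = np.mean((y - X @ a) ** 2)
    return n * np.log(sigma2) + p * np.log(n)

rng = np.random.default_rng(4)
n = 1000
x = np.zeros(n)
for t in range(2, n):                     # simulate a stable AR(2)
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()
orders = {p: ar_bic(x, p) for p in range(6)}
print(min(orders, key=orders.get))
```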
FAST AND EFFECTIVE MODEL ORDER SELECTION METHOD TO DETERMINE THE NUMBER OF SOURCES IN A LINEAR TRANSFORMATION MODEL
Abstract
This paper formally introduces the RAE (ratio of adjacent eigenvalues) method for model order selection, and proposes a new approach that combines the recently developed SORTE (Second ORder sTatistic of the Eigenvalues) with RAE for determining the number of sources in a linear transformation model. The rationale for the combination, observed in extensive simulations, is that SORTE overestimates the true model order while RAE underestimates it when the signal-to-noise ratio (SNR) is low. Simulations further showed that after the new method, called RAE-SORTE, was optimized, the true number of sources was almost always correctly estimated even when the SNR was −10 dB, which is extremely difficult for any other model order selection method; moreover, RAE took much less computation time than SORTE, which is itself known for its computational efficiency. Hence, RAE and RAE-SORTE appear promising for real-time and real-world signal processing. Index Terms—Linear transformation model, model order selection, number of sources, ratio of adjacent eigenvalues, signal-to-noise ratio
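Reading RAE as "pick the order at which the ratio of adjacent (descending) eigenvalues is largest" gives the minimal sketch below; the paper's precise decision rule and its combination with SORTE are not reproduced, so treat this as an assumed interpretation.

```python
import numpy as np

def rae_order(eigvals):
    """Estimate the model order as the index of the largest ratio between
    adjacent eigenvalues sorted in descending order (a minimal reading of
    the RAE idea; the paper's exact rule may differ)."""
    lam = np.sort(eigvals)[::-1]
    ratios = lam[:-1] / lam[1:]          # drop between consecutive eigenvalues
    return int(np.argmax(ratios)) + 1

# Synthetic covariance eigenvalues: 4 sources above a near-flat noise floor.
rng = np.random.default_rng(5)
lam = np.concatenate([[9.0, 7.5, 4.0, 2.5],
                      0.1 + 0.01 * rng.random(8)])
print(rae_order(lam))  # → 4 (largest drop is from 2.5 to the noise floor)
```

The attraction over likelihood-based criteria is clear from the sketch: only one pass over the eigenvalue ratios is needed, with no per-order likelihood evaluation.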
where i ∼ U[1, 256] and z0(i) ∼ U[−10, 10]. The sensitivity matrix
Abstract
H0 is chosen as in the static case and the observation noise covariance is taken as Rk = 0.01^2 I_{72×72}. The estimation performance of the CSKFp algorithm in the dynamic case is presented in Fig. 3(a). This figure depicts the mean square estimation error based on N = 50 Monte Carlo runs for N = 100 PM iterations. As can be seen, the best estimation performance in this case is attained by the CSKFp aided by the approximate l0 norm (16). Again, the CSKF1 exhibits the worst performance compared to the other filters. The attainable estimation errors in this case are slightly higher than in the static case. Nevertheless, it seems that the algorithms manage to adequately estimate the behavior of the nonzero processes, as shown for the CSKF1 in Fig. 3(b). D. Application to Image Classification. The CSKF method was applied to image classification in [13]. The classifier derived in [13] utilizes a static version of the CSKF1 (i.e., …
University Ibn Zohr
Abstract
ABSTRACT. Recently, Azari et al. (2006) showed that the AIC criterion and its corrected versions cannot be directly applied to model selection for longitudinal data with correlated errors. They proposed two model selection criteria, AICc and RICc, by applying likelihood and residual likelihood approaches. These two criteria are estimators of the Kullback-Leibler divergence, which is asymmetric. In this work, we apply the likelihood and residual likelihood approaches to propose two new criteria, suitable for small-sample longitudinal data, based on Kullback's symmetric divergence. Their performance relative to other criteria is examined in a large simulation study.
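For reference, the classical small-sample correction of AIC (Hurvich and Tsai) that these criteria generalize is AICc = AIC + 2k(k+1)/(n − k − 1). The paper's AICc/RICc for correlated longitudinal errors carry additional terms, so this is only the generic formula:

```python
def aicc(loglik, k, n):
    """Generic small-sample corrected AIC:
    AICc = -2*loglik + 2k + 2k(k+1)/(n - k - 1).
    The longitudinal-data criteria in the paper add further terms."""
    aic = -2.0 * loglik + 2.0 * k
    return aic + 2.0 * k * (k + 1) / (n - k - 1)

print(round(aicc(loglik=-50.0, k=3, n=20), 3))  # → 107.5
```

The correction term vanishes as n grows relative to k, recovering plain AIC for large samples.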