Results 1–5 of 5
Bayesian Quadratic Discriminant Analysis
Journal of Machine Learning Research, 2007
Abstract

Cited by 18 (5 self)
Quadratic discriminant analysis is a common tool for classification, but estimation of the Gaussian parameters can be ill-posed. This paper contains theoretical and algorithmic contributions to Bayesian estimation for quadratic discriminant analysis. A distribution-based Bayesian classifier is derived using information geometry. Using a calculus-of-variations approach to define a functional Bregman divergence for distributions, it is shown that the Bayesian distribution-based classifier that minimizes the expected Bregman divergence of each class conditional distribution also minimizes the expected misclassification cost. A series approximation is used to relate regularized discriminant analysis to Bayesian discriminant analysis. A new Bayesian quadratic discriminant analysis classifier is proposed in which the prior is defined using a coarse estimate of the covariance based on the training data; this classifier is termed BDA7. Results on benchmark data sets and simulations show that BDA7's performance is competitive with, and in some cases significantly better than, regularized quadratic discriminant analysis and the cross-validated Bayesian quadratic discriminant analysis classifier Quadratic Bayes.
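The core idea — regularizing each class covariance toward a coarse data-based estimate before applying the QDA decision rule — can be sketched as follows. This is a minimal illustration of that general idea only, not the BDA7 estimator from the paper; the function names, the pooled-diagonal choice of coarse covariance, and the blending weight `lam` are all assumptions made for the sketch.

```python
import numpy as np

def shrunk_qda_fit(X, y, lam=0.5):
    """Fit Gaussian class models, shrinking each class covariance toward a
    coarse pooled-diagonal estimate (illustrative; not the paper's BDA7)."""
    classes = np.unique(y)
    d = X.shape[1]
    # coarse covariance: per-feature variances of all data, on the diagonal
    coarse = np.diag(X.var(axis=0)) + 1e-6 * np.eye(d)
    params = {}
    for c in classes:
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        S = np.cov(Xc, rowvar=False)
        cov = (1 - lam) * S + lam * coarse   # shrinkage regularization
        params[c] = (mu, cov, len(Xc) / len(X))
    return params

def shrunk_qda_predict(params, X):
    """Assign each row to the class with the highest Gaussian log-posterior."""
    scores = []
    for c, (mu, cov, prior) in params.items():
        diff = X - mu
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        ll = -0.5 * (np.einsum('ij,jk,ik->i', diff, inv, diff) + logdet)
        scores.append(ll + np.log(prior))
    return np.array(list(params))[np.argmax(scores, axis=0)]
```

The shrinkage keeps every class covariance invertible even when a class has fewer samples than feature dimensions, which is the ill-posedness the abstract refers to.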
Distribution-based Bayesian minimum expected risk for discriminant analysis
In Proc. IEEE Int. Symp. Inf. Theory
Abstract

Cited by 8 (4 self)
This paper considers a distribution-based Bayesian estimation for classification by quadratic discriminant analysis, instead of the standard parameter-based Bayesian estimation. This approach also yields closed-form solutions, but removes the parameter-based restriction of requiring more training samples than feature dimensions. We investigate how to define a prior so that it has an adaptively regularizing effect: yielding robust estimation when the number of training samples is small compared to the number of feature dimensions, but converging as the number of data points grows large. Comparative performance on a suite of simulations shows that the distribution-based Bayesian discriminant analysis is advantageous in terms of average error.
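The adaptively regularizing behavior described above can be seen in the simplest conjugate case: for zero-mean Gaussian data with an inverse-Wishart prior IW(Ψ, ν) on the covariance, the posterior mean is (Ψ + S)/(ν + n − d − 1), which is dominated by the prior when n is small and converges to the sample covariance as n grows. This is a standard textbook sketch of that effect, not the paper's distribution-based estimator; the function name and hyperparameter choices are illustrative.

```python
import numpy as np

def iw_posterior_mean_cov(X, Psi, nu):
    """Posterior-mean covariance under a conjugate inverse-Wishart prior
    IW(Psi, nu), assuming zero-mean Gaussian data (illustrative sketch)."""
    n, d = X.shape
    S = X.T @ X                       # scatter matrix
    return (Psi + S) / (nu + n - d - 1)
```

With n < d the sample scatter alone is singular, but the prior keeps the estimate positive definite; with large n the prior's contribution vanishes.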
Smart PCA
Abstract
PCA can be made smarter so that it produces more sensible projections. In this paper, we propose smart PCA, an extension of standard PCA that regularizes model estimation and incorporates external knowledge into it. Based on the probabilistic interpretation of PCA, the inverse-Wishart distribution can be used as the informative conjugate prior for the population covariance, with useful knowledge carried by the prior hyperparameters. We design the hyperparameters to smoothly combine the information from both the domain knowledge and the data itself. The Bayesian point estimate of the principal components is in closed form. In empirical studies, smart PCA shows clear improvement on three different criteria: image reconstruction error, the perceptual quality of the reconstructed images, and pattern recognition performance.
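The closed-form point estimate the abstract mentions amounts to blending a prior covariance (carrying the external knowledge) with the sample scatter, then taking the top eigenvectors. The sketch below shows that pattern under an inverse-Wishart prior; it is an assumption-laden illustration, not the authors' hyperparameter design — `smart_pca_components`, the blending formula, and the choice of ν are all made up for the example.

```python
import numpy as np

def smart_pca_components(X, prior_cov, nu, k):
    """Point estimate of the top-k principal directions using an
    inverse-Wishart prior on the covariance (illustrative sketch)."""
    n, d = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc                                  # scatter matrix
    # posterior-mean-style blend of prior covariance and data scatter
    cov_hat = ((nu - d - 1) * prior_cov + S) / (nu + n - d - 2)
    vals, vecs = np.linalg.eigh(cov_hat)
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:k]]                      # top-k principal directions
```

When the prior is weak or n is large, this reduces to ordinary PCA on the sample covariance; a strong, structured prior pulls the components toward directions favored by the domain knowledge.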
Monográfico: Métodos Bayesianos en las Ciencias
LOGISTIC DISCRIMINATION WITH MANY VARIABLES
Abstract
Motivated by problems in near-infrared spectroscopy, we study the discrimination problem with several groups and very many predictor variables. A Bayesian version of logistic regression with a ridge-type prior distribution on the coefficients is shown to give realistic group membership probabilities in a spectroscopic example. We compare two versions of these probabilities, one using plug-in estimates of the regression parameters and the other a Laplace approximation to the true predictive probabilities.
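The plug-in version of these probabilities corresponds to fitting a MAP estimate of the coefficients under a Gaussian (ridge) prior and substituting it into the logistic formula. The sketch below shows that for the two-group case via simple gradient descent; it is a generic illustration under assumed names and hyperparameters, not the authors' procedure, and it omits the multi-group and Laplace-approximation parts.

```python
import numpy as np

def ridge_logistic_fit(X, y, lam=1.0, lr=0.1, iters=2000):
    """MAP estimate for two-class logistic regression with a Gaussian
    (ridge) prior on the coefficients, via gradient descent (sketch)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (p - y) + lam * w    # gradient of neg. log-posterior
        w -= lr * grad / n
    return w
```

The ridge penalty `lam` is what keeps the fit stable when there are many more predictor variables than samples, the regime the abstract is concerned with; plug-in class probabilities are then `1 / (1 + exp(-X @ w))`.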