## Beyond Eigenfaces: Probabilistic Matching for Face Recognition (1998)

Citations: 107 (2 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Moghaddam98beyondeigenfaces,
  author    = {Baback Moghaddam and Wasiuddin Wahid and Alex Pentland},
  title     = {Beyond Eigenfaces: Probabilistic Matching for Face Recognition},
  booktitle = {},
  year      = {1998},
  pages     = {30--35}
}
```

### Abstract

We propose a novel technique for direct visual matching of images for the purposes of face recognition and database search. Specifically, we argue in favor of a probabilistic measure of similarity, in contrast to simpler methods which are based on standard L2 norms (e.g., template matching) or subspace-restricted norms (e.g., eigenspace matching). The proposed similarity measure is based on a Bayesian analysis of image differences: we model two mutually exclusive classes of variation between two facial images: intra-personal (variations in appearance of the same individual, due to different expressions or lighting) and extra-personal (variations in appearance due to a difference in identity). The high-dimensional probability density functions for each respective class are then obtained from training data using an eigenspace density estimation technique and subsequently used to compute a similarity measure based on the a posteriori probability of membership in the intra-personal class…
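The similarity measure described in the abstract — the posterior probability that a difference image Δ = I1 − I2 belongs to the intra-personal class Ω_I rather than the extra-personal class Ω_E — can be illustrated with a toy sketch. This is a minimal illustration only, assuming full-covariance Gaussian densities fitted directly to difference vectors (the paper instead estimates the densities with an eigenspace technique); the function names are hypothetical.

```python
import numpy as np

def gaussian_logpdf(x, mean, cov):
    """Log-density of a multivariate Gaussian with full covariance."""
    d = x - mean
    k = len(mean)
    _, logdet = np.linalg.slogdet(cov)
    quad = d @ np.linalg.solve(cov, d)
    return -0.5 * (k * np.log(2 * np.pi) + logdet + quad)

def bayesian_similarity(delta, mu_I, cov_I, mu_E, cov_E, prior_I=0.5):
    """P(Omega_I | Delta) by Bayes' rule over the two difference classes,
    computed in log space for numerical stability."""
    log_pI = gaussian_logpdf(delta, mu_I, cov_I) + np.log(prior_I)
    log_pE = gaussian_logpdf(delta, mu_E, cov_E) + np.log(1.0 - prior_I)
    m = max(log_pI, log_pE)  # log-sum-exp trick
    return np.exp(log_pI - m) / (np.exp(log_pI - m) + np.exp(log_pE - m))
```

A small difference vector should score close to 1 (same person), a large one close to 0, when the intra-personal density is the tighter of the two.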

### Citations

2911 | Eigenfaces for recognition
- Turk, Pentland
- 1991
Citation Context: …ents were geometrically aligned and normalized in this manner prior to further analysis. 3.1 Eigenface Matching As a baseline comparison, we first used an eigenface matching technique for recognition [9]. The normalized images from the gallery and the probe sets were projected onto a 100-dimensional eigenspace and a nearest-neighbor rule based on a Euclidean distance measure was used to match each pr…
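The baseline in this snippet — projecting gallery and probe images onto a low-dimensional eigenspace and matching each probe to its nearest gallery image in Euclidean distance — can be sketched in NumPy. This is an illustration under assumptions (images flattened to rows, basis from an SVD of the centered gallery); `eigenface_match` is a hypothetical name, not the paper's code.

```python
import numpy as np

def eigenface_match(gallery, probes, n_components=100):
    """Nearest-neighbor eigenface matching.

    gallery, probes : arrays of shape (n_images, n_pixels)
    Returns, for each probe, the index of the closest gallery image
    in the n_components-dimensional principal subspace."""
    mean = gallery.mean(axis=0)
    X = gallery - mean
    # Principal directions = right singular vectors of the centered gallery.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    W = Vt[:n_components].T                    # (n_pixels, M) basis
    g = X @ W                                  # gallery coefficients
    p = (probes - mean) @ W                    # probe coefficients
    d2 = ((p[:, None, :] - g[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1)                   # best gallery match per probe
```

With lightly perturbed copies of the gallery as probes, each probe should map back to its own gallery image.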

2155 | Principal Component Analysis
- Jolliffe
- 2002
Citation Context: …proposed by Moghaddam & Pentland [6] which divides the vector space R^N into two complementary subspaces using an eigenspace decomposition. This method relies on a Principal Components Analysis (PCA) [4] to form a low-dimensional estimate of the complete likelihood which can be evaluated using only the first M principal components, where M << N. This decomposition is illustrated in Figure 1 which sho…
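The two-subspace likelihood estimate this snippet refers to can be sketched as follows: a Gaussian over the first M principal components, combined with an isotropic Gaussian over the orthogonal complement whose variance ρ is the average of the discarded eigenvalues. This is a hedged NumPy illustration of that form; the function name and argument layout are assumptions, not the paper's code.

```python
import numpy as np

def two_subspace_loglik(x, mean, eigvals, eigvecs, M):
    """Log-likelihood estimate using only the first M principal components.

    eigvecs : (N, N) matrix whose columns are unit eigenvectors, sorted
              by decreasing eigenvalue; eigvals : the N eigenvalues.
    The density is a Gaussian in the M-dim principal subspace times an
    isotropic Gaussian (variance rho = mean of discarded eigenvalues)
    in the (N - M)-dim orthogonal complement."""
    N = len(mean)
    xc = x - mean
    y = eigvecs[:, :M].T @ xc            # coefficients in principal subspace
    eps2 = xc @ xc - y @ y               # residual energy in the complement
    rho = eigvals[M:].mean()             # average discarded eigenvalue
    in_space = -0.5 * (np.sum(y**2 / eigvals[:M])
                       + M * np.log(2 * np.pi)
                       + np.sum(np.log(eigvals[:M])))
    out_space = -0.5 * (eps2 / rho + (N - M) * np.log(2 * np.pi * rho))
    return in_space + out_space
```

A quick sanity check: when all eigenvalues are equal, the estimate coincides with the exact isotropic Gaussian log-density regardless of M.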

1573 | Eigenfaces vs. Fisherfaces: recognition using class specific linear projection
- Belhumeur, Hespanha, et al.
- 1997
Citation Context: …ty estimation of high-dimensional data [6]. This Bayesian (MAP) approach can also be viewed as a generalized nonlinear extension of Linear Discriminant Analysis (LDA) [8, 3] or "FisherFace" techniques [1] for face recognition. Moreover, our nonlinear generalization has distinct computational/storage advantages over these linear methods for large databases. 2 Analysis of Intensity Differences We now co…

776 | The FERET evaluation methodology for face-recognition algorithms
- Phillips, Moon, et al.
Citation Context: …the September 1996 FERET competition (with subspace dimensionalities of M_I = M_E = 125) and was found to be the top-performing system by a typical margin of 10-20% over the other competing algorithms [7] (see Figure 7). Figure 8 shows the performance comparison between standard eigenfaces and the Bayesian method from this test. Note the 10% gain in performance afforded by the new Bayesian similarity …

579 | Probabilistic visual learning for object recognition
- Moghaddam, Pentland
- 1997
Citation Context: …using estimates of the likelihoods P(Δ|Ω_I) and P(Δ|Ω_E) which are derived from training data using an efficient subspace method for density estimation of high-dimensional data [6]. This Bayesian (MAP) approach can also be viewed as a generalized nonlinear extension of Linear Discriminant Analysis (LDA) [8, 3] or "FisherFace" techniques [1] for face recognition. Moreover, our n…

407 | Using discriminant eigenfeatures for image retrieval
- Swets, Weng
- 1996
Citation Context: …fficient subspace method for density estimation of high-dimensional data [6]. This Bayesian (MAP) approach can also be viewed as a generalized nonlinear extension of Linear Discriminant Analysis (LDA) [8, 3] or "FisherFace" techniques [1] for face recognition. Moreover, our nonlinear generalization has distinct computational/storage advantages over these linear methods for large databases. 2 Analysis of …

198 | Discriminant analysis for recognition of human face images
- Etemad, Chellappa
- 1997
Citation Context: …fficient subspace method for density estimation of high-dimensional data [6]. This Bayesian (MAP) approach can also be viewed as a generalized nonlinear extension of Linear Discriminant Analysis (LDA) [8, 3] or "FisherFace" techniques [1] for face recognition. Moreover, our nonlinear generalization has distinct computational/storage advantages over these linear methods for large databases. 2 Analysis of …

89 | Face Recognition using View-based and Modular Eigenspaces
- Moghaddam, Pentland
- 1994
Citation Context: …nition rates obtained by any algorithm tested on this database (Figure 4: Standard Eigenfaces), and that it is lower (by about 10%) than the typical rates that we have obtained with the FERET database [5]. We attribute this lower performance to the fact that these images were selected to be particularly challenging. In fact, using an eigenface method to match the first views of the 76 individuals in t…

67 | Face recognition: features vs. templates
- Brunelli, Poggio
- 1993
Citation Context: …age database retrieval often make use of simple image similarity metrics such as Euclidean distance or normalized correlation, which correspond to a standard template-matching approach to recognition [2]. For example, in its simplest form, the similarity measure S(I1, I2) between two images I1 and I2 can be set to be inversely proportional to the norm ||I1 − I2||. Such a simple formulation su…