Results 1–10 of 175
Design Galleries: A General Approach to Setting Parameters for Computer Graphics and Animation
, 1997
"... Image rendering maps scene parameters to output pixel values; animation maps motioncontrol parameters to trajectory values. Because these mapping functions are usually multidimensional, nonlinear, and discontinuous, #nding input parameters that yield desirable output values is often a painful pr ..."
Abstract

Cited by 193 (3 self)
Image rendering maps scene parameters to output pixel values; animation maps motion-control parameters to trajectory values. Because these mapping functions are usually multidimensional, nonlinear, and discontinuous, finding input parameters that yield desirable output values is often a painful process of manual tweaking. Interactive evolution and inverse design are two general methodologies for computer-assisted parameter setting in which the computer plays a prominent role. In this paper we present another such methodology.
A general coefficient of similarity and some of its properties
 Biometrics
, 1971
"... Biometrics is currently published by International Biometric Society. Your use of the JSTOR archive indicates your acceptance of JSTOR's Terms and Conditions of Use, available at ..."
Abstract

Cited by 112 (0 self)
Biometrics is currently published by International Biometric Society. Your use of the JSTOR archive indicates your acceptance of JSTOR's Terms and Conditions of Use, available at
Three-Dimensional Face Recognition
, 2005
"... An expressioninvariant 3D face recognition approach is presented. Our basic assumption is that facial expressions can be modelled as isometries of the facial surface. This allows to construct expressioninvariant representations of faces using the bendinginvariant canonical forms approach. The re ..."
Abstract

Cited by 105 (22 self)
An expression-invariant 3D face recognition approach is presented. Our basic assumption is that facial expressions can be modelled as isometries of the facial surface. This allows us to construct expression-invariant representations of faces using the bending-invariant canonical forms approach. The result is an efficient and accurate face recognition algorithm, robust to facial expressions, that can distinguish between identical twins (the first two authors). We demonstrate a prototype system based on the proposed algorithm and compare its performance to classical face recognition methods. The numerical methods employed by our approach do not require the facial surface explicitly. The surface gradient field, or the surface metric, is sufficient for constructing the expression-invariant representation of any given face. This allows us to perform the 3D face recognition task while avoiding the surface reconstruction stage.
An optimization criterion for generalized discriminant analysis on undersampled problems
 IEEE Trans. Pattern Analysis and Machine Intelligence
, 2004
"... Abstract—An optimization criterion is presented for discriminant analysis. The criterion extends the optimization criteria of the classical Linear Discriminant Analysis (LDA) through the use of the pseudoinverse when the scatter matrices are singular. It is applicable regardless of the relative size ..."
Abstract

Cited by 28 (8 self)
An optimization criterion is presented for discriminant analysis. The criterion extends the optimization criteria of the classical Linear Discriminant Analysis (LDA) through the use of the pseudoinverse when the scatter matrices are singular. It is applicable regardless of the relative sizes of the data dimension and sample size, overcoming a limitation of classical LDA. The optimization problem can be solved analytically by applying the Generalized Singular Value Decomposition (GSVD) technique. The pseudoinverse has been suggested and used for undersampled problems in the past, where the data dimension exceeds the number of data points. The criterion proposed in this paper provides a theoretical justification for this procedure. An approximation algorithm for the GSVD-based approach is also presented. It reduces the computational complexity by finding subclusters of each cluster and uses their centroids to capture the structure of each cluster. This reduced problem yields much smaller matrices to which the GSVD can be applied efficiently. Experiments on text data, with up to 7,000 dimensions, show that the approximation algorithm produces results that are close to those produced by the exact algorithm. Index Terms: Classification, clustering, dimension reduction, generalized singular value decomposition, linear discriminant analysis, text mining.
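The pseudoinverse idea in this abstract can be illustrated numerically: when the within-class scatter matrix is singular (more dimensions than samples), its pseudoinverse replaces the ordinary inverse in the classical LDA eigenproblem. The sketch below is a minimal NumPy illustration of that criterion, not the paper's GSVD-based algorithm; the function name and setup are assumptions.

```python
import numpy as np

def lda_pinv(X, y, k):
    """Discriminant directions as the leading eigenvectors of
    pinv(Sw) @ Sb, usable even when Sw is singular (the undersampled
    case where dimension exceeds sample size)."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))   # within-class scatter
    Sb = np.zeros((d, d))   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # the pseudoinverse stands in for the inverse when Sw is singular
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-evals.real)
    return evecs[:, order[:k]].real
```

With exact high-dimensional data (say, 4 samples in 5 dimensions) classical LDA would fail because Sw cannot be inverted; the pseudoinverse version still returns a projection.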
Optimizing Ranking Functions: A Connectionist Approach to Adaptive Information Retrieval
 DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING, THE UNIVERSITY OF CALIFORNIA, SAN DIEGO
, 1994
"... This dissertation examines the use of adaptive methods to automatically improve the performance of ranked text retrieval systems. The goal of a ranked retrieval system is to manage a large collection of text documents and to order documents for a user based on the estimated relevance of the document ..."
Abstract

Cited by 26 (5 self)
This dissertation examines the use of adaptive methods to automatically improve the performance of ranked text retrieval systems. The goal of a ranked retrieval system is to manage a large collection of text documents and to order documents for a user based on the estimated relevance of the documents to the user's information need (or query). The ordering enables the user to quickly find documents of interest. Ranked retrieval is a difficult problem because of the ambiguity of natural language, the large size of the collections, and because of the varying needs of users and varying collection characteristics. We propose and empirically validate general adaptive methods which improve the ability of a large class of retrieval systems to rank documents effectively. Our main adaptive method is to numerically optimize free parameters in a retrieval system by minimizing a nonmetric criterion function. The criterion measures how well the system is ranking documents relative to a target ordering, defined by a set of training queries which include the users' desired document orderings. Thus, the system learns parameter settings which better enable it to rank relevant documents before irrelevant ones. The nonmetric approach is interesting because it is a general adaptive method, an alternative to supervised methods for training neural networks in domains in which rank order or prioritization is important. A second adaptive method is also examined, which is applicable to a restricted class of retrieval systems but which permits an analytic solution. The adaptive methods are applied to a number of problems in text retrieval to validate their utility and practical efficiency. The applications include: A dimensionality reduction of vector-based document representations to a vector spa...
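The general idea of optimizing free parameters against a rank-order criterion can be shown with a much simpler stand-in: a linear scoring function trained with a pairwise update so that relevant documents score above irrelevant ones. This is a hedged sketch, not the dissertation's nonmetric method; the hinge-style loss, function name, and hyperparameters are assumptions.

```python
import numpy as np

def train_ranker(X_rel, X_irr, epochs=200, lr=0.1):
    """Learn weights w so relevant documents outscore irrelevant
    ones: a pairwise, perceptron-style instance of optimizing a
    ranking function against a target ordering."""
    w = np.zeros(X_rel.shape[1])
    for _ in range(epochs):
        for r in X_rel:
            for i in X_irr:
                # update on pairs ranked in the wrong order (margin 1)
                if r @ w - i @ w < 1.0:
                    w += lr * (r - i)
    return w
```

After training on a few labeled query results, every relevant document's score exceeds every irrelevant one's, which is exactly the ordering criterion the abstract describes.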
A Stochastic Self-Organizing Map for Proximity Data
 Neural Computation
, 1999
"... We derive an efficient algorithm for topographic mapping of proximity data (TMP), which can be seen as an extension of Kohonen's SelfOrganizing Map to arbitrary distance measures. The TMP cost function is derived in a Baysian framework of Folded Markov Chains for the description of autoencoders ..."
Abstract

Cited by 25 (7 self)
We derive an efficient algorithm for topographic mapping of proximity data (TMP), which can be seen as an extension of Kohonen's Self-Organizing Map to arbitrary distance measures. The TMP cost function is derived in a Bayesian framework of Folded Markov Chains for the description of autoencoders. It incorporates the data via a dissimilarity matrix D and the topographic neighborhood via a matrix H of transition probabilities. From the principle of Maximum Entropy a non-factorizing Gibbs distribution is obtained, which is approximated in a mean-field fashion. This allows for Maximum Likelihood estimation using an EM algorithm. In analogy to the transition from Topographic Vector Quantization (TVQ) to the Self-Organizing Map (SOM) we suggest an approximation to TMP which is computationally more efficient. In order to prevent convergence to local minima, an annealing scheme in the temperature parameter is introduced, for which the critical temperature of the first phase transition is calcul...
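For orientation, the sketch below is the classical vector-space Kohonen SOM that TMP generalizes to data given only as a dissimilarity matrix; the mean-field/EM updates of TMP itself are not reproduced. The function name, the 1-D map, and the annealing rates are illustrative assumptions.

```python
import numpy as np

def som_1d(data, n_units=5, epochs=50, sigma=1.0, lr=0.5):
    """Classical 1-D Kohonen SOM for vector data; TMP extends this
    idea to arbitrary proximity (dissimilarity) data."""
    rng = np.random.default_rng(0)
    W = rng.standard_normal((n_units, data.shape[1]))  # prototype vectors
    units = np.arange(n_units)                         # 1-D map topology
    for _ in range(epochs):
        for x in data:
            winner = np.argmin(np.linalg.norm(W - x, axis=1))
            # Gaussian neighborhood on the map pulls nearby units along
            h = np.exp(-((units - winner) ** 2) / (2 * sigma ** 2))
            W += lr * h[:, None] * (x - W)
        lr *= 0.95      # annealing: shrink the learning rate ...
        sigma *= 0.95   # ... and the neighborhood width over time
    return W
```

The annealing of the neighborhood width plays the same practical role as the temperature annealing mentioned in the abstract: it helps avoid poor local minima early in training.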
Information Retrieval Perspective to Nonlinear Dimensionality Reduction for Data Visualization
"... Nonlinear dimensionality reduction methods are often used to visualize highdimensional data, although the existing methods have been designed for other related tasks such as manifold learning. It has been difficult to assess the quality of visualizations since the task has not been welldefined. We ..."
Abstract

Cited by 24 (3 self)
Nonlinear dimensionality reduction methods are often used to visualize high-dimensional data, although the existing methods have been designed for other related tasks such as manifold learning. It has been difficult to assess the quality of visualizations since the task has not been well-defined. We give a rigorous definition for a specific visualization task, resulting in quantifiable goodness measures and new visualization methods. The task is information retrieval given the visualization: to find similar data based on the similarities shown on the display. The fundamental tradeoff between precision and recall of information retrieval can then be quantified in visualizations as well. The user needs to give the relative cost of missing similar points vs. retrieving dissimilar points, after which the total cost can be measured. We then introduce a new method, NeRV (neighbor retrieval visualizer), which produces an optimal visualization by minimizing the cost. We further derive a variant for supervised visualization; class information is taken rigorously into account when computing the similarity relationships. We show empirically that the unsupervised version outperforms existing unsupervised dimensionality reduction methods in the visualization task, and the supervised version outperforms existing supervised methods.
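The retrieval view of visualization can be made concrete: treat each point's k nearest neighbors on the display as the retrieved set and its k nearest neighbors in the original space as the relevant set, then measure their overlap. This is a minimal fixed-k sketch of that idea, not the NeRV cost function itself; names and the fixed-k setup are assumptions.

```python
import numpy as np

def neighborhood_precision(X_high, X_low, k=3):
    """Mean precision of display neighborhoods: fraction of each
    point's k display neighbors that are also among its k neighbors
    in the original high-dimensional space."""
    def knn(X):
        D = np.linalg.norm(X[:, None] - X[None, :], axis=2)
        np.fill_diagonal(D, np.inf)        # exclude the point itself
        return np.argsort(D, axis=1)[:, :k]
    rel, ret = knn(X_high), knn(X_low)     # relevant vs. retrieved sets
    return np.mean([len(set(a) & set(b)) / k for a, b in zip(rel, ret)])
```

With equal-size relevant and retrieved sets, recall equals precision; NeRV's contribution is letting the user trade the two off via a relative cost rather than fixing them.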
Spectral embedding of graphs
 PATTERN RECOGNITION
, 2003
"... In this paper we explore how to embed symbolic relational graphs with unweighted edges in a patternspace. We adopt a graphspectral approach. We use the leading eigenvectors of the graph adjacency matrix to define eigenmodes of the adjacency matrix. For each eigenmode, we compute vectors of spectra ..."
Abstract

Cited by 23 (4 self)
In this paper we explore how to embed symbolic relational graphs with unweighted edges in a pattern space. We adopt a graph-spectral approach. We use the leading eigenvectors of the graph adjacency matrix to define eigenmodes of the adjacency matrix. For each eigenmode, we compute vectors of spectral properties. These include the eigenmode perimeter, eigenmode volume, Cheeger number, inter-mode adjacency matrices and inter-mode edge distance. We embed these vectors in a pattern space using two contrasting approaches. The first of these involves performing principal or independent components analysis on the covariance matrix for the spectral pattern vectors. The second approach involves performing multidimensional scaling on the L2 norm for pairs of pattern vectors. We illustrate the utility of the embedding methods on neighbourhood graphs representing the arrangement of corner features in 2D images of 3D polyhedral objects. Two problems are investigated. The first of these is the clustering of graphs representing distinct objects viewed from different directions. The second is the identification of characteristic views of single objects. These two studies reveal that both embedding methods result in well-structured view spaces for graph data extracted from 2D views of 3D objects.
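The starting point of such a graph-spectral approach, using the leading eigenvectors of the adjacency matrix as coordinates for the nodes, can be sketched as follows. The eigenmode perimeter, volume, and Cheeger-number features of the paper are not computed here; the function name is an assumption.

```python
import numpy as np

def spectral_embed(A, k=2):
    """Embed the nodes of an undirected graph using the k leading
    eigenvectors of its symmetric adjacency matrix A."""
    evals, evecs = np.linalg.eigh(A)     # eigenvalues in ascending order
    idx = np.argsort(evals)[::-1][:k]    # pick the k leading eigenmodes
    return evecs[:, idx]                 # one k-D coordinate row per node
```

Because `eigh` returns an orthonormal eigenbasis, the embedding coordinates are mutually orthogonal, which is convenient for the downstream PCA/ICA or MDS steps the abstract mentions.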
Applications of Multidimensional Scaling to Molecular Conformation
, 1997
"... Multidimensional scaling (MDS) is a collection of data analytic techniques for constructing configurations of points from information about interpoint distances. Such constructions arise in computational chemistry when one endeavors to infer the conformation (3dimensional structure) of a molecule fr ..."
Abstract

Cited by 22 (5 self)
Multidimensional scaling (MDS) is a collection of data analytic techniques for constructing configurations of points from information about interpoint distances. Such constructions arise in computational chemistry when one endeavors to infer the conformation (3-dimensional structure) of a molecule from information about its interatomic distances. For a number of reasons, this application of MDS poses computational challenges not encountered in more traditional applications. In this report we sketch the mathematical formulation of MDS for molecular conformation problems and describe two approaches that can be employed for their solution.
1 Molecular Conformation
Consider a molecule with n atoms. We can represent its conformation, or 3-dimensional structure, by specifying the coordinates of each atom with respect to a Euclidean coordinate system for R^3. We store these coordinates in an n × 3 configuration matrix X. Given X, we can easily compute the matrix of interatomic distan...
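The basic MDS construction mentioned above, recovering a point configuration from interpoint distances, has a closed-form classical variant via double centering and an eigendecomposition. A hedged sketch of classical MDS follows, not the specialized conformation algorithms of the report; the function name is an assumption.

```python
import numpy as np

def classical_mds(D, k=3):
    """Classical MDS: recover a k-dimensional configuration whose
    interpoint distances reproduce the given distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J           # double-centered Gram matrix
    evals, evecs = np.linalg.eigh(B)
    idx = np.argsort(evals)[::-1][:k]     # k largest eigenpairs
    L = np.sqrt(np.clip(evals[idx], 0, None))
    return evecs[:, idx] * L              # coordinates, up to rigid motion
```

For exact Euclidean distance data (the idealized conformation setting, with k = 3) this recovers the configuration up to rotation, reflection, and translation; the report's computational challenges arise when distances are sparse, noisy, or given only as bounds.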