CiteSeerX

Results 1 - 10 of 1,223

K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation

by Michal Aharon, et al., 2006
"... In recent years there has been a growing interest in the study of sparse representation of signals. Using an overcomplete dictionary that contains prototype signal-atoms, signals are described by sparse linear combinations of these atoms. Applications that use sparse representation are many and inc ..."
Abstract - Cited by 935 (41 self)
signal representations. Given a set of training signals, we seek the dictionary that leads to the best representation for each member in this set, under strict sparsity constraints. We present a new method—the K-SVD algorithm—generalizing the K-means clustering process. K-SVD is an iterative method
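The alternation this snippet describes (sparse coding against a fixed dictionary, then per-atom updates via a rank-1 SVD of the residual) can be sketched in a few lines. The sketch below is an illustrative reconstruction, not the authors' code: it borrows scikit-learn's orthogonal_mp for the sparse-coding stage, and the signal sizes, number of atoms, and sparsity level are arbitrary assumptions.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

def ksvd(Y, n_atoms, sparsity, n_iter=10, seed=0):
    """Minimal K-SVD sketch: alternate sparse coding (OMP) and per-atom SVD updates."""
    rng = np.random.default_rng(seed)
    n_features, n_signals = Y.shape
    # Initialize the dictionary with randomly chosen training signals, unit-normalized.
    D = Y[:, rng.choice(n_signals, n_atoms, replace=False)].astype(float)
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    for _ in range(n_iter):
        # Sparse coding stage: Y ~= D X with at most `sparsity` nonzeros per column.
        X = orthogonal_mp(D, Y, n_nonzero_coefs=sparsity)
        # Dictionary update stage: refine each atom and its coefficients via a rank-1 SVD.
        for k in range(n_atoms):
            users = np.nonzero(X[k, :])[0]          # signals currently using atom k
            if users.size == 0:
                continue
            X[k, users] = 0.0
            E = Y[:, users] - D @ X[:, users]       # residual without atom k's contribution
            U, s, Vt = np.linalg.svd(E, full_matrices=False)
            D[:, k] = U[:, 0]
            X[k, users] = s[0] * Vt[0, :]
    return D, X

# Toy usage: 20-dimensional signals, 50 atoms, 3-sparse representations.
Y = np.random.default_rng(1).normal(size=(20, 200))
D, X = ksvd(Y, n_atoms=50, sparsity=3)
print(np.linalg.norm(Y - D @ X) / np.linalg.norm(Y))   # relative representation error
```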

Manifold regularization: A geometric framework for learning from labeled and unlabeled examples

by Mikhail Belkin, Partha Niyogi, Vikas Sindhwani - Journal of Machine Learning Research, 2006
"... We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning al ..."
Abstract - Cited by 578 (16 self)
We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning
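One concrete instance of this kind of regularization is Laplacian-regularized least squares, where an RKHS norm and a graph-Laplacian smoothness term are penalized together and the expansion coefficients have a closed form. The sketch below is a minimal illustration under stated assumptions: the RBF kernel, the k-NN graph, the function name laprls_fit, and all regularization weights are choices made here, not values from the paper.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.neighbors import kneighbors_graph

def laprls_fit(X, y_labeled, n_labeled, gamma_A=1e-2, gamma_I=1e-1,
               rbf_gamma=1.0, n_neighbors=5):
    """Laplacian-regularized least squares on labeled + unlabeled points (sketch)."""
    n, l = X.shape[0], n_labeled
    K = rbf_kernel(X, X, gamma=rbf_gamma)
    # Graph Laplacian of a symmetrized k-NN graph over all (labeled + unlabeled) points.
    W = kneighbors_graph(X, n_neighbors=n_neighbors, mode='connectivity').toarray()
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(axis=1)) - W
    J = np.zeros((n, n))
    J[:l, :l] = np.eye(l)                    # selects the labeled points
    y = np.zeros(n)
    y[:l] = y_labeled                        # labels in {-1, +1}, zeros for unlabeled
    # alpha = (J K + gamma_A * l * I + gamma_I * l / (l+u)^2 * L K)^{-1} y
    A = J @ K + gamma_A * l * np.eye(n) + (gamma_I * l / n**2) * L @ K
    return np.linalg.solve(A, y), K

# Toy usage: 20 labeled and 80 unlabeled points from two Gaussian blobs.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
X[:50] += 3.0
y_lab = np.r_[np.ones(10), -np.ones(10)]
X = np.r_[X[:10], X[50:60], X[10:50], X[60:]]   # first 20 rows are the labeled ones
alpha, K = laprls_fit(X, y_lab, n_labeled=20)
print(np.sign(K @ alpha)[:20])                  # f(x_i) = sum_j alpha_j k(x_i, x_j)
```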

Face recognition using Laplacianfaces

by Xiaofei He, Shuicheng Yan, Yuxiao Hu, Partha Niyogi, Hong-Jiang Zhang - IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005
"... Abstract—We propose an appearance-based face recognition method called the Laplacianface approach. By using Locality Preserving Projections (LPP), the face images are mapped into a face subspace for analysis. Different from Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) wh ..."
Abstract - Cited by 389 (38 self)
approach with Eigenface and Fisherface methods on three different face data sets. Experimental results suggest that the proposed Laplacianface approach provides a better representation and achieves lower error rates in face recognition. Index Terms—Face recognition, principal component analysis, linear
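The core of LPP is a generalized eigenproblem built from a neighborhood graph over the training set: minimize a^T X^T L X a subject to a^T X^T D X a = 1 and keep the smallest eigenvalues. The sketch below illustrates only that step; it omits the PCA preprocessing the Laplacianface pipeline applies to face images, and the graph parameters, ridge term, and toy data are assumptions for illustration.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def lpp(X, n_components=2, n_neighbors=5, t=1.0):
    """Locality Preserving Projections (sketch): X holds samples in rows."""
    # Heat-kernel weights on a symmetrized k-NN graph.
    dist = kneighbors_graph(X, n_neighbors=n_neighbors, mode='distance').toarray()
    W = np.where(dist > 0, np.exp(-dist**2 / t), 0.0)
    W = np.maximum(W, W.T)
    D = np.diag(W.sum(axis=1))
    L = D - W
    Xc = X - X.mean(axis=0)                          # center the data
    A = Xc.T @ L @ Xc
    B = Xc.T @ D @ Xc + 1e-6 * np.eye(X.shape[1])    # small ridge for numerical stability
    # Generalized eigenproblem X^T L X a = lambda X^T D X a; keep the smallest eigenvalues.
    vals, vecs = eigh(A, B)
    return vecs[:, :n_components]

# Toy usage: project 100 random 10-D "images" to 2 dimensions.
X = np.random.default_rng(0).normal(size=(100, 10))
P = lpp(X, t=20.0)        # bandwidth chosen to match the scale of the toy data
print((X @ P).shape)      # (100, 2)
```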

Incremental Learning for Robust Visual Tracking

by David A. Ross, Jongwoo Lim, Ruei-Sung Lin, Ming-Hsuan Yang, 2008
"... Visual tracking, in essence, deals with nonstationary image streams that change over time. While most existing algorithms are able to track objects well in controlled environments, they usually fail in the presence of significant variation of the object’s appearance or surrounding illumination. On ..."
Abstract - Cited by 306 (18 self)
as shape changes or specific lighting conditions) that becomes available during tracking. In this paper, we present a tracking method that incrementally learns a low-dimensional subspace representation, efficiently adapting online to changes in the appearance of the target. The model update, based
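As a rough illustration of maintaining such an adaptive low-dimensional appearance model online, the sketch below uses scikit-learn's IncrementalPCA as a stand-in; the paper's actual update is an incremental SVD with a forgetting factor and a running mean, which IncrementalPCA does not reproduce. Patch and batch sizes are arbitrary assumptions.

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

# Incrementally maintain a low-dimensional subspace of appearance patches as new
# frames arrive, and score the newest patch by its reconstruction error.
rng = np.random.default_rng(0)
patch_dim, n_components, batch = 32 * 32, 16, 20
ipca = IncrementalPCA(n_components=n_components)

for frame_block in range(5):
    # In a real tracker these would be cropped, vectorized target patches.
    patches = rng.normal(size=(batch, patch_dim))
    ipca.partial_fit(patches)                       # update the subspace online
    z = ipca.transform(patches[-1:])
    recon = ipca.inverse_transform(z)
    err = np.linalg.norm(patches[-1] - recon)       # low error = well explained by the model
    print(f"block {frame_block}: reconstruction error {err:.2f}")
```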

Integrating Constraints and Metric Learning in Semi-Supervised Clustering

by Mikhail Bilenko, Sugato Basu, Raymond J. Mooney - In ICML, 2004
"... Semi-supervised clustering employs a small amount of labeled data to aid unsupervised learning. Previous work in the area has utilized supervised data in one of two approaches: 1) constraint-based methods that guide the clustering algorithm towards a better grouping of the data, and 2) distanc ..."
Abstract - Cited by 248 (7 self)
Semi-supervised clustering employs a small amount of labeled data to aid unsupervised learning. Previous work in the area has utilized supervised data in one of two approaches: 1) constraint-based methods that guide the clustering algorithm towards a better grouping of the data, and 2
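A stripped-down version of the constraint-based half of this idea is easy to sketch: a k-means assignment step that adds a penalty whenever a must-link or cannot-link constraint would be violated. The code below is only in the spirit of such methods (the paper's MPCK-Means additionally learns a metric); the penalty weight, greedy update order, and function name pck_assign are assumptions made here.

```python
import numpy as np

def pck_assign(X, centers, must_link, cannot_link, labels, w=1.0):
    """One constraint-aware assignment pass (sketch): greedy, point by point."""
    for i in range(X.shape[0]):
        costs = np.sum((centers - X[i]) ** 2, axis=1)
        for j, k in must_link:
            other = k if j == i else (j if k == i else None)
            if other is not None and labels[other] >= 0:
                costs += w * (np.arange(len(centers)) != labels[other])  # penalize splitting
        for j, k in cannot_link:
            other = k if j == i else (j if k == i else None)
            if other is not None and labels[other] >= 0:
                costs += w * (np.arange(len(centers)) == labels[other])  # penalize merging
        labels[i] = int(np.argmin(costs))
    return labels

# Toy usage: two clusters, one must-link and one cannot-link constraint.
rng = np.random.default_rng(0)
X = np.r_[rng.normal(size=(20, 2)), rng.normal(loc=4.0, size=(20, 2))]
centers = X[rng.choice(len(X), 2, replace=False)]
labels = -np.ones(len(X), dtype=int)
for _ in range(10):
    labels = pck_assign(X, centers, must_link=[(0, 1)], cannot_link=[(0, 39)], labels=labels)
    centers = np.array([X[labels == k].mean(axis=0) if np.any(labels == k) else centers[k]
                        for k in range(2)])
print(labels)
```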

LexRank: Graph-based lexical centrality as salience in text summarization

by Güneş Erkan, Dragomir R. Radev - Journal of Artificial Intelligence Research, 2004
"... We introduce a stochastic graph-based method for computing relative importance of textual units for Natural Language Processing. We test the technique on the problem of Text Summarization (TS). Extractive TS relies on the concept of sentence salience to identify the most important sentences in a doc ..."
Abstract - Cited by 266 (9 self)
a detailed analysis of our approach and apply it to a larger data set including data from earlier DUC evaluations. We discuss several methods to compute centrality using the similarity graph. The results show that degree-based methods (including LexRank) outperform both centroid-based methods
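The centrality computation itself is a power iteration on a thresholded sentence-similarity graph. The sketch below is an approximate reconstruction under assumptions made here: TF-IDF cosine similarity via scikit-learn, a similarity threshold of 0.1, and a PageRank-style damping factor; the paper's exact preprocessing and parameter choices differ.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def lexrank(sentences, threshold=0.1, damping=0.85, n_iter=100):
    """LexRank-style centrality: power iteration on a thresholded cosine-similarity graph."""
    tfidf = TfidfVectorizer().fit_transform(sentences)      # rows are L2-normalized
    sim = (tfidf @ tfidf.T).toarray()                        # cosine similarities
    adj = (sim >= threshold).astype(float)
    np.fill_diagonal(adj, 0.0)
    # Row-stochastic transition matrix; degree-0 rows fall back to a uniform jump.
    row_sums = adj.sum(axis=1, keepdims=True)
    P = np.where(row_sums > 0, adj / np.maximum(row_sums, 1e-12), 1.0 / len(sentences))
    scores = np.full(len(sentences), 1.0 / len(sentences))
    for _ in range(n_iter):
        scores = (1 - damping) / len(sentences) + damping * P.T @ scores
    return scores

sents = [
    "The committee approved the new budget on Monday.",
    "The budget was approved by the committee after a short debate.",
    "Officials said the budget includes funding for road repairs.",
    "A local festival attracted thousands of visitors this weekend.",
]
scores = lexrank(sents)
print(sorted(zip(scores, sents), reverse=True)[0][1])   # most central sentence
```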

On kernel target alignment

by Nello Cristianini, Jaz Kandola, Andre Elisseeff, John Shawe-Taylor - Advances in Neural Information Processing Systems 14, 2002
"... Kernel based methods are increasingly being used for data modeling because of their conceptual simplicity and outstanding performance on many tasks. However, the kernel function is often chosen using trial-and-error heuristics. In this paper we address the problem of measuring the degree of agreem ..."
Abstract - Cited by 298 (8 self)
Kernel based methods are increasingly being used for data modeling because of their conceptual simplicity and outstanding performance on many tasks. However, the kernel function is often chosen using trial-and-error heuristics. In this paper we address the problem of measuring the degree
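The agreement measure in question is the alignment between a kernel matrix K and the ideal target kernel yy^T, a normalized Frobenius inner product. A short sketch follows; the toy data and the RBF bandwidths being compared are arbitrary assumptions.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_target_alignment(K, y):
    """Alignment between kernel matrix K and the ideal kernel yy^T (y in {-1, +1})."""
    Kyy = np.outer(y, y)
    # <K, yy^T>_F / (||K||_F * ||yy^T||_F)
    return np.sum(K * Kyy) / (np.linalg.norm(K) * np.linalg.norm(Kyy))

# Toy usage: compare two RBF bandwidths on a separable two-class sample.
rng = np.random.default_rng(0)
X = np.r_[rng.normal(size=(30, 2)), rng.normal(loc=3.0, size=(30, 2))]
y = np.r_[np.ones(30), -np.ones(30)]
for gamma in (0.01, 1.0):
    A = kernel_target_alignment(rbf_kernel(X, gamma=gamma), y)
    print(f"gamma={gamma}: alignment {A:.3f}")
```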

Proximity graphs for clustering and manifold learning

by Miguel Á. Carreira-Perpiñán, Richard S. Zemel, 2005
"... Many machine learning algorithms for clustering or dimensionality reduction take as input a cloud of points in Euclidean space, and construct a graph with the input data points as vertices. This graph is then partitioned (clustering) or used to redefine metric information (dimensionality reduction). ..."
Abstract - Cited by 38 (3 self)
Many machine learning algorithms for clustering or dimensionality reduction take as input a cloud of points in Euclidean space, and construct a graph with the input data points as vertices. This graph is then partitioned (clustering) or used to redefine metric information (dimensionality reduction
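For reference, two of the most common proximity graphs built on a point cloud, the k-nearest-neighbour graph and the epsilon-ball graph, are one call each in scikit-learn; either can then be handed to a clustering or manifold-learning method. The point cloud, k, and radius below are arbitrary assumptions.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph, radius_neighbors_graph

# Build two proximity graphs over the same random point cloud.
X = np.random.default_rng(0).normal(size=(200, 3))
knn = kneighbors_graph(X, n_neighbors=8, mode='distance')     # sparse, directed k-NN graph
knn = knn.maximum(knn.T)                                      # symmetrize
eps = radius_neighbors_graph(X, radius=1.0, mode='connectivity')  # epsilon-ball graph
print(knn.nnz, eps.nnz)   # number of stored edges in each sparse graph
```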

Ridge Regression Learning Algorithm in Dual Variables

by C. Saunders, A. Gammerman, V. Vovk - In Proceedings of the 15th International Conference on Machine Learning, 1998
"... In this paper we study a dual version of the Ridge Regression procedure. It allows us to perform non-linear regression by constructing a linear regression function in a high dimensional feature space. The feature space representation can result in a large increase in the number of parameters used by ..."
Abstract - Cited by 164 (8 self)
In this paper we study a dual version of the Ridge Regression procedure. It allows us to perform non-linear regression by constructing a linear regression function in a high dimensional feature space. The feature space representation can result in a large increase in the number of parameters used
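The dual solution has the familiar kernel ridge form alpha = (K + lambda*I)^{-1} y, with predictions obtained from kernel evaluations against the training set. A minimal sketch, with the RBF kernel, regularization value, and toy regression problem chosen arbitrarily:

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_ridge_fit(X, y, lam=1.0, gamma=0.5):
    """Ridge regression in dual variables: alpha = (K + lam*I)^{-1} y."""
    K = rbf_kernel(X, X, gamma=gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_test, gamma=0.5):
    """Prediction f(x) = sum_i alpha_i k(x, x_i)."""
    return rbf_kernel(X_test, X_train, gamma=gamma) @ alpha

# Toy usage: fit a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=60)
alpha = kernel_ridge_fit(X, y, lam=0.1)
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
print(kernel_ridge_predict(X, alpha, X_test))
```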

Towards a Theoretical Foundation for Laplacian-Based Manifold Methods

by Mikhail Belkin, Partha Niyogi, 2007
"... In recent years manifold methods have attracted a considerable amount of attention in machine learning. However most algorithms in that class may be termed “manifold-motivated” as they lack any explicit theoretical guarantees. In this paper we take a step towards closing the gap between theory and p ..."
Abstract - Cited by 156 (12 self)
and practice for a class of Laplacian-based manifold methods. These methods utilize the graph Laplacian associated to a data set for a variety of applications in semi-supervised learning, clustering, data representation. We show that under certain conditions the graph Laplacian of a point cloud of data samples
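The object these methods share is the graph Laplacian L = D - W built from a point cloud, typically with heat-kernel weights. A short sketch of that construction and the associated quadratic (Dirichlet) form, with the bandwidth t and the sample chosen arbitrarily:

```python
import numpy as np
from sklearn.metrics.pairwise import euclidean_distances

# Graph Laplacian of a point cloud with Gaussian (heat-kernel) weights.
X = np.random.default_rng(0).normal(size=(100, 3))
t = 1.0
W = np.exp(-euclidean_distances(X, X, squared=True) / (4 * t))
np.fill_diagonal(W, 0.0)
D = np.diag(W.sum(axis=1))
L = D - W                      # unnormalized graph Laplacian
f = X[:, 0]                    # any function sampled at the data points
print(f @ L @ f)               # Dirichlet energy: 0.5 * sum_ij W_ij (f_i - f_j)^2
```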