Results 1–10 of 230
Semi-supervised learning by sparse representation
SIAM International Conference on Data Mining (SDM)
"... In this paper, we present a novel semi-supervised learning framework based on the ℓ1 graph. The ℓ1 graph is motivated by the observation that each datum can be reconstructed as a sparse linear superposition of the training data. The sparse reconstruction coefficients, used to deduce the weights of the directed ℓ1 graph ..."
Cited by 25 (3 self)
"... of the ℓ1 graph is derived simultaneously and in a parameter-free manner. Illuminated by the validated discriminating power of sparse representation in [16], we propose a semi-supervised learning framework based on the ℓ1 graph to utilize both labeled and unlabeled data for inference on a graph. Extensive ..."
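The construction this abstract describes, coding each sample as a sparse combination of the remaining samples and using the resulting coefficients as directed edge weights, can be illustrated in a few lines. A minimal pure-Python sketch using ISTA, a generic proximal-gradient solver for the ℓ1 problem (the paper's own solver and any weight post-processing may differ; the data and parameters below are made up for illustration):

```python
def soft(z, t):
    # soft-thresholding: the proximal operator of the l1 norm
    return max(z - t, 0.0) if z > 0 else min(z + t, 0.0)

def ista_sparse_code(x, atoms, lam=0.05, step=0.005, iters=3000):
    """Solve min_b 0.5*||x - A b||^2 + lam*||b||_1 with ISTA.
    atoms: list of column vectors forming the dictionary A."""
    n, k = len(x), len(atoms)
    b = [0.0] * k
    for _ in range(iters):
        # residual r = A b - x
        r = [sum(atoms[j][i] * b[j] for j in range(k)) - x[i] for i in range(n)]
        # gradient of the smooth part: A^T r
        g = [sum(atoms[j][i] * r[i] for i in range(n)) for j in range(k)]
        b = [soft(b[j] - step * g[j], step * lam) for j in range(k)]
    return b

def l1_graph(points, lam=0.05):
    """Directed weights: W[i][j] = |coefficient of point j when coding point i|."""
    m = len(points)
    W = [[0.0] * m for _ in range(m)]
    for i in range(m):
        idx = [j for j in range(m) if j != i]        # code i over all other points
        b = ista_sparse_code(points[i], [points[j] for j in idx], lam)
        for t, j in enumerate(idx):
            W[i][j] = abs(b[t])
    return W

# two tight clusters: each point should mainly select its own cluster mates
pts = [[1.0, 0.0], [1.1, 0.0], [5.0, 5.0], [5.1, 5.0]]
W = l1_graph(pts)
```

Because the ℓ1 penalty zeroes out weakly correlated atoms, each row of W tends to concentrate on the few samples that reconstruct the point well, which is the sparsity the abstract appeals to.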
Semi-Supervised Learning with Graphs
Carnegie Mellon University, 2005
"... In traditional machine learning approaches to classification, one uses only a labeled set to train the classifier. Labeled instances, however, are often difficult, expensive, or time-consuming to obtain, as they require the efforts of experienced human annotators. Meanwhile, unlabeled data may be relatively ..."
Cited by 112 (0 self)
"... it is of great interest both in theory and in practice. We present a series of novel semi-supervised learning approaches arising from a graph representation, where labeled and unlabeled instances are represented as vertices, and edges encode the similarity between instances. They address the following questions ..."
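The graph representation described above supports a particularly simple inference rule associated with this line of work: clamp the labeled vertices and repeatedly replace each unlabeled vertex's score with the weighted average of its neighbors, which converges to the harmonic-function solution. A toy sketch (the graph and labels here are invented for illustration):

```python
def harmonic_labels(W, labels, iters=200):
    """W: symmetric weight matrix (list of lists); labels: dict vertex -> +/-1.
    Returns a real-valued score per vertex; its sign gives the predicted class."""
    n = len(W)
    f = [labels.get(i, 0.0) for i in range(n)]
    for _ in range(iters):
        for i in range(n):
            if i in labels:              # labeled vertices stay clamped
                continue
            d = sum(W[i][j] for j in range(n))
            if d > 0:                    # weighted average of neighbor scores
                f[i] = sum(W[i][j] * f[j] for j in range(n)) / d
    return f

# path graph 0-1-2-3-4, vertex 0 labeled +1 and vertex 4 labeled -1
W = [[0, 1, 0, 0, 0],
     [1, 0, 1, 0, 0],
     [0, 1, 0, 1, 0],
     [0, 0, 1, 0, 1],
     [0, 0, 0, 1, 0]]
f = harmonic_labels(W, {0: 1.0, 4: -1.0})
```

On the path graph the solution interpolates linearly between the two labels, so the decision boundary falls at the midpoint, exactly where the graph is "farthest" from both labeled vertices.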
Manifold regularization: A geometric framework for learning from labeled and unlabeled examples
Journal of Machine Learning Research, 2006
"... We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning algorithms ..."
Cited by 578 (16 self)
Regularization and semi-supervised learning on large graphs
In COLT, 2004
"... We consider the problem of labeling a partially labeled graph. This setting may arise in a number of situations, from survey sampling to information retrieval to pattern recognition in manifold settings. It is also of potential practical importance when the data is abundant but labeling is ..."
Cited by 148 (1 self)
Semi-supervised Learning using Sparse Eigenfunction Bases
"... We present a new framework for semi-supervised learning with sparse eigenfunction bases of kernel matrices. It turns out that when the data has clustered, that is, when the high density regions are sufficiently separated by low density valleys, each high density area corresponds to a unique represen ..."
Cited by 12 (0 self)
Self-taught learning: Transfer learning from unlabeled data
Proceedings of the Twenty-fourth International Conference on Machine Learning, 2007
"... We present a new machine learning framework called “self-taught learning” for using unlabeled data in supervised classification tasks. We do not assume that the unlabeled data follows the same class labels or generative distribution as the labeled data. Thus, we would like to use a large number of unlabeled images (or audio samples, or text documents) randomly downloaded from the Internet to improve performance on a given image (or audio, or text) classification task. Such unlabeled data is significantly easier to obtain than in typical semi-supervised or transfer learning settings, making self-taught ..."
Cited by 299 (20 self)
Semi-Supervised Learning with Manifold Fitted Graphs
"... In this paper, we propose a locality-constrained and sparsity-encouraged manifold fitting approach, aiming at capturing the locally sparse manifold structure into neighborhood graph construction by exploiting a principled optimization model. The proposed model formulates neighborhood graph construction ..."
Cited by 5 (1 self)
"... and effectiveness of M-fitted graphs, we leverage graph-based semi-supervised learning as the testbed. Extensive experiments carried out on six benchmark datasets validate that the proposed M-fitted graph is superior to state-of-the-art neighborhood graphs in terms of classification accuracy using popular graph ..."
Wasserstein Propagation for Semi-Supervised Learning
"... Probability distributions and histograms are natural representations for product ratings, traffic measurements, and other data considered in many machine learning applications. Thus, this paper introduces a technique for graph-based semi-supervised learning of histograms, derived from the theory of ..."
Cited by 1 (0 self)
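For intuition on why Wasserstein distances suit histogram-valued labels: on an ordered one-dimensional support with unit bin spacing, the 1-Wasserstein (earth mover's) distance reduces to the L1 distance between cumulative distributions. This closed form is standard for the 1-D case; the paper itself propagates whole distributions over a graph, which is more involved. A small sketch:

```python
def wasserstein1d(p, q):
    """W1 between two histograms over the same ordered bins (unit spacing):
    the sum over bins of |CDF_p - CDF_q|."""
    assert abs(sum(p) - 1.0) < 1e-9 and abs(sum(q) - 1.0) < 1e-9
    cp = cq = total = 0.0
    for a, b in zip(p, q):
        cp += a            # running CDF of p
        cq += b            # running CDF of q
        total += abs(cp - cq)
    return total

d_near = wasserstein1d([1.0, 0.0, 0.0], [0.0, 1.0, 0.0])  # mass moves one bin
d_far = wasserstein1d([1.0, 0.0, 0.0], [0.0, 0.0, 1.0])   # mass moves two bins
# d_near == 1.0 and d_far == 2.0: the distance grows with how far mass travels,
# whereas total variation is identical for both pairs
```

This geometry-awareness is what makes Wasserstein distances attractive for the ordered label spaces (ratings, measurements) mentioned in the abstract.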
On the effectiveness of Laplacian normalization for graph semi-supervised learning
Journal of Machine Learning Research
"... This paper investigates the effect of Laplacian normalization in graph-based semi-supervised learning. To this end, we consider multi-class transductive learning on graphs with Laplacian regularization. Generalization bounds are derived using geometric properties of the graph. Specifically, by introducing ..."
Cited by 12 (0 self)
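The two Laplacians compared in analyses like this one can be written down directly: the unnormalized L = D - W and the symmetrically normalized L_sym = D^{-1/2} L D^{-1/2}, where D is the diagonal degree matrix. A small sketch, assuming strictly positive degrees (the example graph is invented for illustration):

```python
import math

def laplacians(W):
    """Return (L, L_sym) for a weight matrix W with positive degrees.
    L = D - W;  L_sym = D^{-1/2} L D^{-1/2}."""
    n = len(W)
    deg = [sum(row) for row in W]                       # diagonal of D
    L = [[(deg[i] if i == j else 0.0) - W[i][j]
          for j in range(n)] for i in range(n)]
    Lsym = [[L[i][j] / math.sqrt(deg[i] * deg[j])
             for j in range(n)] for i in range(n)]
    return L, Lsym

# star graph: hub 0 joined to leaves 1, 2, 3
W = [[0, 1, 1, 1],
     [1, 0, 0, 0],
     [1, 0, 0, 0],
     [1, 0, 0, 0]]
L, Lsym = laplacians(W)
```

The difference the paper studies is visible even here: rows of L sum to zero and its diagonal carries the raw degrees, while L_sym has a unit diagonal, so high-degree vertices no longer dominate the regularizer.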
Semi-Supervised Learning with Spectral Graph Wavelets
"... We consider the transductive learning problem when the labels belong to a continuous space. Through the use of spectral graph wavelets, we explore the benefits of multiresolution analysis on a graph constructed from the labeled and unlabeled data. The spectral graph wavelets behave like discrete mul ..."