Results 11 - 20 of 11,273
Multi-label linear discriminant analysis
- In ECCV
"... Multi-label problems arise frequently in image and video annotations, and many other related applications such as multi-topic text categorization, music classification, etc. Like other computer vision tasks, multi-label image and video annotations also suffer from the difficulty of high dimensionality because images often have a large number of features. Linear discriminant analysis (LDA) is a well-known method for dimensionality reduction. However, classical LDA only works for single-label multi-class classifications and cannot be directly applied to multi-label ..."
Cited by 16 (10 self)
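The classical, single-label LDA that the abstract says cannot be applied directly to multi-label data can be sketched in a few lines: build within-class and between-class scatter matrices, then project onto the leading eigenvectors of their ratio. This is a minimal illustration on synthetic data, not the paper's multi-label method; all data and names here are illustrative.

```python
# Minimal sketch of classical (single-label) LDA dimensionality
# reduction: project X onto the top eigenvectors of pinv(Sw) @ Sb.
import numpy as np

def lda_projection(X, y, n_components):
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * diff @ diff.T
    # Eigenproblem of pinv(Sw) @ Sb (pseudo-inverse for stability)
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(-vals.real)[:n_components]
    W = vecs[:, order].real
    return X @ W

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 1.0, (20, 5)) for m in (0.0, 3.0, 6.0)])
y = np.repeat([0, 1, 2], 20)
Z = lda_projection(X, y, n_components=2)
print(Z.shape)  # (60, 2)
```

Note that `y` holds exactly one class per instance, which is precisely the single-label restriction the paper sets out to lift.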
LEARNING FROM MULTI-LABEL DATA
, 2009
"... This volume contains research papers accepted for presentation at the 1st International Workshop on Learning from Multi-Label Data (MLD'09), which will be held in Bled, Slovenia, on September 7, 2009 in conjunction with ECML/PKDD 2009. MLD'09 is devoted to multi-label learning, which is an emerging ..."
"... , such as classification, ranking, semi-supervised learning, active learning, multi-instance learning, dimensionality reduction, etc. Initial attempts on multi-label learning date back to 1999 with works on multi-label text categorization. In recent years, the task of learning from multi-label data has been addressed by a ..."
Cited by 11 (2 self)
Semi-Supervised Dimension Reduction for Multi-label Classification
"... A significant challenge in making learning techniques more suitable for general purpose use in AI is to move beyond i) complete supervision, ii) low dimensional data and iii) a single label per instance. Solving this challenge would allow making predictions for high dimensional large datasets with multiple (but possibly incomplete) labelings. While other work has addressed each of these problems separately, in this paper we show how to address them together, namely the problem of semi-supervised dimension reduction for multi-labeled classification, SSDR-MC. To our knowledge this is the first paper ..."
Cited by 2 (1 self)
Mulan: A Java Library for Multi-Label Learning
- Journal of Machine Learning Research
, 2012
"... MULAN is a Java library for learning from multi-label data. It offers a variety of classification, ranking, thresholding and dimensionality reduction algorithms, as well as algorithms for learning from hierarchically structured labels. In addition, it contains an evaluation framework that calculates ..."
Cited by 52 (4 self)
Think Globally, Fit Locally: Unsupervised Learning of Low Dimensional Manifolds
- Journal of Machine Learning Research
, 2003
"... The problem of dimensionality reduction arises in many fields of information processing, including machine learning, data compression, scientific visualization, pattern recognition, and neural computation. ..."
Cited by 385 (10 self)
Multi-Label Informed Latent Semantic Indexing
, 2005
"... Latent semantic indexing (LSI) is a well-known unsupervised approach for dimensionality reduction in information retrieval. However, if the output information (i.e. category labels) is available, it is often beneficial to derive the indexing not only based on the inputs but also on the target values ..."
Cited by 52 (2 self)
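The plain unsupervised LSI that this paper extends with label information is just a truncated SVD of a term-document matrix. A minimal sketch on a toy matrix (the values are illustrative, not from the paper):

```python
# Plain (unsupervised) LSI: truncated SVD of a term-document matrix.
# Documents are represented by their coordinates in the latent space.
import numpy as np

def lsi(term_doc, k):
    U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
    # Document i in latent space: its row of V scaled by singular values
    return Vt[:k].T * s[:k]

A = np.array([[2., 0., 1.],
              [0., 3., 1.],
              [1., 1., 0.],
              [0., 2., 2.]])   # 4 terms x 3 documents (toy counts)
docs_2d = lsi(A, k=2)
print(docs_2d.shape)  # (3, 2)
```

The paper's point is that this decomposition looks only at the inputs `A`; the label-informed variant also couples the projection to the category targets.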
Locality Preserving Projections
- In Neural Information Processing Systems
, 2004
"... Many problems in information processing involve some form of dimensionality reduction. In this paper, we introduce Locality Preserving Projections (LPP). These are linear projective maps that arise by solving a variational problem that optimally preserves the neighborhood structure of the ..."
Cited by 414 (16 self)
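The variational problem the abstract refers to reduces to a generalized eigenproblem on a neighborhood graph: build heat-kernel weights W over nearest neighbors, form the graph Laplacian L = D - W, and take the eigenvectors of X^T L X relative to X^T D X with the smallest eigenvalues. A minimal NumPy sketch on random data; the neighbor count and kernel width are illustrative choices, not values from the paper:

```python
# Sketch of Locality Preserving Projections: nearest-neighbor graph,
# heat-kernel weights, then the smallest generalized eigenvectors of
# (X^T L X, X^T D X) give the linear projection.
import numpy as np

def lpp(X, n_neighbors=3, n_components=2, t=1.0):
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:n_neighbors + 1]       # skip self
        W[i, idx] = np.exp(-d2[i, idx] / t)              # heat kernel
    W = np.maximum(W, W.T)                               # symmetrize graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                            # graph Laplacian
    A, B = X.T @ L @ X, X.T @ D @ X
    vals, vecs = np.linalg.eig(np.linalg.pinv(B) @ A)    # generalized problem
    order = np.argsort(vals.real)[:n_components]         # smallest eigenvalues
    return X @ vecs[:, order].real

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 5))
Z = lpp(X)
print(Z.shape)  # (30, 2)
```

Because the map is linear (a matrix applied to X), it extends to unseen points directly, which is what distinguishes LPP from purely graph-based embeddings.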
A Growing Neural Gas Network Learns Topologies
- Advances in Neural Information Processing Systems 7
, 1995
"... An incremental network model is introduced which is able to learn the important topological relations in a given set of input vectors by means of a simple Hebb-like learning rule. In contrast to previous approaches like the "neural gas" method of Martinetz and Schulten (1991, 1994), this m ..."
"... data is available but no information on the desired output. What can the goal of learning be in this situation? One possible objective is dimensionality reduction: finding a low-dimensional subspace of the input vector space containing most or all of the input data. Linear subspaces with this property ..."
Cited by 401 (5 self)
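The "linear subspaces containing most or all of the input data" that the snippet mentions are exactly what PCA recovers; a minimal sketch on toy rank-deficient data (this is the standard baseline the snippet alludes to, not the growing-neural-gas model itself):

```python
# PCA as a linear-subspace finder: center the data, take the SVD, and
# project onto the top-k right singular vectors (principal axes).
import numpy as np

def pca(X, k):
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T          # coordinates in the k-dim subspace

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 6))  # rank-2 data in 6-d
Z = pca(X, k=2)
print(Z.shape)  # (50, 2)
```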
Multi-Label Sparse Coding for Automatic Image Annotation
"... In this paper, we present a multi-label sparse coding framework for feature extraction and classification within the context of automatic image annotation. First, each image is encoded into a so-called supervector, derived from the universal Gaussian Mixture Models on orderless image patches. Then, a label sparse coding based subspace learning algorithm is derived to effectively harness multi-label information for dimensionality reduction. Finally, the sparse coding method for multi-label data is proposed to propagate the multi-labels of the training images to the query image with the sparse ℓ ..."
Cited by 28 (2 self)
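The propagation step in the snippet can be sketched as: reconstruct a query feature as a sparse combination of training features, then transfer labels with the same weights. This is a hedged illustration using a simple ISTA solver for the ℓ1 problem; the solver choice, data, and parameter values are assumptions, not the paper's exact formulation.

```python
# Sparse reconstruction of a query over a training dictionary, then
# label propagation through the reconstruction weights.
import numpy as np

def ista(D, x, lam=0.1, n_iter=500):
    """Minimize 0.5||D a - x||^2 + lam ||a||_1 by iterative shrinkage."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = a - (D.T @ (D @ a - x)) / L    # gradient step
        a = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold
    return a

rng = np.random.default_rng(4)
D = rng.normal(size=(10, 6))               # 6 training images, 10-d features
Y = rng.integers(0, 2, size=(6, 4))        # their 4-label indicator matrix
x = D[:, 1] + 0.05 * rng.normal(size=10)   # query near training image 1
a = ista(D, x)
scores = Y.T @ np.maximum(a, 0)            # propagate labels via weights
print(scores.shape)  # (4,)
```

The sparsity ensures the query inherits labels from only a handful of visually similar training images rather than from the whole set.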
Multi-label Prediction via Sparse Infinite CCA
"... Canonical Correlation Analysis (CCA) is a useful technique for modeling dependencies between two (or more) sets of variables. Building upon the recently suggested probabilistic interpretation of CCA, we propose a nonparametric, fully Bayesian framework that can automatically select the number of correlation components, and effectively capture the sparsity underlying the projections. In addition, given (partially) labeled data, our algorithm can also be used as a (semi-)supervised dimensionality reduction technique, and can be applied to learn useful predictive features in the context of learning a set ..."
Cited by 24 (2 self)
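The classical, non-Bayesian CCA that this paper builds on can be computed with an SVD of the whitened cross-covariance between the two views. A minimal sketch on synthetic two-view data with a shared latent signal (the data and the small regularizer are illustrative):

```python
# Classical CCA: whiten each view's covariance, SVD the whitened
# cross-covariance; singular values are the canonical correlations.
import numpy as np

def cca(X, Y, k, reg=1e-6):
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    n = X.shape[0]
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n

    def inv_sqrt(C):                       # inverse matrix square root
        vals, vecs = np.linalg.eigh(C)
        return vecs @ np.diag(vals ** -0.5) @ vecs.T

    M = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    U, s, Vt = np.linalg.svd(M)
    A = inv_sqrt(Cxx) @ U[:, :k]           # projection for view X
    B = inv_sqrt(Cyy) @ Vt[:k].T           # projection for view Y
    return Xc @ A, Yc @ B, s[:k]           # variates + correlations

rng = np.random.default_rng(2)
Z = rng.normal(size=(100, 2))              # shared latent signal
X = Z @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(100, 5))
Y = Z @ rng.normal(size=(2, 4)) + 0.1 * rng.normal(size=(100, 4))
Xp, Yp, corrs = cca(X, Y, k=2)
print(Xp.shape, Yp.shape)
```

Here `k` is fixed by hand; the paper's contribution is precisely to make the number of components and the sparsity of `A` and `B` part of a Bayesian model instead.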