Results 1–7 of 7
Marginal semi-supervised sub-manifold projections with informative constraints for dimensionality reduction and recognition
Neural Networks, 2012
Abstract

Cited by 6 (2 self)
Abstract — In this work, the sub-manifold-projection-based semi-supervised dimensionality reduction (DR) problem, learning from partially constrained data, is discussed. Two semi-supervised DR algorithms, termed Marginal Semi-Supervised Sub-Manifold Projections (MS3MP) and orthogonal MS3MP (OMS3MP), are proposed. MS3MP in the singular case is also discussed, and we present the weighted least-squares view of MS3MP. Based on specifying the types of neighborhoods with pairwise constraints (PC) and the defined manifold scatters, our methods can preserve the local properties of all points and the discriminant structures embedded in the localized PC. The sub-manifolds of different classes can also be separated. In PC-guided methods, selecting the informative constraints is challenging, and random constraint subsets significantly affect the performance of the algorithms. This paper also introduces an effective technique to select informative constraints for DR with consistent constraints. The analytic form of the projection axes can be obtained by eigen-decomposition. The connections between this work and other related work are also elaborated. The validity of the proposed constraint-selection approach and DR algorithms is evaluated on benchmark problems. Extensive simulations show that our algorithms deliver promising results over some widely used state-of-the-art semi-supervised DR techniques.
Index Terms — semi-supervised learning, marginal projections, dimensionality reduction, informative constraints, image recognition
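The abstract notes that the projection axes follow analytically from an eigen-decomposition of scatter matrices. The paper's specific MS3MP scatters are not reproduced in this listing, so the sketch below only illustrates the generic pattern: solving a generalized eigenproblem S_t w = λ S_p w with placeholder scatter matrices (all names and the toy data are illustrative, not the authors' formulation).

```python
import numpy as np

def projection_axes(S_target, S_penalty, k):
    """Top-k axes of the generalized eigenproblem S_target w = lam * S_penalty w."""
    # Whiten with S_penalty, then take a symmetric eigendecomposition.
    vals_p, vecs_p = np.linalg.eigh(S_penalty)
    inv_sqrt = vecs_p @ np.diag(1.0 / np.sqrt(vals_p)) @ vecs_p.T
    vals, vecs = np.linalg.eigh(inv_sqrt @ S_target @ inv_sqrt)
    # eigh returns ascending eigenvalues; keep the largest k.
    return inv_sqrt @ vecs[:, ::-1][:, :k]

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
B = rng.standard_normal((5, 5))
S_t = A @ A.T                      # toy "preserve" scatter (PSD)
S_p = B @ B.T + 1e-3 * np.eye(5)   # toy "suppress" scatter, regularized
W = projection_axes(S_t, S_p, 2)   # two projection axes, shape (5, 2)
```

Each returned column is an exact generalized eigenvector, so the projection maximizes the ratio of the target scatter to the penalty scatter along each axis.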
Efficient Dimensionality Reduction for Canonical Correlation Analysis
Abstract

Cited by 3 (1 self)
We present a fast algorithm for approximate Canonical Correlation Analysis (CCA). Given a pair of tall-and-thin matrices, the proposed algorithm first employs a randomized dimensionality reduction transform to reduce the size of the input matrices, and then applies any standard CCA algorithm to the new pair of matrices. The algorithm computes an approximate CCA of the original pair of matrices with provable guarantees, while requiring asymptotically fewer operations than the state-of-the-art exact algorithms.
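The abstract's sketch-and-solve pattern can be shown in a few lines. The paper's specific randomized transform is not given in this listing, so a plain Gaussian sketch stands in for it, and the "standard CCA" step is the usual QR-based computation of canonical correlations:

```python
import numpy as np

def cca_correlations(A, B):
    # Standard exact CCA: the canonical correlations are the singular
    # values of Qa^T Qb, where A = Qa Ra and B = Qb Rb are thin QRs.
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    return np.linalg.svd(Qa.T @ Qb, compute_uv=False)

rng = np.random.default_rng(1)
n, s = 5000, 1000                                 # tall input, sketch size
A = rng.standard_normal((n, 4))
B = A[:, :3] + 0.5 * rng.standard_normal((n, 3))  # correlated with A

# Sketch-and-solve: one shared random projection shrinks both
# tall-and-thin matrices; any standard CCA then runs on the small pair.
S = rng.standard_normal((s, n)) / np.sqrt(s)      # Gaussian stand-in sketch
exact = cca_correlations(A, B)
approx = cca_correlations(S @ A, S @ B)
```

With a sketch size well above the combined column dimension, the sketched correlations track the exact ones closely while the expensive factorizations run on s rows instead of n.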
Efficient Implementations of Dimensionality Reduction Algorithms
Abstract
• One of the challenges in multi-label learning is how to capture the correlation among different labels in dimensionality reduction.
• Multi-label learning from high-dimensional data suffers from the curse of dimensionality.
• For large-scale problems, one of the key challenges is how to perform dimensionality reduction efficiently.
Contributions
• The design and analysis of dimensionality reduction algorithms for multi-label learning:
• Proposed a novel multi-label dimensionality reduction algorithm called hypergraph spectral learning (HSL), which uses a hypergraph to capture label correlations (KDD'08).
• Extended Canonical Correlation Analysis (CCA) and elucidated key properties of CCA (TPAMI'11).
• Proposed two efficient algorithms to solve a class of dimensionality reduction algorithms: a direct least-squares approach (ICML'09) and a two-stage approach (KDD'10).
Hypergraph Spectral Learning
What is a hypergraph?
1. A hypergraph is a generalization of the traditional graph.
2. In a hypergraph, each hyperedge is a non-empty subset of the vertex set.
3. Intuitively, data points sharing many common labels tend to be close to each other in the embedded space.
4. CCA can be shown to be a special case of HSL.
Projection onto 1-dimensional space using HSL
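HSL's precise formulation is not included in this listing; the sketch below assumes the common normalized hypergraph Laplacian with unit hyperedge weights, where each label induces one hyperedge over the points that carry it, and the spectral embedding comes from the Laplacian's bottom eigenvectors:

```python
import numpy as np

def hypergraph_laplacian(Y):
    # Y: n x k binary label matrix; each label forms one hyperedge
    # containing every point annotated with that label.
    H = Y.astype(float)                   # vertex-hyperedge incidence matrix
    d_v = np.maximum(H.sum(axis=1), 1.0)  # vertex degrees
    d_e = np.maximum(H.sum(axis=0), 1.0)  # hyperedge degrees
    Dv = np.diag(1.0 / np.sqrt(d_v))
    De = np.diag(1.0 / d_e)
    # Normalized Laplacian: L = I - Dv^{-1/2} H De^{-1} H^T Dv^{-1/2}
    return np.eye(H.shape[0]) - Dv @ H @ De @ H.T @ Dv

Y = np.array([[1, 0], [1, 1], [0, 1], [0, 1]])  # 4 points, 2 labels
L = hypergraph_laplacian(Y)
vals, vecs = np.linalg.eigh(L)  # ascending eigenvalues
embedding = vecs[:, :1]         # 1-dimensional spectral embedding
```

Points sharing labels sit in the same hyperedges and are pulled together in the embedding, which is the intuition item 3 above describes.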
Dimensionality reduction · Pairwise constraints · Locality preservation
Abstract
... To keep the intrinsic proximity relations of inter-class and intra-class similarity pairs, the localized pairwise Cannot-Link and Must-Link constraints are applied to specify the types of those neighboring pairs. By utilizing the CMLP criterion, margins between inter- and intra-class clusters are ...
Pattern Recognition 45 (2012) 4466–4493
Learning from normalized local and global discriminative information for semi-supervised regression and
Abstract
Learning from normalized local and global discriminative information for semi-supervised regression and
Soft label
Abstract
Highlights
• A label propagation procedure is proposed to handle outliers and datasets with multi-density distributions.
• A soft-label-based LDA (SL-LDA) is proposed to handle the out-of-sample problem of label propagation.
• A fast solution for solving SL-LDA is presented, based on a weighted and regularized least squares formulation.
• A flexible version of SL-LDA is proposed to better cope with data sampled from a nonlinear manifold.
• Simulation results show the efficiency and effectiveness of the proposed methods.