Results 1–10 of 26
Transductive Learning via Spectral Graph Partitioning
In ICML, 2003
Cited by 243 (0 self)
We present a new method for transductive learning, which can be seen as a transductive version of the k nearest-neighbor classifier.
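The inductive baseline this method generalizes, a plain k-nearest-neighbor vote, can be sketched in a few lines (a toy illustration assuming Euclidean distance; `knn_predict` is a hypothetical name, and this is not the paper's spectral graph-partitioning algorithm):

```python
from collections import Counter
import math

def knn_predict(labeled, test_point, k=3):
    """Majority vote among the k labeled points nearest to test_point.
    labeled: list of (point, label) pairs; points are coordinate tuples."""
    nearest = sorted(labeled, key=lambda pl: math.dist(pl[0], test_point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# In the transductive setting all unlabeled test points are known up front,
# so predictions could also be refined jointly rather than one at a time.
labeled = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((5, 6), "b")]
print(knn_predict(labeled, (0.5, 0.5)))  # -> a
```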
Surrogate-Assisted Evolutionary Optimization Frameworks for High-Fidelity Engineering Design Problems
In Knowledge Incorporation in Evolutionary Computation, 2004
Cited by 26 (6 self)
Over the last decade, Evolutionary Algorithms (EAs) have emerged as a powerful paradigm for global optimization of multimodal functions. More recently, there has been significant interest in applying EAs to engineering design problems. However, in many complex engineering design problems where high-fidelity analysis models are used, each function evaluation may require a Computational Structural Mechanics (CSM), Computational Fluid Dynamics (CFD), or Computational Electromagnetics (CEM) simulation costing minutes to hours of supercomputer time. Since EAs typically require thousands of function evaluations to locate a near-optimal solution, their use often becomes computationally prohibitive for this class of problems. In this paper, we present frameworks that employ surrogate models for solving computationally expensive optimization problems on a limited computational budget. In particular, the key factors responsible for the success of these frameworks are discussed. Experimental results obtained on benchmark test functions and real-world complex design problems are presented.
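The loop such frameworks follow — evaluate the expensive simulator sparsely, fit a cheap surrogate, search the surrogate for the next point worth a true evaluation — can be sketched roughly as below. Assumptions: a one-dimensional toy objective stands in for the CSM/CFD/CEM simulation, an inverse-distance interpolant stands in for the surrogate models the paper discusses, and all names are hypothetical.

```python
import random

def expensive_f(x):
    """Stand-in for a costly high-fidelity simulation."""
    return (x - 2.0) ** 2

def idw_surrogate(samples):
    """Cheap inverse-distance-weighted interpolant over evaluated points."""
    def predict(x):
        num = den = 0.0
        for xi, yi in samples:
            w = 1.0 / ((x - xi) ** 2 + 1e-9)
            num += w * yi
            den += w
        return num / den
    return predict

def surrogate_optimize(f, lo, hi, budget=10, candidates=200):
    """Spend `budget` true evaluations, searching the surrogate between them."""
    random.seed(0)  # deterministic for the sketch
    samples = [(x, f(x)) for x in (lo, (lo + hi) / 2, hi)]  # initial design
    while len(samples) < budget:
        model = idw_surrogate(samples)
        # optimize the cheap surrogate, not the expensive function
        cand = min((random.uniform(lo, hi) for _ in range(candidates)), key=model)
        samples.append((cand, f(cand)))  # one true evaluation per iteration
    return min(samples, key=lambda s: s[1])
```

The thousands of candidate evaluations hit only the cheap model; the simulator is reserved for the ten points the surrogate deems most promising, which is the point of surrogate assistance.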
On Transductive Regression, 2006
Cited by 22 (1 self)
In many modern large-scale learning applications, the amount of unlabeled data far exceeds that of labeled data. A common instance of this problem is the transductive setting, where the unlabeled test points are known to the learning algorithm. This paper presents a study of regression problems in that setting. It presents explicit VC-dimension error bounds for transductive regression that hold for all bounded loss functions and coincide with the tight classification bounds of Vapnik when applied to classification. It also presents a new transductive regression algorithm, inspired by our bound, that admits a primal and kernelized closed-form solution and deals efficiently with large amounts of unlabeled data. The algorithm exploits the position of unlabeled points to locally estimate their labels and then uses a global optimization to ensure robust predictions. Our study also includes the results of experiments with several publicly available regression data sets with up to 20,000 unlabeled examples. The comparison with other transductive regression algorithms shows that it performs well and that it can scale to large data sets.
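The two-stage idea — local estimation of unlabeled labels followed by a global adjustment — might be caricatured as follows. This is a hypothetical toy, not the paper's algorithm: the local step averages the k nearest labeled targets, and simple shrinkage toward the global mean stands in for the paper's global optimization.

```python
import math

def transductive_regress(labeled, unlabeled, k=2, alpha=0.5):
    """labeled: list of (point, target); unlabeled: list of points."""
    def local_estimate(x):
        # step 1: locally estimate from the k nearest labeled points
        nearest = sorted(labeled, key=lambda pt: math.dist(pt[0], x))[:k]
        return sum(y for _, y in nearest) / k

    estimates = {x: local_estimate(x) for x in unlabeled}
    # step 2: a crude "global" pass shrinking estimates toward their mean
    mean = sum(estimates.values()) / len(estimates)
    return {x: (1 - alpha) * y + alpha * mean for x, y in estimates.items()}
```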
Stability of transductive regression algorithms
In ICML, 2008
Cited by 16 (1 self)
This paper uses the notion of algorithmic stability to derive novel generalization bounds for several families of transductive regression algorithms, using both convexity and closed-form solutions. Our analysis helps compare the stability of these algorithms. It suggests that several existing algorithms might not be stable, but prescribes a technique to make them stable. It also reports the results of experiments with local transductive regression demonstrating the benefit of our stability bounds for model selection, in particular for determining the radius of the local neighborhood used by the algorithm.
Margin-based transductive graph cuts using linear programming
In Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics (AISTATS 2007), 2007
Cited by 11 (7 self)
This paper studies the problem of inferring a partition (or a graph cut) of an undirected deterministic graph where the labels of some nodes are observed, thereby bridging a gap between graph theory and probabilistic inference techniques. Given a weighted graph, we focus on weighted-neighbor rules to predict the label of a particular node. A maximum-margin and maximal-average-margin argument is used to prove a generalization bound, which is subsequently related to the classical MINCUT approach. From a practical perspective, a simple, intuitive, and efficient convex formulation is constructed. This scheme can readily be implemented as a linear program that scales well up to a few thousand (labeled or unlabeled) data points. The extremal case, where one observes only a single label, is also studied and related to the task of unsupervised clustering.
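On a toy graph, the objective being relaxed — find the smallest-weight cut consistent with the observed labels — can be checked by exhaustive search. This illustrates the MINCUT objective only; the paper's contribution is the margin-based linear-programming relaxation that scales where enumeration cannot, and the names below are hypothetical.

```python
from itertools import product

def min_cut_partition(weights, observed):
    """Exhaustively find the 0/1 labeling that minimizes total cut weight
    while agreeing with the observed labels.
    weights: {(u, v): w}; observed: {node: 0 or 1}."""
    nodes = sorted({n for edge in weights for n in edge} | set(observed))
    free = [n for n in nodes if n not in observed]
    best = None
    for bits in product((0, 1), repeat=len(free)):
        labels = dict(observed, **dict(zip(free, bits)))
        cut = sum(w for (u, v), w in weights.items() if labels[u] != labels[v])
        if best is None or cut < best[0]:
            best = (cut, labels)
    return best

# a weighted path a-b-c-d with the two endpoints observed
weights = {("a", "b"): 3.0, ("b", "c"): 1.0, ("c", "d"): 3.0}
cut, labels = min_cut_partition(weights, {"a": 0, "d": 1})
```

The cheapest consistent cut severs only the light b–c edge, assigning b to a's side and c to d's side.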
Transductive ranking via pairwise regularized least-squares
In Workshop on Mining and Learning with Graphs (MLG’07), 2007
Cited by 3 (2 self)
Ranking data points with respect to a given preference criterion is an example of a preference learning task. Tasks of this kind are often considered as classification problems, where the training set is composed of data point pairs, in …
Transductive Gaussian process regression with automatic model selection
In Proceedings of the 17th European Conference on Machine Learning, 2006
Cited by 3 (1 self)
In contrast to the standard inductive inference setting of predictive machine learning, in real-world learning problems the test instances are often already available at training time. Transductive inference tries to improve the predictive accuracy of learning algorithms by making use of the information contained in these test instances. Although this description of transductive inference applies to predictive learning problems in general, most transductive approaches consider only the case of classification. In this paper we introduce a transductive variant of Gaussian process regression with automatic model selection, based on approximate moment matching between training and test data. Empirical results show the feasibility and competitiveness of this approach.
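Moment matching itself is a simple idea: make low-order moments of one sample agree with another's. A first- and second-moment version on scalars is shown below, purely as illustration — the paper's matching enters the Gaussian process model selection, not raw values like this, and `moment_match` is a hypothetical name.

```python
import statistics

def moment_match(train, test):
    """Affine-map `test` so its mean and variance equal those of `train`."""
    m_train, s_train = statistics.mean(train), statistics.pstdev(train)
    m_test, s_test = statistics.mean(test), statistics.pstdev(test)
    scale = s_train / (s_test or 1.0)  # guard against zero spread
    return [m_train + (x - m_test) * scale for x in test]
```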
Theory and Algorithms for Modern Problems in Machine Learning and an Analysis of Markets, 2008
Transductive Regression Piloted by Inter-Manifold Relations
Cited by 2 (0 self)
In this paper, we present a novel semi-supervised regression algorithm working on multi-class data that may lie on multiple manifolds. Unlike conventional manifold regression algorithms, which do not consider the class distinction of samples, our method introduces the class information into the regression process and tries to exploit the similar configurations shared by the label distributions of multi-class data. To utilize the correlations among data from different classes, we develop a cross-manifold label propagation process and employ labels from different classes to enhance the regression performance. The inter-class relations are coded by a set of inter-manifold graphs, and a regularization term is introduced to impose inter-class smoothness on the possible solutions. In addition, the algorithm is further extended with the kernel trick to predict labels of out-of-sample data even without class information. Experiments on both synthetic data and real-world problems validate the effectiveness of the proposed framework for semi-supervised regression.
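The propagation step at the heart of such methods is easy to state in code. Below is a generic clamped neighbor-averaging pass on a single graph (a hypothetical illustration; the paper's scheme propagates across a set of inter-manifold graphs under an inter-class smoothness regularizer, which this toy omits):

```python
def propagate_labels(adj, observed, iters=100):
    """Iteratively average real-valued labels over graph neighbors,
    clamping observed nodes.
    adj: {node: [neighbors]}; observed: {node: value}."""
    values = {n: observed.get(n, 0.0) for n in adj}
    for _ in range(iters):
        values = {
            n: observed[n] if n in observed
            else sum(values[m] for m in nbrs) / len(nbrs)
            for n, nbrs in adj.items()
        }
    return values

# a three-node chain: both ends observed, the middle label inferred
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
result = propagate_labels(adj, {"a": 0.0, "c": 1.0})
```

On the chain, the unobserved middle node converges to the average of its two clamped neighbors.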
The Regression Model of Machine Translation, 2011
Cited by 2 (0 self)
for fulfillment of the requirements for the degree of