Results 1 - 10 of 115
Statistical Analysis of Semi-Supervised Regression
"... Semi-supervised methods use unlabeled data in addition to labeled data to construct predictors. While existing semi-supervised methods have shown some promising empirical performance, their development has been based largely based on heuristics. In this paper we study semi-supervised learning from t ..."
Abstract - Cited by 42 (1 self)
Semi-supervised regression with order preferences
, 2006
"... Following a discussion on the general form of regularization for semi-supervised learning, we propose a semi-supervised regression algorithm. It is based on the assumption that we have certain order preferences on unlabeled data (e.g., point x1 has a larger target value than x2). Semi-supervised lea ..."
Abstract - Cited by 11 (1 self)
learning consists of enforcing the order preferences as regularization in a risk minimization framework. The optimization problem can be effectively solved by a linear program. Experiments show that the proposed semi-supervised regression outperforms standard regression.
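The snippet above describes encoding order preferences on unlabeled points as regularization in a risk-minimization problem solved by a linear program. Below is a minimal sketch of that idea, assuming a linear model with an L1 training loss and soft preference constraints; the function name, the LP encoding, and the solver choice are illustrative, not taken from the paper.

```python
# Hypothetical sketch: fit a linear model on labeled data with an L1 loss and
# penalize violations of order preferences (point i should score >= point j)
# on unlabeled pairs, all expressed as a single linear program.
import numpy as np
from scipy.optimize import linprog

def order_preference_regression(X, y, pref_pairs, X_unlabeled, C=1.0):
    """X: (n, d) labeled inputs, y: (n,) targets.
    pref_pairs: list of (i, j) meaning unlabeled point i should score >= point j.
    Returns (w, b) of a linear predictor."""
    n, d = X.shape
    m = len(pref_pairs)
    # Decision vector: [w (d), b (1), e (n) labeled abs-error slacks, xi (m) preference slacks]
    n_vars = d + 1 + n + m
    c = np.zeros(n_vars)
    c[d + 1: d + 1 + n] = 1.0   # sum of labeled absolute errors
    c[d + 1 + n:] = C           # weighted sum of preference violations

    A_ub, b_ub = [], []
    for i in range(n):
        # w.x_i + b - e_i <= y_i
        row = np.zeros(n_vars)
        row[:d] = X[i]; row[d] = 1.0; row[d + 1 + i] = -1.0
        A_ub.append(row); b_ub.append(y[i])
        # -(w.x_i + b) - e_i <= -y_i
        row = np.zeros(n_vars)
        row[:d] = -X[i]; row[d] = -1.0; row[d + 1 + i] = -1.0
        A_ub.append(row); b_ub.append(-y[i])
    # Order preferences: w.(x_i - x_j) + xi_k >= 0  ->  -w.(x_i - x_j) - xi_k <= 0
    for k, (i, j) in enumerate(pref_pairs):
        row = np.zeros(n_vars)
        row[:d] = -(X_unlabeled[i] - X_unlabeled[j])
        row[d + 1 + n + k] = -1.0
        A_ub.append(row); b_ub.append(0.0)

    bounds = [(None, None)] * (d + 1) + [(0, None)] * (n + m)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.x[:d], res.x[d]
```

Each preference pair contributes one slack variable and one constraint, so the program grows linearly with the number of order preferences.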
Semi-supervised model selection based on cross-validation
- In International Joint Conference on Neural Networks
, 2006
"... Abstract — We propose a new semi-supervised model selection method that is derived by applying the structural risk minimization principle to a recent semi-supervised generalization error bound. This bound that we build on is based on the crossvalidation estimate underlying the popular cross-validati ..."
Abstract - Cited by 1 (0 self)
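As a rough illustration of applying structural risk minimization to a cross-validation estimate, the sketch below scores each candidate model by its k-fold CV error plus a complexity term and keeps the minimizer. The penalty here is only a generic placeholder; the specific semi-supervised bound from the paper is not reproduced, and all names are illustrative.

```python
# Hedged sketch of SRM-style model selection driven by a cross-validation
# estimate: choose the candidate minimizing CV error plus a complexity penalty.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def srm_select(X, y, candidate_Cs, n_unlabeled=0, k=5):
    """Pick the SVM regularization constant minimizing CV error plus a crude
    confidence term that shrinks as more (labeled + unlabeled) data are available."""
    best_C, best_score = None, np.inf
    n = len(y) + n_unlabeled
    for i, C in enumerate(candidate_Cs):
        cv_err = 1.0 - cross_val_score(SVC(C=C), X, y, cv=k).mean()
        penalty = np.sqrt(np.log(i + 2) / n)   # placeholder complexity term
        if cv_err + penalty < best_score:
            best_C, best_score = C, cv_err + penalty
    return best_C
```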
Regression Models for Ordinal Data: A Machine Learning Approach
, 1999
"... In contrast to the standard machine learning tasks of classification and metric regression we investigate the problem of predicting variables of ordinal scale, a setting referred to as ordinal regression. The task of ordinal regression arises frequently in the social sciences and in information retr ..."
Abstract - Cited by 19 (4 self)
of Structural Risk Minimization as employed in Support Vector Machines we derive a new learning algorithm based on large margin rank boundaries for the task of ordinal regression. Our method is easily extended to nonlinear utility functions. We give experimental results for an Information Retrieval task
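One way to read "large margin rank boundaries" is: learn a utility function from pairwise rank comparisons with a max-margin classifier, then cut its score axis into ordinal bins. The sketch below follows that reading with scikit-learn's LinearSVC; the pairing scheme and the threshold placement are illustrative assumptions, not the paper's exact algorithm.

```python
# Hedged sketch: ordinal regression via a pairwise ranking SVM plus thresholds.
import numpy as np
from itertools import combinations
from sklearn.svm import LinearSVC

def fit_ordinal(X, ranks):
    ranks = np.asarray(ranks)
    # Difference vectors for every pair of examples with different ranks.
    diffs, signs = [], []
    for i, j in combinations(range(len(ranks)), 2):
        if ranks[i] != ranks[j]:
            diffs.append(X[i] - X[j])
            signs.append(1 if ranks[i] > ranks[j] else -1)
    svm = LinearSVC(fit_intercept=False).fit(np.array(diffs), np.array(signs))
    w = svm.coef_.ravel()                       # learned utility direction
    scores = X @ w
    levels = np.sort(np.unique(ranks))
    # Rank boundaries: midpoints between mean scores of adjacent rank groups.
    means = [scores[ranks == r].mean() for r in levels]
    thresholds = np.sort([(means[k] + means[k + 1]) / 2
                          for k in range(len(levels) - 1)])
    return w, thresholds, levels

def predict_ordinal(X, w, thresholds, levels):
    # Map each utility score to the ordinal level of the bin it falls into.
    return levels[np.searchsorted(thresholds, X @ w)]
```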
Using graph-based metrics with empirical risk minimization to speed up active learning on networked data
- Proceedings of the Fifteenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
, 2009
"... Active and semi-supervised learning are important techniques when labeled data are scarce. Recently a method was suggested for combining active learning with a semi-supervised learning algorithm that uses Gaussian fields and harmonic functions. This classifier is relational in nature: it relies on h ..."
Abstract - Cited by 19 (1 self)
on having the data presented as a partially labeled graph (also known as a within-network learning problem). This work showed yet again that empirical risk minimization (ERM) was the best method to find the next instance to label and provided an efficient way to compute ERM with the semi-supervised
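For context, a naive version of ERM query selection with the harmonic-function classifier looks roughly as follows: re-solve the harmonic system for every candidate query and label, and pick the node with the smallest expected risk. The paper's contribution is making this efficient with graph-based metrics; that speed-up is not shown here, and all names are illustrative.

```python
# Hedged sketch: ERM-based active learning on a partially labeled graph with
# the Gaussian-fields / harmonic-function classifier (binary labels in {0, 1}).
import numpy as np

def harmonic_solution(W, labeled_idx, y_labeled, unlabeled_idx):
    """W: (n, n) symmetric affinity matrix; returns P(label = 1) for unlabeled nodes."""
    L = np.diag(W.sum(axis=1)) - W                      # graph Laplacian
    Luu = L[np.ix_(unlabeled_idx, unlabeled_idx)]
    Lul = L[np.ix_(unlabeled_idx, labeled_idx)]
    return np.linalg.solve(Luu, -Lul @ y_labeled)

def erm_query(W, labeled_idx, y_labeled, unlabeled_idx):
    """Pick the unlabeled node (index lists, labels as a float array) whose
    labeling is expected to minimize the estimated risk."""
    f = harmonic_solution(W, labeled_idx, y_labeled, unlabeled_idx)
    best_node, best_risk = None, np.inf
    for pos, node in enumerate(unlabeled_idx):
        risk = 0.0
        for label, prob in ((1.0, f[pos]), (0.0, 1.0 - f[pos])):
            new_labeled = list(labeled_idx) + [node]
            new_y = np.append(y_labeled, label)
            new_unlabeled = [u for u in unlabeled_idx if u != node]
            f_new = harmonic_solution(W, new_labeled, new_y, new_unlabeled)
            # Estimated 0/1 risk: each remaining node contributes min(f, 1 - f).
            risk += prob * np.minimum(f_new, 1.0 - f_new).sum()
        if risk < best_risk:
            best_node, best_risk = node, risk
    return best_node
```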
Stochastic Dominance-based Rough Set Model for Ordinal Classification
"... In order to discover interesting patterns and dependencies in data, an approach based on rough set theory can be used. In particular, Dominance-based Rough Set Approach (DRSA) has been introduced to deal with the problem of ordinal classification with monotonicity constraints (also referred to as mu ..."
Abstract - Cited by 2 (2 self)
the equivalence of the variable consistency rough sets to the specific empirical risk-minimizing decision rule in statistical decision theory.
Multi-class Semi-supervised SVMs with Positiveness Exclusive Regularization
"... In this work, we address the problem of multi-class clas-sification problem in semi-supervised setting. A regularized multi-task learning approach is presented to train multi-ple binary-class Semi-Supervised Support Vector Machines (S3VMs) using the one-vs-rest strategy within a joint frame-work. A ..."
Abstract
classifiers. That is, we expect an exclusive relationship among different S3VMs for evaluating the same unlabeled sample. We propose to use an ℓ1,2-norm regularizer as an implementation of PER. The objective of our approach is to minimize an empirical risk regularized by a PER term and a manifold
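A small illustration of the positiveness-exclusive idea: penalize unlabeled samples for which several one-vs-rest classifiers respond positively at once. The norm convention below (L1 over classes, squared L2 over samples) is one common reading of an ℓ1,2-style exclusive penalty and may differ from the paper's exact definition; all names are illustrative.

```python
# Hedged sketch of a positiveness-exclusive penalty on one-vs-rest outputs.
import numpy as np

def per_penalty(scores):
    """scores: (n_unlabeled, n_classes) decision values of the one-vs-rest S3VMs.
    Returns a scalar that grows when several classes are positive for the same sample."""
    positive_part = np.maximum(scores, 0.0)     # only positive responses count
    per_sample_l1 = positive_part.sum(axis=1)   # L1 over classes, per sample
    return np.square(per_sample_l1).sum()       # squared L2 over samples

# One sample claimed positive by two classifiers is penalized more heavily
# than two samples each claimed by a single classifier.
print(per_penalty(np.array([[0.8, 0.7, -0.3]])))                 # (0.8 + 0.7) ** 2
print(per_penalty(np.array([[0.8, -0.1, -0.3],
                            [-0.2, 0.7, -0.3]])))                # 0.8 ** 2 + 0.7 ** 2
```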
Adaptive Memory-Based Regression Methods
- In Proceedings of the 1998 IEEE International Joint Conference on Neural Networks
, 1998
"... Memory-based methods obtain accurate predictions from empirical data without explicitly modeling the underlying process. For each query, a local model is first tailored on the query itself, then used to perform the prediction, and finally discarded. In this paper, we consider local models which are ..."
Abstract - Cited by 5 (3 self)
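The query-local workflow in this abstract maps directly onto locally weighted regression: gather the nearest stored examples, fit a small weighted least-squares model, predict, and discard the model. The sketch below assumes a Gaussian kernel and a fixed neighborhood size; the paper's adaptive choice of these parameters is not reproduced, and the names are illustrative.

```python
# Hedged sketch of memory-based (locally weighted) regression for one query.
import numpy as np

def locally_weighted_predict(X_train, y_train, x_query, k=20, tau=1.0):
    """Fit a weighted least-squares model on the k nearest neighbours of x_query,
    predict for the query, and let the local model be garbage-collected."""
    d = np.linalg.norm(X_train - x_query, axis=1)
    nn = np.argsort(d)[:k]                                # k nearest stored examples
    w = np.exp(-(d[nn] ** 2) / (2 * tau ** 2))            # Gaussian kernel weights
    A = np.hstack([X_train[nn], np.ones((len(nn), 1))])   # add intercept column
    # Weighted normal equations: (A^T W A) beta = A^T W y
    Aw = A * w[:, None]
    beta, *_ = np.linalg.lstsq(Aw.T @ A, Aw.T @ y_train[nn], rcond=None)
    return np.append(x_query, 1.0) @ beta
```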