
CiteSeerX

Results 1 - 10 of 562

Training Linear SVMs in Linear Time

by Thorsten Joachims, 2006
"... Linear Support Vector Machines (SVMs) have become one of the most prominent machine learning techniques for high-dimensional sparse data commonly encountered in applications like text classification, word-sense disambiguation, and drug design. These applications involve a large number of examples n ..."
Cited by 549 (6 self)
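
As an illustration of the setting this entry describes (high-dimensional sparse data with many examples), and not of Joachims' cutting-plane algorithm itself, the sketch below fits scikit-learn's LinearSVC to synthetic sparse data. The data, dimensions, and parameters are all assumptions made for the example.

```python
# Sketch of the setting only, not the paper's cutting-plane solver:
# a linear SVM fit to high-dimensional, sparse, text-like data.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.svm import LinearSVC

rng = np.random.RandomState(0)
n, d = 5000, 20000                       # many examples, many sparse features
X = sparse_random(n, d, density=0.001, format="csr", random_state=rng)
w_true = rng.randn(d)                    # hidden linear concept (synthetic)
y = np.where(X @ w_true >= 0, 1, -1)

clf = LinearSVC(C=1.0)                   # liblinear-backed linear SVM
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```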

Mixing Linear SVMs for Nonlinear Classification

by Zhouyu Fu, Antonio Robles-Kelly, Jun Zhou
"... In this paper, we address the problem of combining linear Support Vector Machines (SVMs) for classification of large-scale non-linear data sets. The motivation is to exploit both the efficiency of linear SVMs in learning and prediction and the power of non-linear SVMs in classification. To ..."
Cited by 9 (1 self)
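
As a rough illustration of combining linear SVMs for nonlinear data, the sketch below uses hard k-means gating with one LinearSVC per region. This is a simplified stand-in, not the authors' soft-gated mixture model; the dataset, cluster count, and parameters are assumptions.

```python
# Simplified stand-in for a mixture of linear SVMs: hard k-means gating,
# one linear SVM per region (the paper's soft-gated mixture is more refined).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons
from sklearn.svm import LinearSVC

X, y = make_moons(n_samples=1000, noise=0.15, random_state=0)
k = 4                                            # assumed number of local experts
gate = KMeans(n_clusters=k, random_state=0, n_init=10).fit(X)
experts = []
for c in range(k):
    mask = gate.labels_ == c
    if len(np.unique(y[mask])) < 2:              # degenerate region: store majority label
        experts.append(int(np.bincount(y[mask]).argmax()))
    else:
        experts.append(LinearSVC(C=1.0).fit(X[mask], y[mask]))

def predict(Xq):
    regions = gate.predict(Xq)
    out = np.empty(len(Xq), dtype=int)
    for c in range(k):
        m = regions == c
        if not m.any():
            continue
        e = experts[c]
        out[m] = e if isinstance(e, int) else e.predict(Xq[m])
    return out

print("training accuracy:", (predict(X) == y).mean())
```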

COFFIN: A Computational Framework for Linear SVMs

by Soeren Sonnenburg, Vojtěch Franc, 2010
"... In a variety of applications, kernel machines such as Support Vector Machines (SVMs) have been used with great success often delivering state-of-the-art results. Using the kernel trick, they work on several domains and even enable heterogeneous data fusion by concatenating feature spaces or multiple kernel learning. Unfortunately, they are not suited for truly large-scale applications since they suffer from the curse of supporting vectors, i.e., the speed of applying SVMs decays linearly with the number of support vectors. In this paper we develop COFFIN — a new training strategy for linear ..."
Cited by 21 (0 self)
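
The "curse of supporting vectors" mentioned above refers to kernel-SVM prediction cost growing with the number of support vectors, whereas a linear SVM predicts with a single dot product. The sketch below is a generic demonstration of that point (not the COFFIN framework): for a linear kernel, the dual expansion collapses into one primal weight vector. The dataset and parameters are assumptions.

```python
# Why linear SVMs avoid the per-support-vector prediction cost:
# for a linear kernel, f(x) = sum_i alpha_i y_i <x_i, x> + b collapses to <w, x> + b.
# Generic demonstration, not the COFFIN framework from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
svc = SVC(kernel="linear", C=1.0).fit(X, y)
print("support vectors kept by the kernel machine:", svc.support_vectors_.shape[0])

# Collapse the dual expansion into a single primal weight vector w.
w = svc.dual_coef_ @ svc.support_vectors_          # shape (1, n_features)
b = svc.intercept_

f_kernel = svc.decision_function(X)                # cost grows with the number of SVs
f_linear = X @ w.ravel() + b                       # one dot product per example
print("max abs difference:", np.abs(f_kernel - f_linear).max())
```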

Large Scale Semi-supervised Linear SVMs

by Vikas Sindhwani, et al., 2006
"... Large scale learning is often realistic only in a semi-supervised setting where a small set of labeled examples is available together with a large collection of unlabeled data. In many information retrieval and data mining applications, linear classifiers are strongly preferred because of their ease ..."
Cited by 75 (9 self)
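
As a generic illustration of the semi-supervised setting described above (few labels, many unlabeled points), the sketch below runs plain self-training with a linear SVM: pseudo-label the most confident unlabeled points and refit. This is not the paper's semi-supervised SVM formulation; the labeled-set size, confidence threshold, and other parameters are assumptions.

```python
# Generic self-training with a linear SVM: a stand-in for the semi-supervised
# setting (small labeled set, large unlabeled pool), not the paper's algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=5000, n_features=50, random_state=0)
rng = np.random.RandomState(0)
labeled = rng.choice(len(X), size=100, replace=False)      # small labeled set
unlabeled = np.setdiff1d(np.arange(len(X)), labeled)

X_l, y_l = X[labeled], y[labeled]
X_u = X[unlabeled]

for _ in range(5):                                          # a few self-training rounds
    if len(X_u) == 0:
        break
    clf = LinearSVC(C=1.0).fit(X_l, y_l)
    margin = clf.decision_function(X_u)
    confident = np.abs(margin) > 1.0                        # assumed confidence threshold
    if not confident.any():
        break
    X_l = np.vstack([X_l, X_u[confident]])
    y_l = np.concatenate([y_l, (margin[confident] > 0).astype(int)])
    X_u = X_u[~confident]

print("final labeled-set size:", len(X_l))
print("accuracy on all data:", clf.score(X, y))
```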

Newton Methods for Fast Solution of Semisupervised Linear SVMs

by Vikas Sindhwani, S. Sathiya Keerthi
"... In this chapter, we present a family of semi-supervised linear support vector classifiers that are designed to handle partially-labeled sparse datasets with possibly very large number of examples and features. At their core, our algorithms employ recently developed Modified Finite Newton techniques. ..."
Abstract - Cited by 7 (0 self) - Add to MetaCart

Anomaly Detection in Computer Networks Using Linear SVMs

by Carolina Fortuna, Blaž Fortuna, Mihael Mohorčič
"... Modern computer networks are subject to various malicious attacks. Since attacks are becoming more sophisticated and networks are becoming larger there is a need for an efficient intrusion detection systems (IDSs) that can distinguish between legitimate and illegitimate traffic and be able to signal ..."
Abstract - Cited by 1 (0 self) - Add to MetaCart
to signal attacks in real time, before serious damages are produced. In this paper we use linear support vector machines (SVMs) for detecting abnormal traffic patterns in the KDD Cup 1999 data. The IDS system is supposed to distinguish normal traffic from intrusions and to classify the intrusions into four

A Modified Finite Newton Method for Fast Solution of Large Scale Linear SVMs

by S. Sathiya Keerthi, Dennis DeCoste, Thorsten Joachims - Journal of Machine Learning Research, 2005
"... This paper develops a fast method for solving linear SVMs with L2 loss function that is suited for large scale data mining tasks such as text classification. This is done by modifying the finite Newton method of Mangasarian in several ways. Experiments indicate that the method is much faster than de ..."
Cited by 109 (8 self)
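
For reference, the L2-loss (squared hinge) linear SVM objective that such finite Newton methods minimize can be written as below. This is a standard textbook form, so the regularization constants and notation may differ from the paper's own convention.

```latex
% Primal L2-loss (squared hinge) linear SVM; one common convention.
\min_{w}\; \frac{1}{2}\,\lVert w \rVert^{2}
  \;+\; C \sum_{i=1}^{n} \max\bigl(0,\; 1 - y_{i}\, w^{\top} x_{i}\bigr)^{2}
```

Because the squared hinge is differentiable and piecewise quadratic, a (generalized) Newton step can be taken over the currently active examples, which is the kind of iteration the abstract refers to.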

A Sequential Dual Method for Large Scale Multi-Class Linear SVMs

by S. Sathiya Keerthi, Kai-Wei Chang, et al., 2008
"... Efficient training of direct multi-class formulations of linear Support Vector Machines is very useful in applications such as text classification with a huge number of examples as well as features. This paper presents a fast dual method for this training. The main idea is to sequentially traverse thro ..."
Cited by 40 (8 self)
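
As a hedged pointer: scikit-learn's LinearSVC exposes a direct (Crammer-Singer style) multi-class formulation through its liblinear backend, which belongs to the same family of direct multi-class linear SVMs discussed above, although the backend's solver is not necessarily the paper's sequential dual method. The dataset and parameters below are assumptions.

```python
# Direct multi-class linear SVM (Crammer-Singer formulation) via liblinear.
# The backend solver may differ from the paper's sequential dual method.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=3000, n_features=100, n_informative=20,
                           n_classes=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = LinearSVC(multi_class="crammer_singer", C=1.0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```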

Why do linear SVMs trained on HOG features perform so well?

by Hilton Bristow, Simon Lucey
"... ..."
Abstract - Add to MetaCart
Abstract not found

Cutting-Plane Training of Structural SVMs

by Thorsten Joachims, Thomas Finley, Chun-Nam John Yu, 2007
"... Discriminative training approaches like structural SVMs have shown much promise for building highly complex and accurate models in areas like natural language processing, protein structure prediction, and information retrieval. However, current training algorithms are computationally expensive or intractable on large datasets. To overcome this bottleneck, this paper explores how cutting-plane methods can provide fast training not only for classification SVMs, but also for structural SVMs. In particular, we show that in an equivalent “1-slack” reformulation of the linear SVM training problem, our ..."
Cited by 321 (10 self)
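
For context on the "1-slack" reformulation mentioned in the snippet, the block below is a reconstruction from memory of the margin-rescaling 1-slack structural SVM QP; the notation (joint feature map Ψ, loss Δ) follows common structural-SVM convention and may differ slightly from the paper.

```latex
% 1-slack structural SVM (margin rescaling), reconstructed up to notation:
% a single slack variable is shared across one constraint per joint labeling.
\min_{w,\;\xi \ge 0} \; \frac{1}{2}\lVert w \rVert^{2} + C\,\xi
\quad \text{s.t.} \quad
\forall (\bar{y}_{1},\dots,\bar{y}_{n}) \in \mathcal{Y}^{n}:\;
\frac{1}{n}\, w^{\top} \sum_{i=1}^{n}
  \bigl[\Psi(x_{i}, y_{i}) - \Psi(x_{i}, \bar{y}_{i})\bigr]
\;\ge\; \frac{1}{n} \sum_{i=1}^{n} \Delta(y_{i}, \bar{y}_{i}) \;-\; \xi
```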