Results 1 - 10 of 777,461

Large Margin Classification Using the Perceptron Algorithm

by Yoav Freund, Robert E. Schapire - Machine Learning, 1998
"... We introduce and analyze a new algorithm for linear classification which combines Rosenblatt 's perceptron algorithm with Helmbold and Warmuth's leave-one-out method. Like Vapnik 's maximal-margin classifier, our algorithm takes advantage of data that are linearly separable with large ..."
Abstract - Cited by 518 (2 self) - Add to MetaCart
We introduce and analyze a new algorithm for linear classification which combines Rosenblatt 's perceptron algorithm with Helmbold and Warmuth's leave-one-out method. Like Vapnik 's maximal-margin classifier, our algorithm takes advantage of data that are linearly separable
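For context, a minimal sketch of the perceptron's additive update that this paper builds on, together with the simple averaging variant the authors discuss as a cheap practical alternative to full voting; the toy data and fixed number of epochs are illustrative assumptions, not the paper's experimental setup.

```python
# Minimal sketch of the perceptron's additive update, plus the averaging
# variant. The toy data and the fixed number of epochs are illustrative
# assumptions, not the paper's experimental setup.
import numpy as np

def averaged_perceptron(X, y, epochs=10):
    """X: (n_samples, n_features); y: labels in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    w_sum, b_sum, count = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in range(n):
            if y[i] * (X[i] @ w + b) <= 0:    # mistake: apply the additive update
                w = w + y[i] * X[i]
                b = b + y[i]
            w_sum += w                         # accumulate for the average
            b_sum += b
            count += 1
    return w_sum / count, b_sum / count

if __name__ == "__main__":
    X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
    y = np.array([1, 1, -1, -1])
    w, b = averaged_perceptron(X, y)
    print(np.sign(X @ w + b))                  # expected: [ 1.  1. -1. -1.]
```

The full voted perceptron keeps every intermediate weight vector and lets each one vote on the prediction; averaging the vectors, as above, approximates that at much lower cost.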

WordNet: An on-line lexical database

by George A. Miller, Richard Beckwith, Christiane Fellbaum, Derek Gross, Katherine Miller - International Journal of Lexicography, 1990
"... WordNet is an on-line lexical reference system whose design is inspired by current ..."
Abstract - Cited by 1945 (9 self) - Add to MetaCart
WordNet is an on-line lexical reference system whose design is inspired by current
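For readers who want to poke at the database itself, the snippet below queries WordNet through NLTK's corpus reader; it assumes the nltk package is installed and is independent of the original on-line system described in the paper.

```python
# Exploring the WordNet database through NLTK's corpus reader. Assumes the
# nltk package is installed; the download call fetches the wordnet data files.
import nltk
nltk.download("wordnet", quiet=True)
from nltk.corpus import wordnet as wn

for synset in wn.synsets("bank")[:3]:
    print(synset.name(), "-", synset.definition())
    print("  lemmas:   ", [lemma.name() for lemma in synset.lemmas()])
    print("  hypernyms:", [h.name() for h in synset.hypernyms()])
```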

The Perceptron: A Probabilistic Model for Information Storage and Organization in The Brain

by F. Rosenblatt - Psychological Review, 1958
"... If we are eventually to understand the capability of higher organisms for perceptual recognition, generalization, recall, and thinking, we must first have answers to three fundamental questions: 1. How is information about the physical world sensed, or detected, by the biological system? 2. In what ..."
Abstract - Cited by 1143 (0 self) - Add to MetaCart
If we are eventually to understand the capability of higher organisms for perceptual recognition, generalization, recall, and thinking, we must first have answers to three fundamental questions: 1. How is information about the physical world sensed, or detected, by the biological system? 2. In what form is information stored, or remembered? 3. How does information contained in storage, or in memory, influence recognition and behavior? The first of these questions is in the

A training algorithm for optimal margin classifiers

by Bernhard E. Boser, et al. - Proceedings of the 5th Annual ACM Workshop on Computational Learning Theory, 1992
"... A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classifiaction functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjust ..."
Abstract - Cited by 1848 (44 self) - Add to MetaCart
A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classifiaction functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters
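The paper's training procedure solves a constrained quadratic program, the forerunner of the support vector machine. As a loose illustration of the same large-margin objective, and explicitly not the paper's method, the sketch below minimizes a regularized hinge loss by subgradient descent; the toy data, step size, and regularization constant are assumptions for the example.

```python
# Illustrative max-margin training via subgradient descent on a regularized
# hinge loss. This is NOT the quadratic-programming algorithm of the paper;
# it is only a sketch of the "maximize the margin" objective on toy data.
import numpy as np

def hinge_svm(X, y, lam=0.01, lr=0.05, epochs=200):
    """Minimize lam/2 * ||w||^2 + mean(max(0, 1 - y * (Xw + b))); y in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1                           # points violating the margin
        grad_w, grad_b = lam * w, 0.0
        if active.any():
            grad_w = grad_w - (y[active, None] * X[active]).sum(axis=0) / n
            grad_b = -y[active].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

if __name__ == "__main__":
    X = np.array([[2.0, 2.0], [1.5, 3.0], [-2.0, -1.5], [-1.0, -2.5]])
    y = np.array([1, 1, -1, -1])
    w, b = hinge_svm(X, y)
    print("margins:", y * (X @ w + b))                 # all roughly >= 1
```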

Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with Perceptron Algorithms

by Michael Collins, 2002
"... We describe new algorithms for training tagging models, as an alternative to maximum-entropy models or conditional random fields (CRFs). The algorithms rely on Viterbi decoding of training examples, combined with simple additive updates. We describe theory justifying the algorithms through a modific ..."
Abstract - Cited by 641 (16 self) - Add to MetaCart
We describe new algorithms for training tagging models, as an alternative to maximum-entropy models or conditional random fields (CRFs). The algorithms rely on Viterbi decoding of training examples, combined with simple additive updates. We describe theory justifying the algorithms through a
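A rough sketch of the update the abstract describes: decode each training sentence with the current weights, then add the gold-sequence features and subtract the predicted-sequence features. The indicator features, the brute-force decoder standing in for Viterbi, and the toy sentences are all illustrative assumptions, not the paper's setup.

```python
# Sketch of the perceptron-style update for tagging: decode with the current
# weights, then add gold features and subtract predicted features. A brute-force
# search stands in for Viterbi here; features, tagset, and data are toy choices.
from collections import defaultdict
from itertools import product

def features(words, tags):
    """Count (tag, word) emission and (prev_tag, tag) transition indicators."""
    f = defaultdict(int)
    prev = "<s>"
    for w, t in zip(words, tags):
        f[("emit", t, w)] += 1
        f[("trans", prev, t)] += 1
        prev = t
    return f

def decode(words, weights, tagset):
    """Argmax over tag sequences (exhaustive; Viterbi would do this efficiently)."""
    best, best_score = None, float("-inf")
    for tags in product(tagset, repeat=len(words)):
        score = sum(weights[k] * v for k, v in features(words, tags).items())
        if score > best_score:
            best, best_score = list(tags), score
    return best

def train(data, tagset, epochs=5):
    weights = defaultdict(float)
    for _ in range(epochs):
        for words, gold in data:
            pred = decode(words, weights, tagset)
            if pred != gold:                           # additive update on mistakes
                for k, v in features(words, gold).items():
                    weights[k] += v
                for k, v in features(words, pred).items():
                    weights[k] -= v
    return weights

if __name__ == "__main__":
    data = [(["the", "dog", "barks"], ["DET", "NOUN", "VERB"]),
            (["a", "cat", "sleeps"], ["DET", "NOUN", "VERB"])]
    w = train(data, tagset=["DET", "NOUN", "VERB"])
    print(decode(["the", "cat", "barks"], w, ["DET", "NOUN", "VERB"]))
```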

Large margin methods for structured and interdependent output variables

by Ioannis Tsochantaridis, Thorsten Joachims, Thomas Hofmann, Yasemin Altun - Journal of Machine Learning Research, 2005
"... Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary ..."
Abstract - Cited by 612 (12 self)
to accomplish this, we propose to appropriately generalize the well-known notion of a separation margin and derive a corresponding maximum-margin formulation. While this leads to a quadratic program with a potentially prohibitive, i.e. exponential, number of constraints, we present a cutting plane algorithm

Very simple classification rules perform well on most commonly used datasets

by Robert C. Holte - Machine Learning, 1993
"... The classification rules induced by machine learning systems are judged by two criteria: their classification accuracy on an independent test set (henceforth "accuracy"), and their complexity. The relationship between these two criteria is, of course, of keen interest to the machin ..."
Abstract - Cited by 542 (5 self) - Add to MetaCart
The classification rules induced by machine learning systems are judged by two criteria: their classification accuracy on an independent test set (henceforth "accuracy"), and their complexity. The relationship between these two criteria is, of course, of keen interest

Gaussian processes for machine learning

by Carl Edward Rasmussen - in: Adaptive Computation and Machine Learning, 2006
"... Abstract. We give a basic introduction to Gaussian Process regression models. We focus on understanding the role of the stochastic process and how it is used to define a distribution over functions. We present the simple equations for incorporating training data and examine how to learn the hyperpar ..."
Abstract - Cited by 631 (2 self) - Add to MetaCart
the hyperparameters using the marginal likelihood. We explain the practical advantages of Gaussian Process and end with conclusions and a look at the current trends in GP work. Supervised learning in the form of regression (for continuous outputs) and classification (for discrete outputs) is an important constituent
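The "simple equations" the abstract refers to are the Gaussian-conditioning formulas for the predictive mean and variance; a minimal sketch with a squared-exponential kernel is below, where the kernel hyperparameters, noise level, and toy data are illustrative assumptions.

```python
# Minimal Gaussian Process regression sketch: squared-exponential kernel,
# noisy 1-D observations, and the standard predictive mean/variance formulas.
# Kernel hyperparameters, noise level, and toy data are illustrative choices.
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, signal_var=1.0):
    """k(x, x') = signal_var * exp(-(x - x')^2 / (2 * lengthscale^2))."""
    sq = (a[:, None] - b[None, :]) ** 2
    return signal_var * np.exp(-0.5 * sq / lengthscale ** 2)

def gp_predict(X_train, y_train, X_test, noise_var=0.1):
    K = rbf_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = rbf_kernel(X_test, X_train)
    K_ss = rbf_kernel(X_test, X_test)
    mean = K_s @ np.linalg.solve(K, y_train)              # K_* K^{-1} y
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)          # K_** - K_* K^{-1} K_*^T
    return mean, np.diag(cov)

if __name__ == "__main__":
    X_train = np.array([-2.0, -1.0, 0.0, 1.5])
    y_train = np.sin(X_train)
    X_test = np.linspace(-3.0, 3.0, 7)
    mean, var = gp_predict(X_train, y_train, X_test)
    for x, m, v in zip(X_test, mean, var):
        print(f"x={x:+.2f}  mean={m:+.3f}  std={np.sqrt(v):.3f}")
```

Learning the hyperparameters, as the abstract notes, would then mean maximizing the log marginal likelihood -1/2 (y^T K^-1 y + log|K| + n log 2*pi) with respect to the lengthscale, signal variance, and noise variance.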

Rules, discretion, and reputation in a model of monetary policy

by Robert J. Barro, David B. Gordon - Journal of Monetary Economics, 1983
"... In a discretionary regime the monetary authority can print more money and create more inflation than people expect. But, although these inflation surprises can have some benefits, they cannot arise systematically in equilibrium when people understand the policymakor's incentives and form their ..."
Abstract - Cited by 794 (9 self)
the policymaker and the private agents, it is possible that reputational forces can substitute for formal rules. Here, we develop an example of a reputational equilibrium where the outcomes turn out to be weighted averages of those from discretion and those from the ideal rule. In particular, the rates

Machine Learning in Automated Text Categorization

by Fabrizio Sebastiani - ACM Computing Surveys, 2002
"... The automated categorization (or classification) of texts into predefined categories has witnessed a booming interest in the last ten years, due to the increased availability of documents in digital form and the ensuing need to organize them. In the research community the dominant approach to this p ..."
Abstract - Cited by 1658 (22 self) - Add to MetaCart
to this problem is based on machine learning techniques: a general inductive process automatically builds a classifier by learning, from a set of preclassified documents, the characteristics of the categories. The advantages of this approach over the knowledge engineering approach (consisting in the manual
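As a minimal, hedged illustration of the inductive process the survey describes (learn a classifier from preclassified documents, then label new text), the sketch below uses scikit-learn's bag-of-words features and a naive Bayes learner; the library, categories, and made-up documents are assumptions for the example, not part of the survey.

```python
# Toy illustration of the machine-learning approach to text categorization:
# learn a classifier from preclassified documents, then label new text.
# Uses scikit-learn; the documents and categories are made up for the example.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

train_docs = [
    "the striker scored a late goal",           # sports
    "the team won the championship match",      # sports
    "the central bank raised interest rates",   # finance
    "markets fell after the earnings report",   # finance
]
train_labels = ["sports", "sports", "finance", "finance"]

vectorizer = CountVectorizer()                  # bag-of-words features
X_train = vectorizer.fit_transform(train_docs)

classifier = MultinomialNB()                    # simple inductive learner
classifier.fit(X_train, train_labels)

X_new = vectorizer.transform(["the bank reported strong earnings"])
print(classifier.predict(X_new))                # expected: ['finance']
```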