Results 1–10 of 396,679
Smooth Boosting and Learning with Malicious Noise
Journal of Machine Learning Research, 2003
"... We describe a new boosting algorithm which generates only smooth distributions which do not assign too much weight to any single example. We show that this new boosting algorithm can be used to construct efficient PAC learning algorithms which tolerate relatively high rates of malicious noise. In pa ..."
Cited by 55 (5 self)
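The capped-weight idea in this abstract can be sketched in a few lines. This is a hypothetical weighting step, assuming exponential weights on margins and a cap of 1/(ε·m) per example; it illustrates "smoothness" but is not the paper's actual update rule:

```python
import numpy as np

def smooth_distribution(margins, eps, eta=0.5):
    """Hypothetical smooth-weighting step: exponential weights on the
    margins, clipped so no example holds more than 1/(eps * m) of the mass."""
    m = len(margins)
    cap = 1.0 / (eps * m)
    d = np.exp(-eta * np.asarray(margins, dtype=float))
    d /= d.sum()
    clipped = np.zeros(m, dtype=bool)
    for _ in range(m):  # clip and renormalize until no example exceeds the cap
        over = (d > cap) & ~clipped
        if not over.any():
            break
        clipped |= over
        d[clipped] = cap
        free = ~clipped
        if free.any():
            # redistribute the remaining mass over the unclipped examples
            d[free] *= (1.0 - cap * clipped.sum()) / d[free].sum()
    return d
```

A distribution built this way assigns at most 1/(ε·m) probability to any single example, which is what limits the damage a malicious adversary can do by corrupting a few examples.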
Smooth Boosting and Linear Threshold Learning with Malicious Noise
"... We describe a PAC algorithm for learning linear threshold functions when some fraction of the examples used for learning are generated and labeled by an omniscient malicious adversary. The algorithm has complexity bounds similar to the classical Perceptron algorithm but can tolerate a substantial ..."
Cited by 1 (0 self)
noise-tolerant weak learning algorithm, our algorithm can learn successfully despite higher rates of malicious noise than previous approaches. Supported in part by an NSF Graduate Fellowship and by NSF grant CCR-9504436. 1 Introduction Learning an unknown linear threshold function from labeled
Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning, 1989
"... learning with malicious noise and the closure ..."
Noise Trader Risk in Financial Markets
Journal of Political Economy, 1990
"... We present a simple overlapping generations model of an asset market in which irrational noise traders with erroneous stochastic beliefs both affect prices and earn higher expected returns. The unpredictability of noise traders ’ beliefs creates a risk in the price of the asset that deters rational ..."
Cited by 858 (23 self)
Calibrating noise to sensitivity in private data analysis
In Proceedings of the 3rd Theory of Cryptography Conference, 2006
"... Abstract. We continue a line of research initiated in [10, 11] on privacypreserving statistical databases. Consider a trusted server that holds a database of sensitive information. Given a query function f mapping databases to reals, the socalled true answer is the result of applying f to the datab ..."
Cited by 630 (57 self)
to the database. To protect privacy, the true answer is perturbed by the addition of random noise generated according to a carefully chosen distribution, and this response, the true answer plus noise, is returned to the user. Previous work focused on the case of noisy sums, in which f = Σᵢ g(xᵢ), where xᵢ denotes
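The mechanism described here (noise calibrated to the query's sensitivity) is commonly instantiated with Laplace noise of scale sensitivity/ε. A minimal sketch, with the function name and parameters chosen for illustration:

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Return the true answer plus Laplace noise of scale sensitivity/epsilon,
    the calibration that yields epsilon-differential privacy for this query."""
    rng = np.random.default_rng() if rng is None else rng
    return true_answer + rng.laplace(loc=0.0, scale=sensitivity / epsilon)
```

For a counting query (sensitivity 1) answered with ε = 0.5, the added noise has scale 2, i.e. standard deviation √2·2 ≈ 2.83; smaller ε means stronger privacy and noisier answers.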
Multitask Learning
Machine Learning, 1997
"... Multitask Learning is an approach to inductive transfer that improves generalization by using the domain information contained in the training signals of related tasks as an inductive bias. It does this by learning tasks in parallel while using a shared representation; what is learned for each task ..."
Cited by 665 (6 self)
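The "learning tasks in parallel with a shared representation" idea can be sketched with two linear regression tasks that share a representation matrix while keeping per-task heads. All names, sizes, and the learning rate below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Hypothetical hard-parameter-sharing sketch: two regression tasks share
# a linear representation W; each task keeps its own head (a row of V).
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 5))                      # inputs common to both tasks
Y = np.stack([X @ rng.normal(size=5),             # task-1 targets
              X @ rng.normal(size=5)])            # task-2 targets

W = 0.1 * rng.normal(size=(5, 3))                 # shared representation
V = 0.1 * rng.normal(size=(2, 3))                 # one head per task
lr = 0.05
for _ in range(500):
    H = X @ W                                     # shared features
    err = H @ V.T - Y.T                           # (64, 2) residuals
    gV = err.T @ H / len(X)                       # per-task head gradients
    gW = X.T @ (err @ V) / len(X)                 # W pools signal from BOTH tasks
    V -= lr * gV
    W -= lr * gW

mse = float(np.mean((X @ W @ V.T - Y.T) ** 2))    # joint training error
```

The inductive bias lives in gW: the shared representation is updated by the summed gradients of both tasks, so what is learned for one task shapes the features available to the other.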
Instance-based learning algorithms
Machine Learning, 1991
"... Abstract. Storing and using specific instances improves the performance of several supervised learning algorithms. These include algorithms that learn decision trees, classification rules, and distributed networks. However, no investigation has analyzed algorithms that use only specific instances to ..."
Cited by 1359 (18 self)
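The instance-storing idea the abstract describes is easiest to see in a nearest-neighbour classifier that predicts purely from stored instances. A minimal sketch; Euclidean distance and majority voting are assumptions here, not the paper's specific algorithms:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest stored instances."""
    d = np.linalg.norm(np.asarray(X_train, dtype=float) - np.asarray(x, dtype=float),
                       axis=1)                      # distance to every instance
    nearest = np.argsort(d)[:k]                     # indices of the k closest
    votes = np.bincount(np.asarray(y_train)[nearest])
    return int(np.argmax(votes))
```

Note that "learning" here is just storing (X_train, y_train); all the work happens at prediction time, which is what makes such algorithms sensitive to which specific instances are kept.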
Improving generalization with active learning
Machine Learning, 1994
"... Abstract. Active learning differs from "learning from examples " in that the learning algorithm assumes at least some control over what part of the input domain it receives information about. In some situations, active learning is provably more powerful than learning from examples ..."
Cited by 539 (1 self)
alone, giving better generalization for a fixed number of training examples. In this article, we consider the problem of learning a binary concept in the absence of noise. We describe a formalism for active concept learning called selective sampling and show how it may be approximately implemented by a
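For the noise-free binary concepts mentioned here, selective sampling has a textbook special case: learning a 1-D threshold function by querying only inside the current region of uncertainty, which halves that region per query instead of shrinking it slowly with random examples. A sketch; the oracle interface is an assumption:

```python
def learn_threshold(oracle, lo=0.0, hi=1.0, tol=1e-6):
    """Selective-sampling sketch for a noise-free 1-D threshold concept.

    oracle(x) is assumed to return True iff x >= the hidden threshold.
    Only points inside the current region of uncertainty [lo, hi] are
    ever queried; each query halves the region (binary search)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if oracle(mid):
            hi = mid        # threshold is at or below mid
        else:
            lo = mid        # threshold is above mid
    return (lo + hi) / 2.0
```

Achieving error ε takes O(log 1/ε) queries here, versus O(1/ε) random labeled examples, which is the flavor of advantage the abstract claims for active learning.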
Locally weighted learning
Artificial Intelligence Review, 1997
"... This paper surveys locally weighted learning, a form of lazy learning and memorybased learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, ass ..."
Cited by 596 (53 self)
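The survey's focus, locally weighted linear regression, fits a separate weighted least-squares model per query point. A minimal sketch; the Gaussian kernel and fixed bandwidth are single illustrative choices from the design space (distance functions, weighting functions, smoothing parameters) the survey covers:

```python
import numpy as np

def loess_point(X, y, x0, bandwidth=1.0):
    """Locally weighted linear regression evaluated at one query point x0."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    d = np.linalg.norm(X - x0, axis=1)              # distance to each instance
    w = np.exp(-(d / bandwidth) ** 2 / 2.0)         # Gaussian kernel weights
    A = np.hstack([np.ones((len(X), 1)), X])        # intercept + linear terms
    sw = np.sqrt(w)[:, None]
    # weighted least squares: minimize sum_i w_i * (A_i @ beta - y_i)^2
    beta, *_ = np.linalg.lstsq(sw * A, sw.ravel() * y, rcond=None)
    return float(np.concatenate(([1.0], np.atleast_1d(x0))) @ beta)
```

Because a fresh local model is solved for each query, nothing is fit until prediction time, which is exactly the "lazy learning" character the abstract mentions.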
Gaussian processes for machine learning
In: Adaptive Computation and Machine Learning, 2006
"... Abstract. We give a basic introduction to Gaussian Process regression models. We focus on understanding the role of the stochastic process and how it is used to define a distribution over functions. We present the simple equations for incorporating training data and examine how to learn the hyperpar ..."
Cited by 631 (2 self)
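The "simple equations for incorporating training data" the abstract mentions reduce, for the posterior mean, to a few lines of linear algebra. A sketch assuming an RBF kernel with fixed hyperparameters; learning the hyperparameters, which the tutorial also covers, is omitted:

```python
import numpy as np

def gp_posterior_mean(X, y, X_star, length=1.0, noise=1e-2):
    """GP regression posterior mean at test inputs X_star (RBF kernel assumed)."""
    def k(A, B):
        # squared-exponential (RBF) covariance between two sets of points
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * length ** 2))
    K = k(X, X) + noise * np.eye(len(X))            # train covariance + noise
    return k(X_star, X) @ np.linalg.solve(K, y)     # k_* @ (K + sigma^2 I)^-1 y
```

The kernel here plays the role the abstract assigns to the stochastic process: it defines the distribution over functions, and conditioning on (X, y) just turns that prior into the posterior mean above.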