Results 1–10 of 181
A Polynomial-time Algorithm for Learning Noisy Linear Threshold Functions
, 1996
"... In this paper we consider the problem of learning a linear threshold function (a halfspace in n dimensions, also called a "perceptron"). Methods for solving this problem generally fall into two categories. In the absence of noise, this problem can be formulated as a Linear Program and s ..."
Cited by 73 (11 self)
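As this abstract notes, the noise-free case reduces to a Linear Program: find (w, b) with y_i(w·x_i + b) ≥ 1 for every labeled example. A minimal feasibility sketch, using synthetic data and SciPy's generic LP solver (this is an illustration of the LP formulation mentioned in the snippet, not the paper's noisy-case algorithm):

```python
# Sketch: learning a consistent halfspace sign(w.x + b) from noise-free
# examples as an LP feasibility problem. All data here is synthetic.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                 # 200 points in R^5
w_true = rng.normal(size=5)
y = np.sign(X @ w_true + 0.3)                 # noise-free labels in {-1, +1}

n = X.shape[1]
# Variables z = (w, b); enforce y_i * (w.x_i + b) >= 1, rewritten as
# -y_i * [x_i, 1] . z <= -1, with a zero objective (pure feasibility).
A_ub = -y[:, None] * np.hstack([X, np.ones((len(X), 1))])
res = linprog(c=np.zeros(n + 1),
              A_ub=A_ub, b_ub=-np.ones(len(X)),
              bounds=[(None, None)] * (n + 1))
w, b = res.x[:n], res.x[n]
print(res.status, np.all(np.sign(X @ w + b) == y))
```

Any feasible point separates the sample exactly; the paper's contribution is the harder noisy case, which this sketch does not address.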
Lower bounds for the noisy broadcast problem
 In Proceedings of the 46th IEEE Symposium on Foundations of Computer Science (FOCS 2005)
, 2005
"... We prove the first non-trivial (super-linear) lower bound in the noisy broadcast model, defined by El Gamal in [6]. In this model there are n + 1 processors P_0, P_1, ..., P_n, each of which is initially given a private input bit x_i. The goal is for P_0 to learn the value of f(x_1, ..., x_n), for some speci ..."
Cited by 17 (0 self)
− n^(−α) for a constant parameter α > 0 (this bound applies to all threshold functions, as well as any other boolean-valued function with linear sensitivity). This bound also follows by reduction from a lower bound of Ω(n log n) on the depth of generalized noisy decision trees that compute the same
Optimal Attribute-Efficient Learning Of Disjunction, Parity, And Threshold Functions
 In EuroCOLT '97, LNAI 1208
, 1997
"... Decision trees are a very general computation model. Here the problem is to identify a Boolean function f out of a given set of Boolean functions F by asking for the value of f at adaptively chosen inputs. For classes F consisting of functions which may be obtained from one function g on n inputs by ..."
Cited by 14 (1 self)
by replacing arbitrary n − k inputs by given constants, this problem is known as attribute-efficient learning with k essential attributes. Results on general classes of functions are known. More precise and often optimal results are presented for the cases where g is one of the functions disjunction
Equivalence between Learning in Perceptrons with Noisy Examples and Tree Committee Machines
, 1995
"... We study learning from single presentation of examples (incremental or online learning) in single-layer perceptrons and tree committee machines (TCMs). Lower bounds for the perceptron generalization error as a function of the noise level ε in the teacher output are calculated. We find that optim ..."
Cited by 1 (0 self)
weight in the TCM. We also show that online learning is possible even in the K → ∞ limit, but with the generalization error decaying as α_cm^(−1/2). The simple Hebb rule can also be applied to the TCM, but now the error decays as α_cm^(−1/2) for finite K and α_cm^(−1/4) for K → ∞
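The online scenario this snippet describes can be illustrated with a plain Hebb-rule student learning from a noisy teacher perceptron. This is a toy sketch only: the dimension, horizon, and noise rate are made-up values, and it does not reproduce the paper's TCM analysis.

```python
# Sketch: Hebb-rule online learning of a perceptron from a teacher whose
# output label is flipped with probability eps; the generalization error is
# the angle between student and teacher divided by pi.
import numpy as np

rng = np.random.default_rng(0)
N, steps, eps = 200, 20_000, 0.05             # dimension, examples, label-noise rate
teacher = rng.normal(size=N)
teacher /= np.linalg.norm(teacher)
w = np.zeros(N)

for _ in range(steps):
    x = rng.normal(size=N)
    y = np.sign(teacher @ x)
    if rng.random() < eps:                    # noisy teacher: flip with prob eps
        y = -y
    w += y * x / np.sqrt(N)                   # plain Hebb update, no error filtering

overlap = w @ teacher / np.linalg.norm(w)
gen_err = np.arccos(np.clip(overlap, -1.0, 1.0)) / np.pi   # standard angle formula
print(gen_err)
```

Even with flipped labels, the Hebb average stays aligned with the teacher, so the error keeps decreasing with the number of examples per weight.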
Near-Oracle Performance of Greedy Block-Sparse Estimation Techniques from Noisy Measurements
Minimax Number of Strata for Online Stratified Sampling given Noisy Samples
 ξ_{K,n}(δ) = ⋂_{1≤k≤K, 2≤t≤n} { | √( (1/(t−1)) Σ_{i=1}^{t} ( X_{k,i} − (1/t) Σ_{j=1}^{t} X_{k,j} )² ) − σ_k | ≤ A/√t }  (9), where A = 2√((1 + 3b + 4V̄) log(2nK/δ)). Then Pr(ξ
, 2012
"... Abstract. We consider the problem of online stratified sampling for Monte Carlo integration of a function given a finite budget of n noisy evaluations to the function. More precisely we focus on the problem of choosing the number of strata K as a function of the budget n. We provide asymptotic and f ..."
Cited by 4 (2 self)
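The setting in this abstract, splitting a budget of n noisy evaluations across K strata for Monte Carlo integration, can be sketched as follows. The integrand, noise model, and the uniform allocation across strata are illustrative assumptions, not the paper's minimax choice of K:

```python
# Sketch: stratified Monte Carlo on [0, 1] with K equal strata and a budget
# of n noisy function evaluations split evenly across strata.
import numpy as np

def stratified_mc(f, n, K, noise=0.1, seed=0):
    """Estimate the integral of f over [0, 1] from n noisy evaluations."""
    rng = np.random.default_rng(seed)
    per = n // K                              # samples per stratum
    edges = np.linspace(0.0, 1.0, K + 1)
    est = 0.0
    for k in range(K):
        x = rng.uniform(edges[k], edges[k + 1], size=per)
        ys = f(x) + rng.normal(0.0, noise, size=per)   # noisy oracle
        est += (edges[k + 1] - edges[k]) * ys.mean()   # stratum width * mean
    return est

approx = stratified_mc(np.sin, n=10_000, K=100)
print(abs(approx - (1.0 - np.cos(1.0))))      # error vs. the exact integral of sin
```

Stratification removes the between-strata component of the variance; the question the paper studies is how K should scale with n when the evaluations are noisy.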
Noisy Matrix Completion under Sparse Factor Models
 IEEE Transactions on Information Theory (submitted)
On the (im)possibility of basing oblivious transfer and bit commitment on weakened security assumptions
 In Advances in Cryptology – Proceedings of EUROCRYPT '99, LNCS 1592
, 1999
"... We consider the problem of basing Oblivious Transfer (OT) and Bit Commitment (BC), with information theoretic security, on seemingly weaker primitives. We introduce a general model for describing such primitives, called Weak Generic Transfer (WGT). This model includes as important special cases We ..."
Cited by 58 (6 self)
Weak Oblivious Transfer (WOT), where both the sender and receiver may learn too much about the other party’s input, and a new, more realistic model of noisy channels, called unfair noisy channels. An unfair noisy channel has a known range of possible noise levels; protocols must work for any level
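The "unfair noisy channel" notion described in this snippet, a channel with a known range of possible noise levels, can be simulated in a few lines. This is a minimal illustration under assumed parameters: a binary symmetric channel whose flip probability is chosen per transmission from a known interval [gamma, delta] (sampled uniformly here; an adversary could pick any value in the range):

```python
# Sketch: an "unfair" binary symmetric channel. Honest parties know only
# the noise range [gamma, delta]; the actual flip rate may vary within it.
import random

def unfair_bsc(bits, gamma, delta, rng=random.Random(0)):
    """Transmit bits over a BSC whose noise level varies inside [gamma, delta]."""
    out = []
    for b in bits:
        p = rng.uniform(gamma, delta)         # per-bit noise level in the range
        out.append(b ^ (rng.random() < p))    # flip with probability p
    return out

sent = [0, 1] * 500
recv = unfair_bsc(sent, gamma=0.1, delta=0.3)
rate = sum(s != r for s, r in zip(sent, recv)) / len(sent)
print(rate)                                   # empirical flip rate
```

The paper's question is for which ranges [gamma, delta] such a channel still suffices to build OT and BC; this sketch only models the channel itself.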
Minimax Number of Strata for Online Stratified Sampling given Noisy Samples
 JMLR: Workshop and Conference Proceedings 1–31
, 2012
"... We consider the problem of online stratified sampling for Monte Carlo integration of a function given a finite budget of n noisy evaluations to the function. More precisely we focus on the problem of choosing the number of strata K as a function of the budget n. We provide asymptotic and finite-time ..."
Consistency of Functional Learning Methods Based on Derivatives
, 2011
"... In some real world applications, such as spectrometry, functional models achieve better predictive performances if they work on the derivatives of order m of their inputs rather than on the original functions. As a consequence, the use of derivatives is a common practice in functional data analysis, despite a lack of ..."
of theoretical guarantees on the asymptotically achievable performances of a derivative-based model. In this paper, we show that a smoothing spline approach can be used to preprocess multivariate observations obtained by sampling functions on a discrete and finite sampling grid in a way that leads
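The preprocessing step described in this snippet, fitting a smoothing spline to noisy samples on a finite grid and using its derivative as the model input, can be sketched with SciPy. The grid, noise level, and smoothing parameter below are illustrative assumptions, not the paper's setting:

```python
# Sketch: estimate the order-1 derivative of a noisily sampled function by
# fitting a smoothing spline and differentiating the fit.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2 * np.pi, 200)          # discrete, finite sampling grid
x = np.sin(t) + rng.normal(0.0, 0.05, t.size) # noisy observed curve

# s is set to the expected residual sum of squares (N * sigma^2), a common
# rule of thumb for the smoothing parameter.
spline = UnivariateSpline(t, x, k=5, s=t.size * 0.05**2)
dx = spline.derivative(n=1)(t)                # derivative of the smoothed fit
err = np.max(np.abs(dx - np.cos(t)))          # compare to the true derivative
print(err)
```

Differentiating the raw noisy samples directly would amplify the noise; the spline fit keeps the derivative estimate stable, which is the kind of preprocessing whose consistency the paper analyzes.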