Results 11–20 of 283,371
Worst-Case Analysis of Selective Sampling for Linear Classification
Journal of Machine Learning Research, 2006
Cited by 52 (6 self)
"... A selective sampling algorithm is a learning algorithm for classification that, based on the past observed data, decides whether to ask the label of each new instance to be classified. In this paper, we introduce a general technique for turning linear-threshold classification algorithms from the general additive family into randomized selective sampling algorithms. For the most popular algorithms in this family we derive mistake bounds that hold for individual sequences of examples. These bounds ..."
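To make the idea concrete, here is a minimal sketch of a margin-based randomized selective sampler of the kind the entry describes: a Perceptron-style learner that asks for a label with probability inversely related to its confidence. The query rule b / (b + |margin|), the constant b, and the data stream are illustrative assumptions, not taken from the paper itself.

```python
import random

def selective_perceptron(stream, b=1.0, seed=0):
    """Perceptron that only queries labels when its margin is small.

    Hypothetical sketch: queries the label with probability
    b / (b + |margin|), updates only on queried mistakes.
    Returns (number of queries, number of mistakes)."""
    rng = random.Random(seed)
    w = None
    queries = mistakes = 0
    for x, y in stream:
        if w is None:
            w = [0.0] * len(x)
        margin = sum(wi * xi for wi, xi in zip(w, x))
        yhat = 1 if margin >= 0 else -1
        if yhat != y:
            mistakes += 1  # counted for evaluation on every trial
        # Randomized query rule: small |margin| -> likely to ask for y.
        if rng.random() < b / (b + abs(margin)):
            queries += 1
            if yhat != y:  # Perceptron update, only on queried errors
                w = [wi + y * xi for wi, xi in zip(w, x)]
    return queries, mistakes

stream = [([1.0, 0.2], 1), ([-1.0, 0.1], -1)] * 50
q, m = selective_perceptron(stream)
```

As the learner's margins grow on a separable stream, the query probability shrinks, so it labels only a fraction of the instances it classifies.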
Worst-Case Analysis of the Perceptron and Exponentiated Update Algorithms
Artificial Intelligence, 1998
Cited by 10 (1 self)
"... The absolute loss is the absolute difference between the desired and predicted outcome. This paper demonstrates worst-case upper bounds on the absolute loss for the Perceptron learning algorithm and the Exponentiated Update learning algorithm, which is related to the Weighted Majority algorithm. ..."
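The loss measure in this entry is easy to state in code. Below is a minimal sketch of the classic online Perceptron with its cumulative absolute loss |y − ŷ| tracked per trial; the data and learning rate are illustrative assumptions, and the paper's actual bounds concern more general prediction settings than this ±1 sketch.

```python
def perceptron_absolute_loss(examples, eta=0.5):
    """Run the classic Perceptron on (x, y) pairs with y in {-1, +1};
    return the cumulative absolute loss of its +-1 predictions."""
    dim = len(examples[0][0])
    w = [0.0] * dim
    total_loss = 0.0
    for x, y in examples:
        score = sum(wi * xi for wi, xi in zip(w, x))
        yhat = 1 if score >= 0 else -1
        total_loss += abs(y - yhat)      # absolute loss on this trial
        if yhat != y:                    # Perceptron update on mistakes
            w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return total_loss

data = [([1.0, 0.0], 1), ([0.0, 1.0], -1), ([1.0, 1.0], 1)]
loss = perceptron_absolute_loss(data)
```

Each mistaken ±1 prediction contributes 2 to the absolute loss, so on this three-example toy stream the cumulative loss is twice the mistake count.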
On the Worst-Case Analysis of Temporal-Difference Learning Algorithms
Machine Learning, 1994
Worst-case analysis of finite-time control policies
IEEE Trans. Control Syst. Technol., 2001
Cited by 10 (3 self)
"Finite-time control policies are common in batch and semi-batch operations. A novel approach is proposed that quantifies the impact of parameter and control implementation inaccuracies on the performance of such control policies. This information can be used to decide whether more experiments are needed to produce parameter estimates of higher accuracy, or to define performance objectives for the lower-level control loops that implement the control trajectory. The approach is evaluated through application to the multidimensional growth of crystals used in nonlinear optics applications, where the nominal parameters and uncertainties are quantified from experimental data. Robustness estimates are provided with reasonable computational requirements."
Worst-Case Analysis of Scheduling Heuristics of Parallel Systems
Parallel Computing, 1995
Cited by 5 (0 self)
"... It is well known that most scheduling problems arising from parallel systems are NP-hard, even under very special assumptions. Thus various suboptimal algorithms, in particular heuristics, were proposed in the literature. Worst-case error bounds are established in this paper for heuristics of makespan ..."
Worst-case Analysis of Non-Cooperative Load Balancing
2012
"... We investigate the impact of heterogeneity in the amount of incoming traffic routed by dispatchers in a non-cooperative load balancing game. For a fixed amount of total incoming traffic, we give sufficient conditions on the cost function under which the worst-case social cost occurs when each dispatcher ..."
Network Compression: Worst-Case Analysis
"We study the problem of communicating a distributed correlated memoryless source over a memoryless network, from source nodes to destination nodes, under quadratic distortion constraints. We establish the following two complementary results: (a) for an arbitrary memoryless network, among all distributed memoryless sources of a given correlation, Gaussian sources are least compressible, that is, they admit the smallest set of achievable distortion tuples, and (b) for any memoryless source to be communicated over a memoryless additive-noise network, among all noise processes of a given correlation, Gaussian noise admits the smallest achievable set of distortion tuples. We establish these results constructively by showing how schemes for the corresponding Gaussian problems can be applied to achieve similar performance for (source or noise) distributions that are not necessarily Gaussian but have the same covariance."
Beyond Worst-case Analysis for Joins with Minesweeper
2014
Cited by 2 (1 self)
"... We describe a new algorithm, Minesweeper, that is able to satisfy stronger runtime guarantees than previous join algorithms (colloquially, ‘beyond worst-case guarantees’) for data in indexed search trees. Our first contribution is developing a framework to measure this stronger notion of complexity, ..."
Scalar vs. Vector Quantization: Worst-Case Analysis
2002
"... We study the potential merits of vector quantization and show that there can be an arbitrary discrepancy between the worst-case rates required for scalar and vector quantization. Specifically, we describe a random variable and a distortion measure where quantization of a single instance to within ..."
Local Search Algorithms for SAT: Worst-Case Analysis
In: Proceedings of the 6th Scandinavian Workshop on Algorithm Theory, LNCS 1432, 1998
Cited by 3 (3 self)
"... Recent experiments demonstrated that local search algorithms (e.g. GSAT) are able to find satisfying assignments for many "hard" Boolean formulas. However, no nontrivial worst-case upper bounds were proved, although many such bounds of the form 2^(αn) (α < 1 is a constant) are known for ..."
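For readers unfamiliar with GSAT, the following is a hedged sketch of the greedy-flip local search it exemplifies, not the algorithm analyzed in the paper: restart limits, flip limits, and the clause encoding are illustrative choices.

```python
import random

def num_satisfied(clauses, assignment):
    # A clause is a list of ints: +v means variable v true, -v means false.
    return sum(any((lit > 0) == assignment[abs(lit)] for lit in clause)
               for clause in clauses)

def gsat(clauses, n_vars, max_flips=1000, max_restarts=10, seed=0):
    """GSAT-style local search: from a random assignment, repeatedly flip
    the variable that maximizes the number of satisfied clauses."""
    rng = random.Random(seed)
    for _ in range(max_restarts):
        assignment = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            if num_satisfied(clauses, assignment) == len(clauses):
                return assignment  # all clauses satisfied
            def gain(v):
                # Score of the assignment with variable v flipped.
                assignment[v] = not assignment[v]
                s = num_satisfied(clauses, assignment)
                assignment[v] = not assignment[v]
                return s
            best = max(range(1, n_vars + 1), key=gain)
            assignment[best] = not assignment[best]
    return None  # no satisfying assignment found within the budget

# (x1 or x2) and (not x1 or x2): every model must set x2 = True.
model = gsat([[1, 2], [-1, 2]], n_vars=2)
```

Because the flips are purely greedy, the search can stall in local optima; random restarts are the standard escape, and it is exactly this randomized behavior that makes nontrivial worst-case bounds hard to prove.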