Concentration inequalities
Advanced Lectures in Machine Learning, 2004
Cited by 32 (1 self)
Abstract. Concentration inequalities deal with deviations of functions of independent random variables from their expectation. In the last decade new tools have been introduced making it possible to establish simple and powerful inequalities. These inequalities are at the heart of the mathematical analysis of various problems in machine learning and have made it possible to derive new efficient algorithms. This text attempts to summarize some of the basic tools.
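As a minimal illustration of the kind of inequality the abstract refers to (the example and all names below are mine, not from the paper), the following sketch compares Hoeffding's bound P(|S_n/n − E| ≥ t) ≤ 2·exp(−2nt²) for the mean of n independent [0,1]-valued variables against a Monte Carlo estimate of the deviation probability:

```python
import random
import math

# Hoeffding's inequality for the mean of n independent variables in [0, 1]:
#   P(|S_n/n - E| >= t) <= 2 * exp(-2 * n * t**2)
# Here we estimate the left-hand side by simulation for uniform [0,1] draws
# (so E = 0.5) and check it against the right-hand side.

random.seed(0)
n, t, trials = 200, 0.1, 5000

deviations = 0
for _ in range(trials):
    mean = sum(random.random() for _ in range(n)) / n
    if abs(mean - 0.5) >= t:
        deviations += 1

empirical = deviations / trials
bound = 2 * math.exp(-2 * n * t**2)
print(f"empirical={empirical:.4f}  Hoeffding bound={bound:.4f}")
```

With these parameters the true deviation probability is tiny (roughly a five-sigma event), so the empirical frequency sits far below the bound of about 0.037.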
Pattern classification and learning theory
Cited by 17 (7 self)
1.1 A binary classification problem. Pattern recognition (or classification or discrimination) is about guessing or predicting the unknown class of an observation. An observation is a collection of numerical measurements, represented by a d-dimensional vector x. The unknown nature of the observation is called a class. It is denoted by y and takes values in the set {0, 1}. (For simplicity, we restrict our attention to binary classification.) In pattern recognition, one creates a function g(x): R^d → {0, 1} which represents one's guess of y given x. The mapping g is called a classifier. A classifier errs on x if g(x) ≠ y. To model the learning problem, we introduce a probabilistic setting, and let (X, Y) be an R^d × {0, 1}-valued random pair. The random pair (X, Y) may be described in a variety of ways: for example, it is defined by the pair (μ, η), where μ is the probability measure for X and η is the regression of Y on X. More precisely, for a Borel-measurable set A ⊆ R^d
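The definitions above (a classifier g: R^d → {0, 1}, and an error when g(x) ≠ y) can be made concrete with a small sketch; the linear-threshold rule, the synthetic labeling, and all variable names here are my own illustration, not from the text:

```python
import random

# A classifier in the sense of the text: g maps a d-dimensional vector x
# to a guess in {0, 1}. Here g is a linear-threshold rule with weights w.
def g(x, w):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0

# Empirical error: the fraction of sample pairs (x, y) on which g errs,
# i.e. on which g(x) != y.
def empirical_error(sample, w):
    return sum(g(x, w) != y for x, y in sample) / len(sample)

random.seed(1)
d, n = 2, 100
# Synthetic noise-free data: the label is 1 exactly when x[0] >= 0.
sample = []
for _ in range(n):
    x = [random.uniform(-1, 1) for _ in range(d)]
    sample.append((x, 1 if x[0] >= 0 else 0))

w = [1.0, 0.0]  # weights aligned with the labeling rule, so no errors
print(empirical_error(sample, w))  # 0.0
```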
Concentration, 1998
Cited by 17 (2 self)
Upper bounds on probabilities of large deviations for sums of bounded independent random variables may be extended to handle functions which depend in a limited way on a number of independent random variables. This ‘method of bounded differences’ has over the last dozen or so years had a great impact in probabilistic methods in discrete mathematics and in the mathematics of operational research and theoretical computer science. Recently Talagrand introduced an exciting new method for bounding probabilities of large deviations, which often proves superior to the bounded differences approach. In this paper we
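To illustrate the "method of bounded differences" the abstract describes (the balls-in-bins example and all names below are my own, not from the paper), the following sketch checks McDiarmid's inequality for a function that changes by at most 1 when any single coordinate changes:

```python
import random
import math

# Bounded differences (McDiarmid): if changing the i-th of m independent
# inputs changes f by at most c_i, then
#   P(|f - E f| >= t) <= 2 * exp(-2 * t**2 / sum(c_i**2)).
# Example: f = number of empty bins after throwing m balls into k bins.
# Moving one ball changes the count by at most 1, so c_i = 1 and the
# bound is 2 * exp(-2 * t**2 / m).

random.seed(0)
m, k, t, trials = 100, 50, 12, 4000

def empty_bins():
    bins = [0] * k
    for _ in range(m):
        bins[random.randrange(k)] += 1
    return bins.count(0)

values = [empty_bins() for _ in range(trials)]
mean = sum(values) / trials            # sample estimate of E f
empirical = sum(abs(v - mean) >= t for v in values) / trials
bound = 2 * math.exp(-2 * t**2 / m)    # about 0.112 here
print(f"empirical={empirical:.4f}  McDiarmid bound={bound:.4f}")
```

The typical fluctuation of the empty-bin count is only a couple of bins, so deviations of size t = 12 essentially never occur and the empirical frequency sits far below the bound.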
Statistical Learning Control of Uncertain Systems: It is better than it seems, 1999
Cited by 11 (9 self)
This paper answers the last question affirmatively, and does so by invoking different versions of
Analysis of Probabilistic Combinatorial Optimization Problems in Euclidean Spaces, 1993
Cited by 6 (3 self)
Probabilistic combinatorial optimization problems are generalized versions of deterministic combinatorial optimization problems with explicit inclusion of probabilistic elements in the problem definitions. Based on the probabilistic traveling salesman problem (PTSP) and on the probabilistic minimum spanning tree problem (PMSTP), the objective of this paper is to give a rigorous treatment of the probabilistic analysis of these problems in the plane. More specifically, we present general finite-size bounds and limit theorems for the objective functions of the PTSP and PMSTP. We also discuss the practical implications of these results and indicate some open problems. Appeared in MATHEMATICS OF OPERATIONS RESEARCH, 18, 5171 (1993).
1 Introduction. During the last decade combinatorial optimization has undoubtedly been one of the fastest growing and most exciting areas in the field of discrete mathematics. Needless to say, the related scientific literature has been expanding at a very r...
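For a concrete sense of the PTSP objective, the sketch below uses one standard identity from the PTSP literature (the closed form for the expected length of a fixed a priori tour when each node is independently present with probability p) and checks it against Monte Carlo; the instance, the names, and the tolerance are my own assumptions, not from the paper:

```python
import random
import math

# PTSP expected length of the a priori tour 0, 1, ..., n-1 when each node
# is independently present with probability p (skipping absent nodes):
#   E[L] = sum_{r=0}^{n-2} p^2 (1-p)^r * sum_i d(i, (i+r+1) mod n)
# The arc (i, i+r+1) is traversed exactly when both endpoints are present
# and the r nodes between them are absent. We verify by simulation.

random.seed(0)
n, p = 6, 0.7
pts = [(random.random(), random.random()) for _ in range(n)]

def d(i, j):
    return math.dist(pts[i], pts[j])

closed = sum(
    p * p * (1 - p) ** r * sum(d(i, (i + r + 1) % n) for i in range(n))
    for r in range(n - 1)
)

def sample_length():
    # Keep each node with probability p, then traverse the survivors
    # in a priori order, returning to the start.
    present = [i for i in range(n) if random.random() < p]
    if len(present) < 2:
        return 0.0
    return sum(d(present[i], present[(i + 1) % len(present)])
               for i in range(len(present)))

trials = 20000
mc = sum(sample_length() for _ in range(trials)) / trials
print(f"closed form={closed:.4f}  Monte Carlo={mc:.4f}")
```

With 20000 samples the Monte Carlo standard error is well under 0.01, so the two estimates agree to within a few hundredths.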
Probabilistic analysis of the Traveling Salesman Problem, 2000
Cited by 5 (0 self)
Introduction. In this chapter we study the Hamiltonian cycle and Traveling Salesman problems from a probabilistic point of view. Here we try to elucidate the properties of typical rather than worst-case examples. Structurally, one hopes to bring out the surprising properties of typical instances. Algorithmically, the hope is that one can in some way explain the successful solution of large problems, much larger than that predicted by worst-case analysis. This of course raises the question of what we mean by typical. The mathematical view of this is to define a probability space of instances and study the expected properties of instances drawn from it with a given probability measure. Our discussion falls naturally into two parts: the independent case and the Euclidean case. The independent case will include a discussion of the existence of Hamiltonian cycles in various classes of random graphs and digraphs. We will then discuss algorithms for finding Hamiltonian cycles which are both fast
Probabilistic Analysis of Multi-Item Capacitated Lot Sizing Problems, 2005
Cited by 2 (0 self)
This paper conducts a probabilistic analysis of an important class of heuristics for multi-item capacitated lot sizing problems. We characterize the asymptotic performance of so-called progressive interval heuristics as T, the length of the planning horizon, goes to infinity, assuming the data are realizations of a stochastic process of the following type: the vector of cost parameters follows an arbitrary process with bounded support, while the sequence of aggregate demand and capacity pairs is generated as an independent sequence with a common general bivariate distribution, which may be of unbounded support. We show that important subclasses of the class of progressive interval heuristics can be designed to be asymptotically optimal with probability one, while running with a complexity bound which grows linearly with the number of items N and slightly faster than quadratically with T. We generalize our results to the case where the items' shelf life is uniformly bounded.