Results 1–10 of 19
Concentration inequalities
Advanced Lectures in Machine Learning, 2004
Cited by 35 (1 self)
Abstract. Concentration inequalities deal with deviations of functions of independent random variables from their expectation. In the last decade new tools have been introduced making it possible to establish simple and powerful inequalities. These inequalities are at the heart of the mathematical analysis of various problems in machine learning and made it possible to derive new efficient algorithms. This text attempts to summarize some of the basic tools.
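As an illustration of the kind of deviation bound this abstract refers to, the sketch below compares Hoeffding's inequality (a basic concentration inequality for sums of bounded independent variables) against a simulated deviation probability. The sample size, threshold, and trial count are arbitrary choices for illustration, not taken from the paper.

```python
import math
import random

def hoeffding_bound(n, t):
    """Two-sided Hoeffding bound: P(|mean - E[mean]| >= t) <= 2 exp(-2 n t^2)
    for n independent variables taking values in [0, 1]."""
    return 2.0 * math.exp(-2.0 * n * t * t)

def empirical_deviation_prob(n, t, trials=20000, seed=0):
    """Estimate P(|sample mean of n Uniform(0,1) draws - 1/2| >= t) by simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() for _ in range(n)) / n
        if abs(mean - 0.5) >= t:
            hits += 1
    return hits / trials

n, t = 100, 0.1
emp = empirical_deviation_prob(n, t)
bound = hoeffding_bound(n, t)       # 2 * exp(-2), about 0.27
assert emp <= bound                 # the true deviation probability is far smaller
```

The gap between the empirical frequency and the bound is typical: Hoeffding's inequality is distribution-free, so it is loose for any particular distribution but holds for all of them.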
Concentration
1998
Cited by 18 (2 self)
Upper bounds on probabilities of large deviations for sums of bounded independent random variables may be extended to handle functions which depend in a limited way on a number of independent random variables. This ‘method of bounded differences’ has over the last dozen or so years had a great impact in probabilistic methods in discrete mathematics and in the mathematics of operational research and theoretical computer science. Recently Talagrand introduced an exciting new method for bounding probabilities of large deviations, which often proves superior to the bounded differences approach. In this paper we ...
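The ‘method of bounded differences’ (McDiarmid's inequality) can be checked numerically. The function below, the number of distinct symbols in a random string, changes by at most 1 when any single coordinate is altered, so each bounded-differences constant is c_i = 1. The string length, alphabet size, and threshold are illustrative assumptions, not values from the paper.

```python
import math
import random

def mcdiarmid_bound(c, t):
    """Bounded-differences bound: P(|f - E[f]| >= t) <= 2 exp(-2 t^2 / sum_i c_i^2)."""
    return 2.0 * math.exp(-2.0 * t * t / sum(ci * ci for ci in c))

def distinct_count(xs):
    """f(x_1, ..., x_n) = number of distinct symbols.
    Changing one coordinate moves f by at most 1, so c_i = 1 for every i."""
    return len(set(xs))

n, k, t = 200, 50, 30
rng = random.Random(1)
samples = [distinct_count([rng.randrange(k) for _ in range(n)]) for _ in range(5000)]
mean = sum(samples) / len(samples)
emp = sum(abs(s - mean) >= t for s in samples) / len(samples)
bound = mcdiarmid_bound([1.0] * n, t)   # 2 * exp(-9), about 2.5e-4
assert emp <= bound
```

The simulated deviation frequency is essentially zero here, consistent with the exponential bound: the distinct-count statistic concentrates sharply around its mean.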
Pattern classification and learning theory
Cited by 17 (7 self)
1.1 A binary classification problem. Pattern recognition (or classification or discrimination) is about guessing or predicting the unknown class of an observation. An observation is a collection of numerical measurements, represented by a d-dimensional vector x. The unknown nature of the observation is called a class. It is denoted by y and takes values in the set {0, 1}. (For simplicity, we restrict our attention to binary classification.) In pattern recognition, one creates a function g : R^d → {0, 1} which represents one's guess of y given x. The mapping g is called a classifier. A classifier errs on x if g(x) ≠ y. To model the learning problem, we introduce a probabilistic setting, and let (X, Y) be an (R^d × {0, 1})-valued random pair. The random pair (X, Y) may be described in a variety of ways: for example, it is defined by the pair (μ, η), where μ is the probability measure for X and η is the regression of Y on X. More precisely, for a Borel-measurable set A ⊆ R^d ...
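A minimal sketch of the setup described above: a classifier g : R^d → {0, 1} and its empirical error rate. The synthetic distribution and the linear form of g are assumptions invented here for illustration; the abstract defines the framework, not this particular classifier.

```python
import random

def classifier(x, w, b):
    """An illustrative linear classifier g(x) = 1 if w·x + b > 0, else 0."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def empirical_error(pairs, w, b):
    """Fraction of pairs (x, y) on which g errs, i.e. g(x) != y."""
    return sum(classifier(x, w, b) != y for x, y in pairs) / len(pairs)

# Hypothetical distribution: y = 1 exactly when the first coordinate is positive.
rng = random.Random(2)
pairs = []
for _ in range(1000):
    x = [rng.uniform(-1, 1), rng.uniform(-1, 1)]
    y = 1 if x[0] > 0 else 0
    pairs.append((x, y))

err = empirical_error(pairs, w=[1.0, 0.0], b=0.0)
assert err == 0.0  # this g reproduces the labeling rule exactly
```

On real data the regression function η is unknown, so the empirical error is only an estimate of the true error probability P(g(X) ≠ Y); the concentration inequalities surveyed above quantify how good that estimate is.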
Statistical Learning Control of Uncertain Systems: It is better than it seems
1999
Cited by 11 (9 self)
This paper answers the last question affirmatively, and does so by invoking different versions of ...
Probabilistic analysis of the Traveling Salesman Problem
2000
Cited by 5 (0 self)
Introduction. In this chapter we study the Hamiltonian cycle and Traveling Salesman problems from a probabilistic point of view. Here we try to elucidate the properties of typical rather than worst-case examples. Structurally, one hopes to bring out the surprising properties of typical instances. Algorithmically, the hope is that one can in some way explain the successful solution of large problems, much larger than predicted by worst-case analysis. This of course raises the question of what we mean by typical. The mathematical view is to define a probability space of instances and study the expected properties of instances drawn from it with a given probability measure. Our discussion falls naturally into two parts: the independent case and the Euclidean case. The independent case will include a discussion of the existence of Hamiltonian cycles in various classes of random graphs and digraphs. We will then discuss algorithms for finding Hamiltonian cycles which are both fast ...
Analysis of Probabilistic Combinatorial Optimization Problems in Euclidean Spaces
1993
Cited by 5 (3 self)
Probabilistic combinatorial optimization problems are generalized versions of deterministic combinatorial optimization problems with explicit inclusion of probabilistic elements in the problem definitions. Based on the probabilistic traveling salesman problem (PTSP) and on the probabilistic minimum spanning tree problem (PMSTP), the objective of this paper is to give a rigorous treatment of the probabilistic analysis of these problems in the plane. More specifically we present general finite-size bounds and limit theorems for the objective functions of the PTSP and PMSTP. We also discuss the practical implications of these results and indicate some open problems. Appeared in MATHEMATICS OF OPERATIONS RESEARCH, 18, 5171 (1993). 1 Introduction. During the last decade combinatorial optimization has undoubtedly been one of the fastest growing and most exciting areas in the field of discrete mathematics. Needless to say, the related scientific literature has been expanding at a very r...
On Properties of Geometric Random Problems in the Plane
1994
Cited by 2 (0 self)
In this paper we present results dealing with properties of well-known geometric random problems in the plane, together with their motivations. The paper specifically concentrates on the traveling salesman and minimum spanning tree problems, even though most of the results apply to other problems such as the Steiner tree problem and the minimum weight matching problem. Keywords: Traveling Salesman, Minimum Spanning Tree, Euclidean and Probabilistic Analyses, Rates of Convergence, Geometric Constants. Appeared in Annals of Operations Research, 61, 120, (1995). MSIS Department, The University of Texas at Austin, and Laboratoire de Mathématiques Appliquées, ENPC, France. 1 Overview and Motivations. In Beardwood et al. [3], the authors prove that for any bounded uniform i.i.d. random variables {X_i : 1 ≤ i < ∞} with values in R^2, the length of the shortest tour through {X_1, ..., X_n} is asymptotic to a constant times √n with probability one (the same being true in expe...
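The √n growth mentioned in the Beardwood–Halton–Hammersley result also holds for the minimum spanning tree functional (a Steele-type result for subadditive Euclidean functionals, which this abstract groups with the TSP). The sketch below checks it empirically with a pure-Python Prim's algorithm; the sample sizes and tolerance are arbitrary choices for illustration.

```python
import math
import random

def mst_length(points):
    """Total edge length of the Euclidean minimum spanning tree,
    via Prim's algorithm in O(n^2) time."""
    n = len(points)
    in_tree = [False] * n
    dist = [float("inf")] * n   # dist[v] = distance from v to the current tree
    dist[0] = 0.0
    total = 0.0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=dist.__getitem__)
        in_tree[u] = True
        total += dist[u]
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v] = d
    return total

rng = random.Random(3)
ratios = []
for n in (200, 800):
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    ratios.append(mst_length(pts) / math.sqrt(n))

# If L(n) ~ c * sqrt(n), the ratio L(n)/sqrt(n) should stabilize as n grows.
assert abs(ratios[0] - ratios[1]) < 0.2
```

The stability of the ratio across sample sizes is itself a consequence of the concentration results surveyed at the top of this listing: the MST length of n uniform points deviates from its mean only by lower-order fluctuations.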