Results 1–10 of 272,127
Errata for “Multidimensional Kruskal–Katona theorem”
, 2014
Cited by 1 (1 self)
"... In the paper, it is asserted that the equality in Theorem 1 holds only if F is of the form (Y1 choose r) × · · · × (Yd choose r) for some sets Y1, ..., Yd ⊂ X. The claim is valid, but the proof is not. The error: a wrong claim appears in the introduction. It is claimed that “... equality holds only if F is an initial segment of such a colexicographical order”. This is false, as shown independently by Füredi–Griggs [2] and by Mörs [5]. Those papers give a thorough treatment of the problem of uniqueness of the extremal family in the Kruskal–Katona theorem. In particular, the claim is true for families of size m ..."
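The Kruskal–Katona statement behind this erratum (colexicographic initial segments of r-sets minimize the shadow) can be checked numerically. Below is a minimal Python sketch, with function names of my own choosing rather than anything from the paper: it builds a colex initial segment, computes its shadow, and compares against the Kruskal–Katona lower bound obtained from the cascade (binomial) representation of the family size.

```python
from itertools import combinations
from math import comb

def colex_initial_segment(n, r, m):
    """First m r-subsets of {0,...,n-1} in colexicographic order."""
    all_sets = sorted(combinations(range(n), r),
                      key=lambda s: tuple(reversed(s)))
    return all_sets[:m]

def shadow(family, r):
    """Down-shadow: all (r-1)-subsets contained in some member."""
    return {sub for s in family for sub in combinations(s, r - 1)}

def kk_shadow_bound(m, r):
    """Kruskal-Katona lower bound on the shadow size, via the
    cascade representation m = C(a_r, r) + C(a_{r-1}, r-1) + ..."""
    bound = 0
    while m > 0 and r > 0:
        a = r - 1
        while comb(a + 1, r) <= m:      # largest a with C(a, r) <= m
            a += 1
        m -= comb(a, r)
        bound += comb(a, r - 1)
        r -= 1
    return bound

F = colex_initial_segment(5, 3, 4)
print(len(shadow(F, 3)), kk_shadow_bound(4, 3))  # prints: 6 6
```

For colex initial segments the bound is attained with equality, which is exactly the extremal case the erratum discusses.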
KKL, Kruskal–Katona, and monotone nets
, 2009
"... We generalize the Kahn–Kalai–Linial (KKL) Theorem to random walks on Cayley and Schreier graphs, making progress on an open problem of Hoory, Linial, and Wigderson. In our generalization, the underlying group need not be abelian so long as the generating set is a union of conjugacy classes. An examp ..."
of the Kruskal–Katona Theorem: Given a constant-density subset A of a middle slice of the Hamming n-cube, the density of ∂A is greater by at log n
A KRUSKAL–KATONA TYPE THEOREM FOR GRAPHS
, 2007
Cited by 2 (1 self)
"... A bound on consecutive clique numbers of graphs is established. This bound is evaluated and shown to often be much better than the bound of the Kruskal–Katona theorem. A bound on nonconsecutive clique numbers is also proven. ..."
Convex Analysis
, 1970
Cited by 5350 (67 self)
"... In this book we aim to present, in a unified framework, a broad spectrum of mathematical theory that has grown in connection with the study of problems of optimization, equilibrium, control, and stability of linear and nonlinear systems. The title Variational Analysis reflects this breadth. For a long time, ‘variational’ problems have been identified mostly with the ‘calculus of variations’. In that venerable subject, built around the minimization of integral functionals, constraints were relatively simple and much of the focus was on infinite-dimensional function spaces. A major theme was the exploration of variations around a point, within the bounds imposed by the constraints, in order to help characterize solutions and portray them in terms of ‘variational principles’. Notions of perturbation, approximation and even generalized differentiability were extensively investigated. Variational theory progressed also to the study of so-called stationary points, critical points, and other indications of singularity that a point might have relative to its neighbors, especially in association with existence theorems for differential equations. ..."
Nonexistence Of A Kruskal–Katona Type Theorem For Subword Orders
, 1998
Cited by 5 (2 self)
"... Introduction. Macaulay posets. Let P be a ranked poset with the associated partial order ≤. Denote its i-th level (the set of all elements of rank i) by N_i(P), resp. by N_i if there is no danger of ambiguity. For x ∈ N_i the (down) shadow Δ(x) of x is the set of all y ∈ N_{i−1} such that y ≤ x. The shadow of a subset X ⊆ N_i is the set Δ(X) = ⋃_{x∈X} Δ(x). Let φ be some linear ordering of the elements of P. For X ⊆ N_i, the set C(X) of the smallest |X| ..."
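In the subword order this paper studies, a word of rank i covers exactly the words obtained by deleting one letter, so the shadow defined above specializes to one-letter deletions. A minimal Python sketch, assuming that covering relation (function names are illustrative):

```python
def shadow_word(w):
    """Down-shadow of a single word: all distinct words obtained
    by deleting exactly one letter (the elements it covers)."""
    return {w[:k] + w[k + 1:] for k in range(len(w))}

def shadow(X):
    """Shadow of a set X of words of common length i: the union
    of the members' shadows, i.e. all covered words of length i-1."""
    return set().union(*(shadow_word(w) for w in X))

print(sorted(shadow({"aba", "abb"})))  # prints: ['aa', 'ab', 'ba', 'bb']
```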
Just Relax: Convex Programming Methods for Identifying Sparse Signals in Noise
, 2006
"... This paper studies a difficult and fundamental problem that arises throughout electrical engineering, applied mathematics, and statistics. Suppose that one forms a short linear combination of elementary signals drawn from a large, fixed collection. Given an observation of the linear combination that ..."
Cited by 496 (2 self)
that convex relaxation succeeds. As evidence of the broad impact of these results, the paper describes how convex relaxation can be used for several concrete signal recovery problems. It also describes applications to channel coding, linear regression, and numerical analysis.
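The convex relaxation the abstract refers to replaces a combinatorial sparsity constraint with an ℓ1 penalty. As a generic illustration (this is a standard solver sketch, iterative soft thresholding for ℓ1-penalized least squares, not the paper's own algorithm; all names and parameter values are mine):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||.||_1 (elementwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, steps=1000):
    """Iterative soft thresholding for the convex relaxation
    min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# toy instance: a short linear combination of 3 columns, plus noise
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[[3, 17, 60]] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = ista(A, b, lam=0.5)
```

With enough measurements and low noise, the relaxation recovers the support of the sparse combination, which is the kind of success guarantee the paper quantifies.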
Distance Metric Learning, With Application To Clustering With Side-Information
 ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 15
, 2003
Cited by 799 (14 self)
"... Many algorithms rely critically on being given a good metric over their inputs. For instance, data can often be clustered in many "plausible" ways, and if a clustering algorithm such as K-means initially fails to find one that is meaningful to a user, the only recourse may be for the user to manually tweak the metric until sufficiently good clusters are found. For these and other applications requiring good metrics, it is desirable that we provide a more systematic way for users to indicate what they consider "similar." For instance, we may ask them to provide ..."
Training Support Vector Machines: an Application to Face Detection
, 1997
Cited by 728 (1 self)
"... We investigate the application of Support Vector Machines (SVMs) in computer vision. SVM is a learning technique developed by V. Vapnik and his team (AT&T Bell Labs.) that can be seen as a new method for training polynomial, neural network, or Radial Basis Function classifiers. The decision sur ..."
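The training problem behind SVMs can be illustrated with a toy sketch. This is a generic subgradient method for the regularized hinge loss of a soft-margin linear SVM, not the authors' trainer (their paper concerns large-scale quadratic programming); all names and constants are illustrative:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, eta=0.05, epochs=200, seed=0):
    """Subgradient descent on the soft-margin SVM objective
    lam/2*||w||^2 + mean_i max(0, 1 - y_i*(w.x_i + b))."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d); b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            w *= (1 - eta * lam)              # shrinkage from the regularizer
            if y[i] * (X[i] @ w + b) < 1:     # margin violated: hinge subgradient
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

# toy linearly separable data with labels in {-1, +1}
X = np.array([[2.0, 1.0], [3.0, 2.0], [-2.0, -1.0], [-3.0, 0.0]])
y = np.array([1, 1, -1, -1])
w, b = train_linear_svm(X, y)
```

The learned (w, b) defines the decision surface sign(w·x + b) that the snippet's last sentence starts to describe.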
A first-order primal-dual algorithm for convex problems with applications to imaging
, 2010
Cited by 435 (20 self)
"... In this paper we study a first-order primal-dual algorithm for convex optimization problems with known saddle-point structure. We prove convergence to a saddle point with rate O(1/N) in finite dimensions, which is optimal for the complete class of nonsmooth problems we are considering in this paper. We further show accelerations of the proposed algorithm to yield optimal rates on easier problems. In particular we show that we can achieve O(1/N^2) convergence on problems where the primal or the dual objective is uniformly convex, and we can show linear convergence, i.e. O(1/e^N), on problems ..."
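A primal-dual iteration of this kind can be sketched on a toy imaging problem, 1-D total-variation denoising. This is one instance of such an algorithm under the standard step-size condition sigma*tau*||D||^2 < 1; the parameter values and function name are illustrative, not taken from the paper:

```python
import numpy as np

def pdhg_tv1d(b, lam, sigma=0.3, tau=0.3, iters=1000):
    """Primal-dual iteration for the saddle-point formulation of
    min_x 0.5*||x - b||^2 + lam*||Dx||_1  (1-D total variation):
    min_x max_{|y| <= lam} <Dx, y> + 0.5*||x - b||^2."""
    n = len(b)
    D = np.diff(np.eye(n), axis=0)      # forward differences, ||D||^2 <= 4
    x = b.copy(); x_bar = b.copy()
    y = np.zeros(n - 1)
    for _ in range(iters):
        # dual ascent + projection onto the l_inf ball of radius lam
        y = np.clip(y + sigma * (D @ x_bar), -lam, lam)
        # primal descent + prox of 0.5*||. - b||^2
        x_new = (x - tau * (D.T @ y) + tau * b) / (1 + tau)
        # over-relaxation step (theta = 1)
        x_bar = 2 * x_new - x
        x = x_new
    return x

# noisy step signal: the TV term recovers the piecewise-constant structure
b = np.concatenate([np.zeros(20), np.ones(20)])
b += 0.05 * np.random.default_rng(1).standard_normal(40)
x = pdhg_tv1d(b, lam=0.5)
```

Since the primal objective here is uniformly (in fact strongly) convex, this is also the regime where the paper's accelerated variant attains the faster rates quoted above.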