Results 11 - 20 of 237,415
Sampling signals with finite rate of innovation
IEEE Transactions on Signal Processing, 2002
"... Consider classes of signals that have a finite number of degrees of freedom per unit of time and call this number the rate of innovation. Examples of signals with a finite rate of innovation include streams of Diracs (e.g., the Poisson process), nonuniform splines, and piecewise polynomials. Even though these signals are not bandlimited, we show that they can be sampled uniformly at (or above) the rate of innovation using an appropriate kernel and then be perfectly reconstructed. Thus, we prove sampling theorems for classes of signals and kernels that generalize the classic ..."
Cited by 350 (67 self)
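The core idea of this abstract, recovering a stream of Diracs from a small number of measurements, can be illustrated with the standard annihilating-filter computation on 2K Fourier-series coefficients. This is a textbook sketch of the finite-rate-of-innovation principle, not the paper's full sampling-kernel machinery; the locations, amplitudes, and period below are illustrative.

```python
import numpy as np

tau = 1.0                        # period (illustrative)
t_true = np.array([0.2, 0.7])    # Dirac locations (illustrative)
a_true = np.array([1.0, 0.5])    # Dirac amplitudes (illustrative)
K = 2                            # number of Diracs

# 2K Fourier-series coefficients of the Dirac stream:
# X[m] = sum_k a_k * exp(-j*2*pi*m*t_k/tau)
m = np.arange(2 * K)
X = (a_true * np.exp(-2j * np.pi * np.outer(m, t_true) / tau)).sum(axis=1)

# Annihilating filter h = [1, h1, h2]: (h * X)[m] = 0 for m >= K
A = np.array([[X[1], X[0]],
              [X[2], X[1]]])
b = -np.array([X[2], X[3]])
h1, h2 = np.linalg.solve(A, b)

# Roots of z^2 + h1*z + h2 are u_k = exp(-j*2*pi*t_k/tau)
roots = np.roots([1.0, h1, h2])
t_rec = np.sort(np.mod(-np.angle(roots) * tau / (2 * np.pi), tau))
```

With exact coefficients the recovered locations match the true ones; four measurements suffice for two Diracs because the signal has only four degrees of freedom per period.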
Cones of matrices and set-functions and 0-1 optimization
SIAM Journal on Optimization, 1991
"... It has been recognized recently that to represent a polyhedron as the projection of a higher-dimensional, but simpler, polyhedron, is a powerful tool in polyhedral combinatorics. We develop a general method to construct higher-dimensional polyhedra (or, in some cases, convex sets) whose projection a ..."
"... of inequalities, such that already the first system includes clique, odd hole, odd antihole, wheel, and orthogonality constraints. In particular, for perfect (and many other) graphs, this first system gives the vertex packing polytope. For various classes of graphs, including t-perfect graphs, it follows ..."
Cited by 347 (7 self)
Approximating the permanent
SIAM J. Computing, 1989
"... A randomised approximation scheme for the permanent of a 0-1 matrix is presented. The task of estimating a permanent is reduced to that of almost uniformly generating perfect matchings in a graph; the latter is accomplished by simulating a Markov chain whose states are the matchings in the ..."
Cited by 345 (26 self)
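For context on the quantity this abstract is about: the permanent of a 0-1 matrix counts the perfect matchings of the bipartite graph whose biadjacency matrix it is. A brute-force baseline makes the #P-hard target concrete (the function name below is just for illustration; the paper's contribution is avoiding this exponential sum via MCMC):

```python
from itertools import permutations
import numpy as np

def permanent(M):
    """Exact permanent of a square matrix by summing over all
    permutations: per(M) = sum_p prod_i M[i, p(i)].
    This is an O(n! * n) baseline; the randomised scheme in the paper
    approximates it by sampling matchings with a Markov chain."""
    n = len(M)
    return sum(
        np.prod([M[i][p[i]] for i in range(n)])
        for p in permutations(range(n))
    )
```

For the all-ones n x n matrix (complete bipartite graph K_{n,n}) the permanent is n!, i.e. every permutation is a perfect matching.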
On kernel-perfect orientations of line graphs
1998
"... We exploit the technique of Galvin (1995) to prove that an orientation D of a line graph G (of a multigraph) is kernel-perfect if and only if every oriented odd cycle in D has a chord (or pseudo-chord) and every clique has a kernel. ..."
Cited by 2 (0 self)
Diffusion kernels on graphs and other discrete input spaces
In: Proceedings of the 19th International Conference on Machine Learning, 2002
"... The application of kernel-based learning algorithms has, so far, largely been confined to real-valued data and a few special data types, such as strings. In this paper we propose a general method of constructing natural families of kernels over discrete structures, based on the matrix exponentiation idea. In particular, we focus on generating kernels on graphs, for which we propose a special class of exponential kernels called diffusion kernels, which are based on the heat equation and can be regarded as the discretization of the familiar Gaussian kernel of Euclidean space. ..."
Cited by 223 (5 self)
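The matrix-exponentiation construction described in this abstract can be sketched on a small graph: take K = exp(beta * H) with H the negative graph Laplacian, computed here via eigendecomposition since H is symmetric. The 4-node path graph and the value of beta are illustrative choices, not from the paper.

```python
import numpy as np

# Path graph on 4 nodes (illustrative)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = A - np.diag(A.sum(axis=1))          # negative Laplacian: A - D
beta = 0.5                              # diffusion time (illustrative)

# Matrix exponential K = exp(beta * H) via eigendecomposition
w, V = np.linalg.eigh(H)
K = V @ np.diag(np.exp(beta * w)) @ V.T
```

K is a symmetric positive-definite kernel by construction, and its entries decay with graph distance, mirroring heat diffusing from each node, which is the sense in which it discretizes the Gaussian kernel.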
Marginalized kernels between labeled graphs
Proceedings of the Twentieth International Conference on Machine Learning, 2003
"... A new kernel function between two labeled graphs is presented. Feature vectors are defined as the counts of label paths produced by random walks on graphs. The kernel computation finally boils down to obtaining the stationary state of a discrete-time linear system, thus is efficiently performed by s ..."
Cited by 194 (15 self)
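The "stationary state of a discrete-time linear system" mentioned in this abstract can be made concrete with a simplified walk-counting kernel on unlabeled graphs (a sketch in the spirit of random-walk graph kernels, not the paper's exact marginalized kernel): common walks of the two graphs are walks in their direct product graph, and summing them over all lengths is solving one linear system.

```python
import numpy as np

def walk_kernel(A1, A2, lam=0.1):
    """Walk-counting kernel between two graphs given by adjacency
    matrices (simplified, unlabeled sketch). Common walks live in the
    direct product graph Wx = kron(A1, A2); the geometric series over
    walk lengths is the fixed point of x = 1 + lam * Wx @ x, i.e. the
    solution of (I - lam * Wx) x = 1. Requires lam * rho(Wx) < 1."""
    Wx = np.kron(A1, A2)
    n = Wx.shape[0]
    x = np.linalg.solve(np.eye(n) - lam * Wx, np.ones(n))
    return x.sum()
```

The decay factor lam downweights long walks and makes the series converge; solving the linear system replaces the explicit infinite sum.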
Perfect simulation, monotonicity and finite queueing networks
Jean-Marc Vincent. QEST, 2014
Graph embedding and extension: A general framework for dimensionality reduction
IEEE Trans. Pattern Anal. Mach. Intell., 2007
"... Over the past few decades, a large family of algorithms—supervised or unsupervised; stemming from statistics or geometry theory—has been designed to provide different solutions to the problem of dimensionality reduction. Despite the different motivations of these algorithms, we present in this paper a general formulation known as graph embedding to unify them within a common framework. In graph embedding, each algorithm can be considered as the direct graph embedding or its linear/kernel/tensor extension of a specific intrinsic graph that describes certain desired statistical or geometric ..."
Cited by 271 (29 self)
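One well-known instance of the direct graph embedding formulation this abstract describes is Laplacian Eigenmaps: minimizing sum_ij W_ij (y_i - y_j)^2 over the intrinsic graph leads to the generalized eigenproblem L y = lambda D y with L = D - W. The sketch below uses an illustrative 4-node weight matrix and reduces the problem to a symmetric one via D^{-1/2} L D^{-1/2}.

```python
import numpy as np

# Intrinsic graph weights (illustrative)
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(W.sum(axis=1))
L = D - W                                # graph Laplacian

# Symmetrize the generalized eigenproblem L y = lambda D y
Dih = np.diag(1.0 / np.sqrt(np.diag(D)))
vals, vecs = np.linalg.eigh(Dih @ L @ Dih)

# 1-D embedding: skip the trivial constant eigenvector (eigenvalue 0)
y = Dih @ vecs[:, 1]
```

In the framework's terms, swapping the intrinsic graph W (and an optional penalty graph) recovers different algorithms, while linear, kernel, or tensor constraints on y give their respective extensions.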
On graph kernels: Hardness results and efficient alternatives
In: Conference on Learning Theory, 2003
"... As most ‘real-world’ data is structured, research in kernel methods has begun investigating kernels for various kinds of structured data. One of the most widely used tools for modeling structured data are graphs. An interesting and important challenge is thus to investigate kernels on instances tha ..."
Cited by 184 (6 self)
STS-Graphs of Perfect Codes Mod Kernel
"... We show that a 1-error-correcting code C is ‘foldable’ over its kernel via the Steiner triple systems associated to the codewords whenever C is perfect. The resulting ‘folding’ produces a graph invariant that for Vasil'ev codes of length 15 is complete, showing in particular that ..."
Cited by 2 (0 self)