Results 1–10 of 10
Structure Spaces, 2007
Abstract

Cited by 10 (7 self)
Finite structures such as point patterns, strings, trees, and graphs occur as "natural" representations of structured data in different application areas of machine learning. We develop the theory of structure spaces and derive geometrical and analytical concepts such as the angle between structures and the derivative of functions on structures. In particular, we show that the gradient of a differentiable structural function is a well-defined structure pointing in the direction of steepest ascent. Exploiting the properties of structure spaces, it turns out that a number of problems in structural pattern recognition, such as central clustering or learning in structured output spaces, can be formulated as optimization problems with cost functions that are locally Lipschitz. Hence, methods from nonsmooth analysis are applicable to optimize these cost functions.
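The nonsmooth optimization mentioned in the closing sentence can be illustrated with the subgradient method in its simplest Euclidean form; the objective, its subgradient, and the step-size schedule below are illustrative choices, not taken from the paper:

```python
def subgradient_descent(f, subgrad, x0, steps=200):
    """Minimize a nonsmooth, locally Lipschitz f by following any
    subgradient with a diminishing step size. Subgradient steps need
    not decrease f, so we keep the best iterate seen."""
    x, best_x, best_f = x0, x0, f(x0)
    for k in range(steps):
        x = x - (1.0 / (k + 1)) * subgrad(x)   # diminishing step 1/(k+1)
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x

# Example: f(x) = |x - 3| is nondifferentiable at its minimum.
f = lambda x: abs(x - 3)
sg = lambda x: 1.0 if x > 3 else (-1.0 if x < 3 else 0.0)
```

For graph-valued arguments the paper shows the gradient is itself a well-defined structure, so the same scheme carries over with structural distances in place of the scalar example above.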
Accelerating Competitive Learning Graph Quantization
 Computer Vision and Image Understanding
Abstract

Cited by 1 (1 self)
Vector quantization (VQ) is a lossy data compression technique from signal processing, for which simple competitive learning is one standard method of quantizing patterns from the input space. Extending competitive learning VQ to the domain of graphs yields competitive learning for quantizing input graphs. In this contribution, we propose an accelerated version of competitive learning graph quantization (GQ) that does not trade solution quality for computational time. For this, we lift graphs locally to vectors in order to avoid unnecessary calculations of intractable graph distances. In doing so, the accelerated version of competitive learning GQ gradually turns, locally, into competitive learning VQ as the number of iterations increases. Empirical results show a significant speedup while maintaining comparable solution quality.
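The Euclidean baseline the paper accelerates, simple competitive learning VQ, can be sketched as follows; the learning rate, epoch count, and initialization are illustrative choices, not taken from the paper:

```python
import numpy as np

def competitive_learning_vq(samples, n_codes=2, lr=0.1, epochs=30, seed=0):
    """Simple competitive learning: for each presented sample, the
    nearest code vector (the winner) moves a fraction lr toward it."""
    rng = np.random.default_rng(seed)
    codebook = samples[rng.choice(len(samples), n_codes, replace=False)].astype(float)
    for _ in range(epochs):
        for i in rng.permutation(len(samples)):
            x = samples[i]
            winner = np.argmin(np.linalg.norm(codebook - x, axis=1))
            codebook[winner] += lr * (x - codebook[winner])
    return codebook
```

In the graph setting, the winner is found via a graph distance and the update uses a weighted mean of a pair of graphs; the paper's acceleration avoids many of those intractable distance computations by lifting graphs locally to vectors.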
Date of Disputation: 2011
Abstract
The three-dimensional structure of a protein is determined by its network of covalent and non-covalent interactions. An exact description of the governing forces requires taking quantum effects into account, which makes the system too complex for most computational analyses. Depending on the application, a simpler representation can make such analyses feasible while still capturing the relevant aspects of the system. Here we explore a number of applications based on representing a protein as a graph of interacting residues. This representation has some conceptual advantages over an all-atom representation. It can be shown that the graph representation captures the three-dimensional information up to an average accuracy of 2 Ångström C-alpha RMSD. The deviation stems from the fact that the network is equivalent to an ensemble of structures satisfying the contact constraints, which can be used to represent some degree of flexibility. Furthermore, the representation makes it possible to apply algorithms from graph theory to common protein analysis problems such as structure alignment and structure prediction.
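The residue-graph representation described above can be sketched as follows; the 8 Ångström C-alpha distance cutoff is a common convention in contact-map work, not necessarily the one used in the thesis:

```python
import numpy as np

def contact_graph(ca_coords, cutoff=8.0):
    """Build a residue interaction graph: nodes are residues, an edge
    connects residues whose C-alpha atoms lie within `cutoff` Angstrom.
    Returns the edge set as pairs of residue indices (i < j)."""
    coords = np.asarray(ca_coords, dtype=float)
    n = len(coords)
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if dists[i, j] <= cutoff}
```

The resulting edge set is what graph-theoretic algorithms for alignment or prediction would operate on.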
Computing the Barycenter Graph by means of the Graph Edit Distance
Abstract
The barycenter graph has been proposed as an alternative way to obtain a representative of a given set of graphs. In this paper we propose an extension of the original algorithm that uses the graph edit distance in conjunction with the weighted mean of a pair of graphs. Our main contribution is that the method applies to attributed graphs with any kind of labels on both nodes and edges, equipped with a distance function that is less constrained than in previous approaches. Experiments on four different datasets support the validity of the method, yielding good approximations of the barycenter graph.
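The weighted-mean scheme can be illustrated in the Euclidean case, where the weighted mean of a pair is a convex combination; the paper's algorithm applies the same folding scheme with the weighted mean of a pair of graphs under graph edit distance:

```python
def incremental_barycenter(points):
    """Incremental barycenter: fold each new element into the running
    representative via a weighted mean, weighting the representative
    by k/(k+1) and the new element by 1/(k+1)."""
    b = list(points[0])
    for k, p in enumerate(points[1:], start=1):
        w = 1.0 / (k + 1)                                  # weight of the new element
        b = [(1 - w) * bi + w * pi for bi, pi in zip(b, p)]
    return b
```

In the Euclidean case this reproduces the arithmetic mean exactly; for graphs each convex combination is replaced by a weighted mean computed along a cheapest edit path, so the result is an approximation.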
Set of Graphs: Median vs Barycenter Graph
"... A comparison between two representatives of a ..."
Elkan's k-Means for Graphs
Abstract
This paper extends k-means algorithms from the Euclidean domain to the domain of graphs. To recompute the centroids, we apply subgradient methods to the optimization-based formulation of the sample mean of graphs. To accelerate the k-means algorithm for graphs without trading computational time against solution quality, we avoid unnecessary graph distance calculations by exploiting the triangle inequality of the underlying distance metric, following Elkan's k-means algorithm proposed in [5]. In experiments we show that the accelerated k-means algorithm is faster than the standard k-means algorithm for graphs, provided there is a cluster structure in the data.
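The pruning borrowed from Elkan rests on the triangle inequality: if d(x, c) <= d(c, c')/2, then d(x, c') >= d(x, c), so the distance from x to c' never needs to be computed. A Euclidean sketch of the assignment step; the paper applies the same bound with a metric graph distance:

```python
import numpy as np

def elkan_assign(X, centers):
    """Assignment step with Elkan's pruning: center-center distances
    are precomputed, and a candidate center is skipped whenever the
    current best distance already rules it out."""
    cc = np.linalg.norm(centers[:, None] - centers[None, :], axis=-1)
    labels, skipped = [], 0
    for x in X:
        best = 0
        d_best = np.linalg.norm(x - centers[0])
        for j in range(1, len(centers)):
            if d_best <= cc[best, j] / 2:   # pruned: c_j cannot be closer
                skipped += 1
                continue
            d = np.linalg.norm(x - centers[j])
            if d < d_best:
                best, d_best = j, d
        labels.append(best)
    return labels, skipped
```

Each skipped comparison saves one distance evaluation, which matters far more when the distance is an intractable graph distance rather than a Euclidean norm.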
Graph Quantization
Abstract
Vector quantization (VQ) is a lossy data compression technique from signal processing that is restricted to feature vectors and therefore inapplicable to combinatorial structures. This contribution presents a theoretical foundation of graph quantization (GQ) that extends VQ to the domain of attributed graphs. We present the necessary Lloyd-Max conditions for optimality of a graph quantizer and consistency results for optimal GQ design based on empirical distortion measures and stochastic optimization. These results statistically justify existing clustering algorithms in the domain of graphs. The proposed approach provides a template for linking structural pattern recognition methods other than GQ to statistical pattern recognition.
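In the scalar case the Lloyd-Max conditions referenced above say that decision boundaries sit at midpoints between reproduction levels (nearest-neighbor condition) and each level sits at the centroid of its cell (centroid condition). A minimal sketch for scalar data, alternating the two conditions; the paper generalizes both to attributed graphs:

```python
import numpy as np

def lloyd_max_1d(samples, k=2, iters=50):
    """Design a scalar quantizer by alternating the two necessary
    Lloyd-Max optimality conditions until the levels settle."""
    levels = np.linspace(samples.min(), samples.max(), k)
    for _ in range(iters):
        bounds = (levels[:-1] + levels[1:]) / 2            # nearest-neighbor condition
        cells = np.digitize(samples, bounds)
        for j in range(k):
            if np.any(cells == j):
                levels[j] = samples[cells == j].mean()     # centroid condition
    return levels
```

Clustering algorithms on graphs that alternate assignment and centroid (sample mean) steps are exactly the structural analogue that the consistency results justify.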