## A Novel Optimizing Network Architecture with Applications (1996)

Venue: Neural Computation

Citations: 35 (16 self)

### BibTeX

```bibtex
@ARTICLE{Rangarajan96anovel,
  author  = {Anand Rangarajan and Steven Gold and Eric Mjolsness},
  title   = {A Novel Optimizing Network Architecture with Applications},
  journal = {Neural Computation},
  year    = {1996},
  volume  = {8},
  pages   = {1041--1060}
}
```

### Abstract

We present a novel optimizing network architecture with applications in vision, learning, pattern recognition, and combinatorial optimization. This architecture is constructed by combining the following techniques: (i) deterministic annealing, (ii) self-amplification, (iii) algebraic transformations, (iv) clocked objectives, and (v) softassign. Deterministic annealing in conjunction with self-amplification avoids poor local minima and ensures that a vertex of the hypercube is reached. Algebraic transformations and clocked objectives help partition the relaxation into distinct phases. The problems considered have doubly stochastic matrix constraints or minor variations thereof. We introduce a new technique, softassign, which is used to satisfy this constraint. Experimental results on different problems are presented and discussed.
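The softassign operation mentioned in the abstract enforces the doubly stochastic constraint by exponentiating a benefit matrix and then alternately normalizing its rows and columns (Sinkhorn balancing). A minimal sketch in Python, assuming a real-valued benefit matrix `M`; the inverse-temperature `beta` and iteration count here are illustrative choices, not values from the paper:

```python
import numpy as np

def softassign(M, beta=10.0, iters=50):
    """Map a benefit matrix M to a (nearly) doubly stochastic matrix.

    Step 1: exponentiate beta*M, keeping all entries positive.
    Step 2: Sinkhorn balancing -- alternately normalize rows and
    columns; the iteration converges toward a doubly stochastic matrix.
    As beta grows, the result approaches a hard permutation matrix.
    """
    A = np.exp(beta * (M - M.max()))   # subtract max for numerical stability
    for _ in range(iters):
        A /= A.sum(axis=1, keepdims=True)   # row normalization
        A /= A.sum(axis=0, keepdims=True)   # column normalization
    return A

# Example: a random 4x4 benefit matrix
rng = np.random.default_rng(0)
M = rng.random((4, 4))
A = softassign(M)
```

In a deterministic annealing loop, `beta` would be gradually increased so the assignment matrix hardens from a smooth doubly stochastic matrix toward a vertex of the hypercube (a permutation).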