Results 1-10 of 108
The Convex Geometry of Linear Inverse Problems
, 2010
Abstract

Cited by 43 (11 self)
In applications throughout science and engineering one is often faced with the challenge of solving an ill-posed inverse problem, where the number of available measurements is smaller than the dimension of the model to be estimated. However, in many practical situations of interest, models are structurally constrained so that they have only a few degrees of freedom relative to their ambient dimension. This paper provides a general framework to convert notions of simplicity into convex penalty functions, resulting in convex optimization solutions to linear, underdetermined inverse problems. The class of simple models considered are those formed as the sum of a few atoms from some (possibly infinite) elementary atomic set; examples include well-studied cases such as sparse vectors (e.g., signal processing, statistics) and low-rank matrices (e.g., control, statistics), as well as several others, including sums of a few permutation matrices (e.g., ranked elections, multi-object tracking), low-rank tensors (e.g., computer vision, neuroscience), orthogonal matrices (e.g., machine learning), and atomic measures (e.g., system identification). The convex programming formulation is based on minimizing the norm induced by the convex hull of the atomic set; this norm is referred to as the atomic norm.
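As a concrete illustration (ours, not from the paper): when the atomic set consists of the signed standard basis vectors, the atomic norm reduces to the $\ell_1$ norm, and the resulting underdetermined recovery problem can be solved by, for example, iterative soft-thresholding (ISTA). A minimal sketch in Python, assuming this special case:

```python
# Minimal sketch: l1-norm (atomic-norm for signed basis atoms) recovery of a
# sparse model from underdetermined linear measurements, via ISTA.

def soft(v, t):
    """Soft-thresholding, the proximal operator of the l1 norm."""
    return [max(abs(a) - t, 0.0) * (1 if a > 0 else -1) for a in v]

def ista(A, y, lam, step, iters=2000):
    """Minimize 0.5*||Ax - y||^2 + lam*||x||_1 by proximal gradient descent."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = soft([x[j] - step * g[j] for j in range(n)], lam * step)
    return x

# Two measurements of a three-dimensional model with one active atom:
A = [[1.0, 0.0, 1.0],
     [0.0, 1.0, 1.0]]
y = [1.0, 1.0]                            # generated by the sparse x = [0, 0, 1]
x = ista(A, y, lam=0.01, step=1.0 / 3.0)  # step = 1/L with L = ||A||^2 = 3
```

The dense candidate [1, 1, 0] also explains the data, but has $\ell_1$ norm 2; the minimizer concentrates on the single atom (up to the small shrinkage bias of the penalty).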
Bounds on packings of spheres in the Grassmann manifolds
, 2000
Abstract

Cited by 29 (1 self)
We derive the Varshamov-Gilbert and Hamming bounds for packings of spheres (codes) in the Grassmann manifolds over $\mathbb R$ and $\mathbb C$. The distance between two $k$-planes is defined as $\rho(p,q)=(\sin^2\theta_1+\dots+\sin^2\theta_k)^{1/2}$, where $\theta_i$, $1\le i\le k$, are the principal angles between $p$ and $q$.
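A minimal sketch (our illustration, not from the paper) of computing this distance: for orthonormal bases $P$ and $Q$ of the two planes, the $\cos\theta_i$ are the singular values of $M = P^{T}Q$; for $k=2$ the squared singular values are the eigenvalues of $M^{T}M$, available in closed form:

```python
import math

def rho_2planes(P, Q):
    """Distance rho(p,q) = (sin^2 theta_1 + sin^2 theta_2)^(1/2) between two
    2-planes given by orthonormal bases P and Q (lists of basis vectors).
    cos(theta_i) are the singular values of M = P^T Q; for the 2x2 case the
    squared singular values are the eigenvalues of M^T M in closed form."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    m = [[dot(P[i], Q[j]) for j in range(2)] for i in range(2)]
    a = m[0][0] ** 2 + m[1][0] ** 2             # (M^T M)[0][0]
    c = m[0][1] ** 2 + m[1][1] ** 2             # (M^T M)[1][1]
    b = m[0][0] * m[0][1] + m[1][0] * m[1][1]   # (M^T M)[0][1]
    disc = math.sqrt((a - c) ** 2 + 4 * b * b)
    cos2 = [(a + c + disc) / 2, (a + c - disc) / 2]         # cos^2(theta_i)
    return math.sqrt(sum(1.0 - min(1.0, s) for s in cos2))  # sin^2 = 1 - cos^2

# Two 2-planes in R^3 sharing the direction e1, tilted by 30 degrees:
t = math.pi / 6
P = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
Q = [[1.0, 0.0, 0.0], [0.0, math.cos(t), math.sin(t)]]
# principal angles are 0 and t, so rho(P, Q) = sin(t) = 0.5
```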
Lattice duality: The origin of probability and entropy
 In press: Neurocomputing
, 2005
Abstract

Cited by 19 (6 self)
Bayesian probability theory is an inference calculus that originates from a generalization of inclusion on the Boolean lattice of logical assertions to a degree of inclusion represented by a real number. Dual to this lattice is the distributive lattice of questions constructed from the ordered set of down-sets of assertions, which forms the foundation of the calculus of inquiry, a generalization of information theory. In this paper we introduce this novel perspective on the spaces in which machine learning is performed and discuss the relationship between these results and several proposed generalizations of information theory in the literature.
Target enumeration via Euler characteristic integrals I: sensor fields
, 2007
Abstract

Cited by 15 (7 self)
We solve the problem of counting the total number of observable targets (e.g., persons, vehicles, landmarks) in a region using local counts performed by a dense field of sensors, each of which measures the number of targets nearby but not their identities or positions. We formulate and solve several such problems based on the types of sensors and the mobility of the targets. The main contribution of this paper is the adaptation of a topological integration theory (integration with respect to Euler characteristic) to yield complete solutions to these problems.
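For an integer-valued sensor field $h$, integration with respect to Euler characteristic reduces to $\int h\,d\chi = \sum_{s\ge 1} \chi(\{h \ge s\})$, which equals the target count when each target's support is contractible. A minimal sketch on a pixel grid (our illustration under that assumption, not the paper's implementation):

```python
from collections import Counter

def euler_char(pixels):
    """Euler characteristic of a union of closed unit squares, computed from
    the cubical complex: chi = #vertices - #edges + #faces."""
    verts, edges, faces = set(), set(), set()
    for (i, j) in pixels:
        faces.add((i, j))
        verts.update([(i, j), (i + 1, j), (i, j + 1), (i + 1, j + 1)])
        edges.update([('h', i, j), ('h', i, j + 1),    # bottom / top edges
                      ('v', i, j), ('v', i + 1, j)])   # left / right edges
    return len(verts) - len(edges) + len(faces)

def target_count(h):
    """Integral of h with respect to Euler characteristic:
    sum over s >= 1 of chi({h >= s})."""
    total, s = 0, 1
    while True:
        level = {p for p, v in h.items() if v >= s}
        if not level:
            return total
        total += euler_char(level)
        s += 1

# Two targets with overlapping rectangular supports; each sensor (pixel)
# reports only how many targets cover it, with no identities or positions.
target_a = {(x, y) for x in range(3) for y in range(3)}
target_b = {(x, y) for x in range(2, 5) for y in range(3)}
h = Counter()
for support in (target_a, target_b):
    for p in support:
        h[p] += 1
```

Here $\{h \ge 1\}$ is the union of the two supports ($\chi = 1$) and $\{h \ge 2\}$ is their overlap ($\chi = 1$), so the integral correctly returns 2 even though no sensor can tell the targets apart.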
Classical 6j-symbols and the tetrahedron
 Geometry and Topology
, 1999
Abstract

Cited by 14 (0 self)
A classical 6j-symbol is a real number which can be associated to a labelling of the six edges of a tetrahedron by irreducible representations of SU(2). This abstract association is traditionally used simply to express the symmetry of the 6j-symbol, which is a purely algebraic object; however, it has a deeper geometric significance. Ponzano and Regge, expanding on work of Wigner, gave a striking (but unproved) asymptotic formula relating the value of the 6j-symbol, when the dimensions of the representations are large, to the volume of an honest Euclidean tetrahedron whose edge lengths are these dimensions. The goal of this paper is to prove and explain this formula by using geometric quantization. A surprising spin-off is that a generic Euclidean tetrahedron gives rise to a family of twelve scissors-congruent but non-congruent tetrahedra.
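For reference, the Ponzano-Regge asymptotic formula alluded to above is usually stated as (our transcription from the general literature, not from this listing):
$$\begin{Bmatrix} j_1 & j_2 & j_3 \\ j_4 & j_5 & j_6 \end{Bmatrix} \sim \frac{1}{\sqrt{12\pi V}}\,\cos\!\left(\sum_{i=1}^{6}\left(j_i+\tfrac12\right)\theta_i+\frac{\pi}{4}\right),$$
where $V$ is the volume of the Euclidean tetrahedron whose edge lengths are $j_i+\tfrac12$ and $\theta_i$ is the exterior dihedral angle at edge $i$.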
Statistical analysis of largescale structure in the Universe
 Statistical Physics and Spatial Satistics: The Art of Analyzing and Modeling Spatial Structures and Pattern Formation. Lecture Notes in Physics 554
, 2000
Abstract

Cited by 10 (2 self)
Methods for the statistical characterization of the large-scale structure in the Universe are the main topic of the present text. The focus is on geometrical methods, mainly Minkowski functionals and the J-function. Their relations to standard methods used in cosmology and spatial statistics, and their application to cosmological data sets, will be discussed. A short introduction to the standard picture of cosmology is given.
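For reference, the J-function mentioned above is, in standard spatial-statistics notation (a definition from the general literature, not extracted from this text), $J(r) = \frac{1 - G(r)}{1 - F(r)}$, where $F$ is the empty-space function (the distribution of the distance from a fixed location to the nearest point of the pattern) and $G$ is the nearest-neighbour distance distribution; $J \equiv 1$ for a Poisson process, with $J < 1$ indicating clustering and $J > 1$ regularity.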