Results 1 – 10 of 15,885
On Spectral Clustering: Analysis and an algorithm
 ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS, 2001
Cited by 1713 (13 self)
"... Despite many empirical successes of spectral clustering methods (algorithms that cluster points using eigenvectors of matrices derived from the distances between the points), there are several unresolved issues. First, there is a wide variety of algorithms that use the eigenvectors in slightly ..."
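The snippet above describes clustering points with eigenvectors of a matrix derived from pairwise distances. A minimal numpy sketch of that idea follows; it is illustrative only, not the paper's exact algorithm (which stacks k eigenvectors, row-normalizes them, and runs k-means). The data, kernel width, and median-based split here are assumed choices.

```python
import numpy as np

# Two well-separated 2-D clusters of 20 points each (illustrative data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(5.0, 0.1, (20, 2))])

# Gaussian affinity from pairwise squared distances (sigma chosen by hand).
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-d2 / (2 * 1.0 ** 2))
np.fill_diagonal(A, 0.0)

# Symmetrically normalized affinity D^{-1/2} A D^{-1/2}.
dinv = 1.0 / np.sqrt(A.sum(1))
M = dinv[:, None] * A * dinv[None, :]

# The eigenvector for the second-largest eigenvalue separates the clusters;
# splitting at its median assigns one label per cluster.
w, V = np.linalg.eigh(M)
v2 = V[:, -2]
labels = (v2 > np.median(v2)).astype(int)

assert (labels[:20] == labels[0]).all()
assert (labels[20:] == labels[20]).all()
assert labels[0] != labels[20]
```

The median split works for two clusters of equal size; the general k-cluster case needs the full eigenvector-embedding-plus-k-means pipeline the paper analyzes.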
Laplacian eigenmaps and spectral techniques for embedding and clustering.
 Proceedings of Neural Information Processing Systems, 2001
Cited by 668 (7 self)
"... Drawing on the correspondence between the graph Laplacian, the Laplace-Beltrami operator on a manifold, and the connections to the heat equation, we propose a geometrically motivated algorithm for constructing a representation for data sampled from a low-dimensional manifold embedded in ... of the manifold on which the data may possibly reside. Recently, there has been some interest (Tenenbaum et al., 2000) ... The core algorithm is very simple, has a few local computations and one sparse eigenvalue problem. The solution reflects the intrinsic geometric structure of the manifold. The justification ..."
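The "few local computations and one sparse eigenvalue problem" above can be sketched with numpy on a toy example: points sampled along a 1-D curve in 3-D, a neighborhood graph, and the low-order eigenvector of the graph Laplacian as the embedding. The helix data and chain-graph construction are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Data on a 1-D curve embedded in 3-D: evenly spaced samples of a helix.
n = 30
t = np.linspace(0, 4 * np.pi, n)
X = np.column_stack([np.cos(t), np.sin(t), t])

# Neighborhood graph: connect consecutive samples (for evenly spaced
# points on the curve these are exactly the nearest neighbors).
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

L = np.diag(A.sum(1)) - A      # unnormalized graph Laplacian
w, V = np.linalg.eigh(L)
f = V[:, 1]                    # eigenvector of the smallest nonzero eigenvalue

# The 1-D embedding recovers the ordering of the points along the curve:
# f is strictly monotone (up to an overall sign).
diffs = np.diff(f)
assert (diffs > 0).all() or (diffs < 0).all()
```

For real data the graph is built from k-nearest neighbors with heat-kernel weights and the eigenproblem is solved sparsely; the dense `eigh` call here is just for the small example.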
Multiobjective Optimization Using Nondominated Sorting in Genetic Algorithms
 Evolutionary Computation, 1994
Cited by 539 (5 self)
"... In trying to solve multiobjective optimization problems, many traditional methods scalarize the objective vector into a single objective. In those cases, the obtained solution is highly sensitive to the weight vector used in the scalarization process and requires the user to have knowledge about ..."
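The alternative to scalarization that the title names is nondominated sorting: ranking solutions into successive Pareto fronts instead of collapsing objectives into one number. A minimal sketch of that ranking step (a hypothetical simplification, not the paper's full NSGA procedure with fitness sharing) is:

```python
# Nondominated sorting sketch for minimization problems.
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_fronts(points):
    """Partition point indices into successive Pareto fronts."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i]) for j in remaining)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

pts = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
print(nondominated_fronts(pts))  # → [[0, 1, 2], [3], [4]]
```

Points (1,5), (2,2), (5,1) are mutually nondominated and form the first front; (3,3) is dominated only by (2,2), and (4,4) by both (2,2) and (3,3), so they land in later fronts.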
Progressive Meshes
Cited by 1315 (11 self)
"... Highly detailed geometric models are rapidly becoming commonplace in computer graphics. These models, often represented as complex triangle meshes, challenge rendering performance, transmission bandwidth, and storage capacities. This paper introduces the progressive mesh (PM) representation, a new scheme for storing and transmitting arbitrary triangle meshes. This efficient, lossless, continuous-resolution representation addresses several practical problems in graphics: smooth geomorphing of level-of-detail approximations, progressive transmission, mesh compression, and selective refinement ..."
Finding community structure in networks using the eigenvectors of matrices
2006
Cited by 502 (0 self)
"... We consider the problem of detecting communities or modules in networks: groups of vertices with a higher-than-average density of edges connecting them. Previous work indicates that a robust approach to this problem is the maximization of the benefit function known as “modularity” over possible div ..."
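The eigenvector approach the title refers to can be sketched in a few lines: split a graph by the signs of the leading eigenvector of the modularity matrix B = A − k kᵀ/(2m). The two-triangle toy graph below is an assumed example, not one from the paper.

```python
import numpy as np

# Toy graph with two obvious communities: triangles {0,1,2} and {3,4,5}
# joined by the single bridge edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

k = A.sum(1)                      # degrees
m = k.sum() / 2                   # number of edges
B = A - np.outer(k, k) / (2 * m)  # modularity matrix

# Bisect by the signs of the eigenvector of B's largest eigenvalue.
w, V = np.linalg.eigh(B)
group = V[:, -1] > 0

assert group[0] == group[1] == group[2]
assert group[3] == group[4] == group[5]
assert group[0] != group[3]
```

Repeated bisection of each part (with a correction to B for subgraphs) yields more than two communities; this sketch shows only the single split.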
Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ¹ minimization
 Proc. Natl. Acad. Sci. USA 100, 2197–2202, 2002
Cited by 633 (38 self)
"... Given a ‘dictionary’ D = {dk} of vectors dk, we seek to represent a signal S as a linear combination S = ∑k γ(k)dk, with scalar coefficients γ(k). In particular, we aim for the sparsest representation possible. In general, this requires a combinatorial optimization process. Previous work considered ..."
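The ℓ¹ relaxation named in the title replaces the combinatorial search with the convex problem min ‖γ‖₁ subject to Dγ = S (basis pursuit), which is a linear program after splitting γ = u − v with u, v ≥ 0. A small sketch using scipy's LP solver, with an assumed random dictionary and 2-sparse signal:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
n, m = 8, 20
D = rng.normal(size=(n, m))
D /= np.linalg.norm(D, axis=0)        # unit-norm atoms d_k

gamma_true = np.zeros(m)
gamma_true[[3, 11]] = [1.5, -2.0]     # a 2-sparse representation
S = D @ gamma_true

# min sum(u) + sum(v)  s.t.  D (u - v) = S,  u, v >= 0.
c = np.ones(2 * m)
A_eq = np.hstack([D, -D])
res = linprog(c, A_eq=A_eq, b_eq=S, bounds=(0, None))
gamma = res.x[:m] - res.x[m:]

assert res.status == 0
assert np.allclose(D @ gamma, S, atol=1e-6)
# gamma_true is feasible, so the LP optimum cannot have larger l1 norm.
assert np.abs(gamma).sum() <= np.abs(gamma_true).sum() + 1e-6
```

The paper's contribution is the condition (dictionary incoherence relative to sparsity) under which this LP provably returns the sparsest representation itself.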
Secure spread spectrum watermarking for multimedia
 IEEE TRANSACTIONS ON IMAGE PROCESSING, 1997
Cited by 1100 (10 self)
"... This paper presents a secure (tamper-resistant) algorithm for watermarking images, and a methodology for digital watermarking that may be generalized to audio, video, and multimedia data. We advocate that a watermark should be constructed as an independent and identically distributed (i.i.d.) Gaussian random vector that is imperceptibly inserted in a spread-spectrum-like fashion into the perceptually most significant spectral components of the data. We argue that insertion of a watermark under this regime makes the watermark robust to signal processing operations (such as lossy compression ..."
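The insertion rule described above can be sketched on a 1-D signal: scale the largest-magnitude spectral components by (1 + α·xᵢ) for an i.i.d. Gaussian watermark x, then detect by correlating the extracted residual with x. This is a minimal sketch, not the paper's scheme (which works on the DCT of an image with perceptual scaling); the FFT, α, and k are assumed choices.

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.normal(size=1024)
F = np.fft.rfft(signal)

k, alpha = 64, 0.1
idx = np.argsort(-np.abs(F))[:k]   # k most significant spectral components
x = rng.normal(size=k)             # the i.i.d. Gaussian watermark
Fw = F.copy()
Fw[idx] = F[idx] * (1 + alpha * x) # multiplicative spread-spectrum insertion
watermarked = np.fft.irfft(Fw, n=1024)

# Detection: recover the residual at the marked components and correlate.
G = np.fft.rfft(watermarked)
extracted = (G[idx] / F[idx]).real - 1.0

def sim(a, b):
    return a @ b / np.sqrt((a @ a) * (b @ b))

assert sim(extracted, alpha * x) > 0.9          # true watermark detected
assert abs(sim(extracted, rng.normal(size=k))) < 0.5  # random mark rejected
```

Spreading the mark across many significant components is what gives robustness: an attacker must damage exactly the perceptually important content to remove it.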
Consensus and cooperation in networked multiagent systems
 Proceedings of the IEEE, 2007
Cited by 807 (4 self)
"... This paper provides a theoretical framework for analysis of consensus algorithms for multiagent networked systems with an emphasis on the role of directed information flow, robustness to changes in network topology due to link/node failures, time delays, and performance guarantees. An overview of basic concepts of information consensus in networks and of methods of convergence and performance analysis for the algorithms is provided. Our analysis framework is based on tools from matrix theory, algebraic graph theory, and control theory. We discuss the connections between consensus problems ..."
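The basic consensus iteration the abstract alludes to is x(t+1) = x(t) − ε L x(t), where L is the graph Laplacian: on a connected undirected graph with a small enough step size, all agents converge to the average of the initial states. A minimal sketch on an assumed 4-node graph:

```python
import numpy as np

# A small connected undirected network of 4 agents.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(1)) - A          # graph Laplacian

x = np.array([1.0, 5.0, -2.0, 4.0])  # initial agent states
avg = x.mean()

eps = 0.2                          # step size below 2 / lambda_max(L)
for _ in range(200):
    x = x - eps * (L @ x)          # each agent moves toward its neighbors

assert np.allclose(x, avg)         # consensus on the initial average
```

Convergence speed is governed by the algebraic connectivity (second-smallest Laplacian eigenvalue), one of the spectral-graph-theory tools the paper surveys.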
The Dantzig selector: statistical estimation when p is much larger than n
2005
Cited by 879 (14 self)
"... In many important statistical applications, the number of variables or parameters p is much larger than the number of observations n. Suppose then that we have observations y = Ax + z, where x ∈ R^p is a parameter vector of interest, A is a data matrix with possibly far fewer rows than columns, n ≪ p, and the z_i's are i.i.d. N(0, σ²). Is it possible to estimate x reliably based on the noisy data y? To estimate x, we introduce a new estimator, which we call the Dantzig selector, the solution to the ℓ₁-regularization problem

    min_{x̃ ∈ R^p} ‖x̃‖_{ℓ₁}   subject to   ‖Aᵀ r‖_{ℓ∞} ≤ (1 + t⁻¹) √(2 log p) · σ
..."
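The problem above is itself a linear program: with the split x̃ = u − v, u, v ≥ 0, the sup-norm constraint on the correlated residual Aᵀ(y − Ax̃) becomes two sets of linear inequalities. A sketch with scipy's LP solver; the data sizes, σ, and the threshold choice (t = 1) are assumed for illustration.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
n, p, sigma = 40, 100, 0.05
A = rng.normal(size=(n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[[5, 37, 80]] = [2.0, -1.0, 1.5]     # a sparse parameter vector
y = A @ x_true + sigma * rng.normal(size=n)

# lam = (1 + 1/t) * sqrt(2 log p) * sigma with t = 1.
lam = 2 * np.sqrt(2 * np.log(p)) * sigma

# |A^T y - A^T A (u - v)| <= lam, componentwise, as two inequality blocks.
AtA, Aty = A.T @ A, A.T @ y
M = np.hstack([AtA, -AtA])
A_ub = np.vstack([M, -M])
b_ub = np.concatenate([Aty + lam, lam - Aty])

res = linprog(np.ones(2 * p), A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
x_hat = res.x[:p] - res.x[p:]

assert res.status == 0
# The solution satisfies the Dantzig-selector constraint.
assert np.abs(A.T @ (y - A @ x_hat)).max() <= lam + 1e-6
```

The paper's result is that, for sparse x and suitable A, this estimator achieves near-oracle squared error despite n ≪ p.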
Methodologies in spectral analysis of large dimensional random matrices, a review
 STATIST. SINICA, 1999
Cited by 454 (39 self)
"... In this paper, we give a brief review of the theory of spectral analysis of large dimensional random matrices. Most of the existing work in the literature has been stated for real matrices, but the corresponding results for the complex case are also of interest, especially for researchers in Electri ..."
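A quick empirical illustration of the kind of result such reviews cover: the eigenvalues of a sample covariance matrix (1/n) X Xᵀ, with X a p × n matrix of i.i.d. unit-variance entries and aspect ratio p/n fixed, concentrate on the Marchenko–Pastur support [(1 − √(p/n))², (1 + √(p/n))²]. The sizes and tolerance margins below are assumed for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 200, 400                     # aspect ratio p/n = 0.5
X = rng.normal(size=(p, n))
S = X @ X.T / n                     # sample covariance matrix
eigs = np.linalg.eigvalsh(S)

# Marchenko-Pastur support edges for ratio p/n.
lo = (1 - np.sqrt(p / n)) ** 2      # ≈ 0.086
hi = (1 + np.sqrt(p / n)) ** 2      # ≈ 2.914

# Finite-size fluctuations push eigenvalues only slightly past the edges.
assert eigs.min() > lo - 0.1
assert eigs.max() < hi + 0.2
```

For complex entries the same limiting spectrum holds, which is part of why the review emphasizes carrying real-case results over to the complex case.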