Results 1 – 10 of 150,507
A fast and high quality multilevel scheme for partitioning irregular graphs
SIAM Journal on Scientific Computing, 1998
Cited by 1189 (15 self)
"... Recently, a number of researchers have investigated a class of graph partitioning algorithms that reduce the size of the graph by collapsing vertices and edges, partition the smaller graph, and then uncoarsen it to construct a partition for the original graph [Bui and Jones, Proc. ..."
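The coarsen / partition / uncoarsen pipeline that snippet describes can be sketched in a few lines. This toy version (function names are mine) collapses a naive vertex matching for coarsening and bisects the coarse graph trivially; the paper's actual heuristics, such as heavy-edge matching and refinement during uncoarsening, are far more careful, so treat this purely as an illustration of the three phases.

```python
def coarsen(n, edges):
    """Merge each vertex with one unmatched neighbour (a simple matching).
    Returns (n_coarse, coarse_edges, vmap) where vmap[v] is v's coarse vertex."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    vmap, nxt = [-1] * n, 0
    for u in range(n):
        if vmap[u] != -1:
            continue
        vmap[u] = nxt
        for v in adj[u]:                 # collapse one incident edge
            if vmap[v] == -1:
                vmap[v] = nxt
                break
        nxt += 1
    coarse = {(min(vmap[u], vmap[v]), max(vmap[u], vmap[v]))
              for u, v in edges if vmap[u] != vmap[v]}
    return nxt, sorted(coarse), vmap

def bisect(n):
    """Trivial bisection of the (small) coarse graph: first half vs second."""
    return [0 if v < n // 2 else 1 for v in range(n)]

def partition(n, edges, coarse_limit=4):
    """Recursively coarsen until small, bisect, then project the labels back."""
    if n <= coarse_limit:
        return bisect(n)
    nc, cedges, vmap = coarsen(n, edges)
    if nc == n:                          # no edge could be collapsed
        return bisect(n)
    cpart = partition(nc, cedges, coarse_limit)
    return [cpart[vmap[v]] for v in range(n)]   # uncoarsen: copy coarse label

# 6-cycle: 0-1-2-3-4-5-0
part = partition(6, [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)])
print(part)
```

A real multilevel partitioner would balance part sizes and refine the cut at every uncoarsening level; this sketch only shows the structure of the recursion.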
Image Quality Assessment: From Error Visibility to Structural Similarity
IEEE Transactions on Image Processing, 2004
Cited by 1499 (114 self)
"... Objective methods for assessing perceptual image quality have traditionally attempted to quantify the visibility of errors between a distorted image and a reference image using a variety of known properties of the human visual system. Under the assumption that human visual perception is highly adapt ..."
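The structural-similarity index this paper introduces combines luminance, contrast, and structure comparisons into one statistic. A minimal sketch, computing the standard SSIM formula over the whole image at once with the commonly used stabilising constants C1 = (0.01 L)^2 and C2 = (0.03 L)^2; note the paper applies this locally in sliding windows and averages the local values, which this global version omits.

```python
import numpy as np

def global_ssim(x, y, L=255.0):
    """Whole-image SSIM statistic: ((2*mx*my + C1)(2*cov + C2)) /
    ((mx^2 + my^2 + C1)(vx + vy + C2)). The paper's index evaluates
    this in local windows and averages; a single global value is the
    simplest illustration."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    C1, C2 = (0.01 * L) ** 2, (0.03 * L) ** 2   # stabilising constants
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (32, 32))
print(global_ssim(img, img))        # identical images score exactly 1.0
```

An image compared against itself has equal means, equal variances, and covariance equal to the variance, so numerator and denominator coincide and the index is 1.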
Marching cubes: A high resolution 3D surface construction algorithm
Computer Graphics, 1987
Cited by 2696 (4 self)
"... We present a new algorithm, called marching cubes, that creates triangle models of constant density surfaces from 3D medical data. Using a divide-and-conquer approach to generate inter-slice connectivity, we create a case table that defines triangle topology. The algorithm processes the 3D medical d ..."
"... slice connectivity, surface data, and gradient information present in the original 3D data. Results from computed tomography (CT), magnetic resonance (MR), and single-photon emission computed tomography (SPECT) illustrate the quality and functionality of marching cubes. We also discuss improvements that decrease ..."
Robust Uncertainty Principles: Exact Signal Reconstruction From Highly Incomplete Frequency Information
2006
Cited by 2632 (50 self)
"... This paper considers the model problem of reconstructing an object from incomplete frequency samples. Consider a discrete-time signal f and a randomly chosen set of frequencies Ω. Is it possible to reconstruct f from the partial knowledge of its Fourier coefficients on the set Ω? A typical result of this paper is as follows. Suppose that f is a superposition of |T| spikes, f(t) = Σ_{τ∈T} f(τ) δ(t − τ), obeying |T| ≤ C_M · (log N)^(−1) · |Ω| for some constant C_M > 0. We do not know the locations of the spikes nor their amplitudes. Then with probability at least 1 − O(N^(−M)), f can be reconstructed exactly as the solution to the ℓ1 minimization problem min_g Σ_t |g(t)| subject to ĝ(ω) = f̂(ω) for all ω ∈ Ω ..."
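The ℓ1 minimization at the heart of this result is a linear program. A minimal sketch of the recovery principle, with two simplifying assumptions of mine: real Gaussian measurements stand in for the paper's random Fourier samples (keeping everything real-valued), and SciPy's general-purpose LP solver replaces a dedicated basis-pursuit routine.

```python
import numpy as np
from scipy.optimize import linprog

def l1_recover(A, b):
    """Basis pursuit: min ||x||_1 subject to A x = b, posed as a linear
    program over variables (x, t) with -t <= x <= t, minimizing sum(t)."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])     # objective: sum of t
    # x - t <= 0 and -x - t <= 0 together encode |x_i| <= t_i
    A_ub = np.block([[np.eye(n), -np.eye(n)],
                     [-np.eye(n), -np.eye(n)]])
    b_ub = np.zeros(2 * n)
    A_eq = np.hstack([A, np.zeros((m, n))])           # measurements touch x only
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                  bounds=[(None, None)] * n + [(0, None)] * n,
                  method="highs")
    return res.x[:n]

rng = np.random.default_rng(0)
n, m = 32, 15
x_true = np.zeros(n)
x_true[[3, 17]] = [2.0, -1.5]            # two spikes, locations "unknown"
A = rng.standard_normal((m, n))          # random real-valued measurements
x_rec = l1_recover(A, A @ x_true)
print(np.round(x_rec[[3, 17]], 3))
```

With only 15 of 32 "samples", the ℓ1 solution recovers both spike locations and amplitudes, which is the phenomenon the theorem quantifies for Fourier sampling.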
Actions as space-time shapes
In ICCV, 2005
Cited by 651 (4 self)
"... Human action in video sequences can be seen as silhouettes of a moving torso and protruding limbs undergoing articulated motion. We regard human actions as three-dimensional shapes induced by the silhouettes in the space-time volume. We adopt a recent approach [14] for analyzing 2D shapes and genera ..."
"... and viewpoint, high irregularities in the performance of an action, and low-quality video. ..."
TABU SEARCH
Cited by 822 (48 self)
"... Tabu Search is a metaheuristic that guides a local heuristic search procedure to explore the solution space beyond local optimality. One of the main components of tabu search is its use of adaptive memory, which creates a more flexible search behavior. Memory-based strategies are therefore the hallm ..."
"... algorithms based on the tabu search. The experimentation shows that the procedures provide high-quality solutions to the training problem, and in addition consume a reasonable computational effort. ..."
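The adaptive-memory idea in the snippet above is easy to show on a toy problem. This sketch (problem choice, tenure, and names all mine) runs tabu search on max-cut: the neighbourhood is a single vertex flip, recently flipped vertices are tabu for a few iterations so the search can climb out of local optima, and an aspiration rule lets a tabu move through if it beats the best cut seen so far.

```python
import random

def tabu_maxcut(n, edges, iters=200, tenure=5, seed=0):
    """Tabu search for max-cut: neighbourhood = flip one vertex's side.
    Recently flipped vertices are tabu unless the move beats the best cut."""
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in range(n)]

    def cut(s):
        return sum(1 for u, v in edges if s[u] != s[v])

    best, best_side = cut(side), side[:]
    tabu = {}                                  # vertex -> iteration it frees up
    for it in range(iters):
        cand = None
        for v in range(n):
            side[v] ^= 1                       # try flipping v
            c = cut(side)
            side[v] ^= 1                       # undo
            # aspiration: a tabu move is allowed only if it improves the best
            if tabu.get(v, -1) > it and c <= best:
                continue
            if cand is None or c > cand[1]:
                cand = (v, c)
        if cand is None:
            continue                           # every move tabu this round
        v, c = cand
        side[v] ^= 1                           # accept best admissible move
        tabu[v] = it + tenure                  # forbid re-flipping v for a while
        if c > best:
            best, best_side = c, side[:]
    return best, best_side

# 5-cycle: the optimal cut has 4 edges
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
best, _ = tabu_maxcut(5, edges)
print(best)
```

The key point the abstract makes is the memory: a plain best-improvement local search would stall at a local optimum, while the tabu list forces it onward and the aspiration criterion keeps the memory from blocking genuinely better solutions.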
Cluster Ensembles - A Knowledge Reuse Framework for Combining Multiple Partitions
Journal of Machine Learning Research, 2002
Cited by 603 (20 self)
"... This paper introduces the problem of combining multiple partitionings of a set of objects into a single consolidated clustering without accessing the features or algorithms that determined these partitionings. We first identify several application scenarios for the resultant 'knowledge reuse' framework that we call cluster ensembles. The cluster ensemble problem is then formalized as a combinatorial optimization problem in terms of shared mutual information. In addition to a direct maximization approach, we propose three effective and efficient techniques for obtaining high-quality combiners ..."
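To see what "combining partitions without the features" means concretely, here is the simplest consensus combiner: count how often each pair of objects lands in the same cluster across the input partitions, then merge pairs that co-occur often. This co-association matrix underlies CSPA-style combiners, but it is not one of the paper's three algorithms; the sketch and its names are mine.

```python
import numpy as np

def coassociation_consensus(labelings, threshold=0.5):
    """Evidence-accumulation consensus: build the co-association matrix
    (fraction of partitions in which each pair is co-clustered), then take
    connected components of pairs co-clustered more than `threshold` of
    the time. Only cluster labels are used, never the original features."""
    labelings = np.asarray(labelings)       # shape: (n_partitions, n_objects)
    r, n = labelings.shape
    co = sum((lab[:, None] == lab[None, :]).astype(float)
             for lab in labelings) / r
    # union-find over pairs whose co-association exceeds the threshold
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if co[i, j] > threshold:
                parent[find(i)] = find(j)
    roots = [find(i) for i in range(n)]
    relabel = {root: k for k, root in enumerate(dict.fromkeys(roots))}
    return [relabel[rt] for rt in roots]

# three partitions that mostly agree: objects {0,1,2} vs {3,4}
parts = [[0, 0, 0, 1, 1],
         [1, 1, 1, 0, 0],
         [0, 0, 1, 1, 1]]
print(coassociation_consensus(parts))
```

Note that the second partition uses the opposite label names yet contributes the same pairwise evidence, which is exactly why consensus methods work at the level of co-clustered pairs rather than raw labels.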
Clustering by passing messages between data points
Science, 2007
Cited by 696 (8 self)
"... Clustering data by identifying a subset of representative examples is important for processing sensory signals and detecting patterns in data. Such "exemplars" can be found by randomly choosing an initial subset of data points and then iteratively refining it, but this works well only if that initial choice is close to a good solution. We devised a method called "affinity propagation," which takes as input measures of similarity between pairs of data points. Real-valued messages are exchanged between data points until a high-quality set of exemplars and corresponding clusters gradually emerges ..."
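The "real-valued messages" are responsibilities and availabilities, updated from the similarity matrix alone. This sketch implements those two update rules with damping; the vectorization and the diag(A+R) > 0 exemplar criterion are common implementation conventions, and the convergence checks of production implementations are omitted.

```python
import numpy as np

def affinity_propagation(S, damping=0.5, iters=100):
    """Responsibility/availability message passing on a similarity matrix S.
    S[k, k] (the 'preference') controls how many exemplars emerge."""
    n = S.shape[0]
    R = np.zeros((n, n))                       # responsibilities r(i, k)
    A = np.zeros((n, n))                       # availabilities  a(i, k)
    for _ in range(iters):
        # r(i,k) <- s(i,k) - max_{k' != k} [ a(i,k') + s(i,k') ]
        AS = A + S
        idx = AS.argmax(axis=1)
        first = AS[np.arange(n), idx]
        AS[np.arange(n), idx] = -np.inf        # mask the max, find runner-up
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[np.arange(n), idx] = S[np.arange(n), idx] - second
        R = damping * R + (1 - damping) * Rnew
        # a(i,k) <- min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())     # keep r(k,k) itself in the sum
        Anew = Rp.sum(axis=0)[None, :] - Rp
        dA = Anew.diagonal().copy()            # a(k,k) is not capped at 0
        Anew = np.minimum(Anew, 0)
        np.fill_diagonal(Anew, dA)
        A = damping * A + (1 - damping) * Anew
    exemplars = np.flatnonzero(np.diag(A + R) > 0)
    labels = np.argmax(S[:, exemplars], axis=1) if len(exemplars) else None
    return exemplars, labels

# two well-separated 1-D clusters; similarity = negative squared distance
points = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
S = -(points[:, None] - points[None, :]) ** 2
np.fill_diagonal(S, np.median(S))     # shared 'preference' for every point
exemplars, labels = affinity_propagation(S)
print(exemplars, labels)
```

Because every point is a candidate exemplar with the same preference, the method needs no initial guess at the exemplar set, which is the contrast with the "randomly choose then refine" approach the abstract mentions.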
Coarse-to-fine n-best parsing and MaxEnt discriminative reranking
In ACL, 2005
Cited by 522 (15 self)
"... Discriminative reranking is one method for constructing high-performance statistical parsers (Collins, 2000). A discriminative reranker requires a source of candidate parses for each sentence. This paper describes a simple yet novel method for constructing sets of 50-best parses based on a co ..."
On the Self-similar Nature of Ethernet Traffic (Extended Version)
1994
Cited by 2213 (46 self)
"... We demonstrate that Ethernet LAN traffic is statistically self-similar, that none of the commonly used traffic models is able to capture this fractal-like behavior, that such behavior has serious implications for the design, control, and analysis of high-speed, cell-based networks, and that aggregating streams of such traffic typically intensifies the self-similarity ("burstiness") instead of smoothing it. Our conclusions are supported by a rigorous statistical analysis of hundreds of millions of high quality Ethernet traffic measurements collected between 1989 and 1992, coupled with a ..."
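Self-similarity of the kind described here is conventionally summarized by the Hurst parameter H, and one standard diagnostic in this literature is the variance-time plot: for a self-similar series the variance of block means of size m decays like m^(2H-2), so the slope of log-variance versus log-m gives H. A minimal sketch (function name mine) on synthetic i.i.d. data, where H should come out near 0.5; Poisson-like traffic models behave this way, while the measured Ethernet traces in the paper do not.

```python
import math
import random

def hurst_aggregated_variance(x, block_sizes):
    """Variance-time estimate of the Hurst parameter H: fit the slope of
    log Var(block means of size m) against log m; H = 1 + slope / 2.
    H near 1 means burstiness survives aggregation; H = 0.5 is i.i.d."""
    pts = []
    for m in block_sizes:
        means = [sum(x[i:i + m]) / m for i in range(0, len(x) - m + 1, m)]
        mu = sum(means) / len(means)
        var = sum((v - mu) ** 2 for v in means) / len(means)
        pts.append((math.log(m), math.log(var)))
    # least-squares slope of log-variance against log-block-size
    n = len(pts)
    sx = sum(p[0] for p in pts)
    sy = sum(p[1] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts)
    sxy = sum(p[0] * p[1] for p in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    return 1 + slope / 2

random.seed(1)
iid = [random.random() for _ in range(100_000)]
h = hurst_aggregated_variance(iid, [1, 4, 16, 64, 256])
print(round(h, 2))   # close to 0.5 for i.i.d. data
```

For i.i.d. samples the variance of a size-m block mean is exactly sigma^2 / m, giving slope -1 and H = 0.5; the paper's point is that real LAN traffic yields a much shallower slope, i.e. H well above 0.5.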