Results 1 – 10 of 5,302
M-tree: An Efficient Access Method for Similarity Search in Metric Spaces
, 1997
"... A new access method, called M-tree, is proposed to organize and search large data sets from a generic "metric space", i.e. where object proximity is only defined by a distance function satisfying the positivity, symmetry, and triangle inequality postulates. We detail algorithms for insertion o ..."
Abstract

Cited by 663 (38 self)
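The triangle-inequality pruning that the M-tree abstract relies on can be sketched with a flat range query; the pivot-based filter below is a minimal illustration of the metric-space idea, not the paper's tree-structured algorithm, and all names are made up for the sketch.

```python
def range_query(objects, dist, q, r, pivot=None):
    """Range search in a generic metric space: return every object o
    with dist(q, o) <= r.  When distances d(o, p) to a pivot p are
    known, the triangle inequality gives |d(q, p) - d(o, p)| <= d(q, o),
    so any o with |d(q, p) - d(o, p)| > r can be skipped without
    computing dist(q, o) -- the pruning M-tree applies hierarchically."""
    if pivot is not None:
        d_qp = dist(q, pivot)
        d_op = {o: dist(o, pivot) for o in objects}  # precomputed in practice
    results = []
    for o in objects:
        if pivot is not None and abs(d_qp - d_op[o]) > r:
            continue  # pruned: dist(q, o) must exceed r
        if dist(q, o) <= r:
            results.append(o)
    return results
```

Any distance satisfying the positivity, symmetry, and triangle inequality postulates works here, e.g. edit distance on strings or Euclidean distance on vectors.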
The k-core and branching processes
 Combinatorics, Probability & Computing
"... The k-core of a graph G is the maximal subgraph of G having minimum degree at least k. In 1996, Pittel, Spencer and Wormald found the threshold λc for the emergence of a nontrivial k-core in the random graph G(n, λ/n), and the asymptotic size of the k-core above the threshold. We give a new proof o ..."
Abstract

Cited by 11 (1 self)
of this result using a local coupling of the graph to a suitable branching process. This proof extends to a general model of inhomogeneous random graphs with independence between the edges. As an example, we study the k-core in a certain power-law or ‘scale-free’ graph with a parameter c controlling the overall
Streaming Algorithms for k-core Decomposition
"... A k-core of a graph is a maximal connected subgraph in which every vertex is connected to at least k vertices in the subgraph. k-core decomposition is often used in large-scale network analysis, such as community detection, protein function prediction, visualization, and solving NP-Hard problems on ..."
Abstract

Cited by 8 (2 self)
k-core decomposition: a tool for the visualization of large scale networks
"... We use the k-core decomposition to visualize large scale complex networks. This decomposition, based on a recursive pruning of the least connected vertices, allows us to disentangle the hierarchical structure of networks by progressively focusing on their central cores. By using this strategy we develo ..."
Abstract

Cited by 55 (2 self)
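The recursive pruning that the k-core entries above describe (repeatedly deleting vertices of degree below k) can be sketched in a few lines of Python; this is a minimal illustration of the standard definition, not the implementation of any of the listed papers.

```python
from collections import defaultdict

def k_core_vertices(edges, k):
    """Return the vertex set of the k-core of an undirected graph:
    repeatedly remove vertices of degree < k until every remaining
    vertex has at least k neighbours (or nothing remains)."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if len(adj[v]) < k:
                for u in adj[v]:
                    adj[u].discard(v)  # detach v from its neighbours
                del adj[v]
                changed = True
    return set(adj)
```

Repeating this for k = 1, 2, 3, ... yields the nested sequence of cores that the visualization papers exploit.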
MATRIX: MAny-Task computing execution fabRIc at eXascale
"... Efficiently scheduling large numbers of jobs over large-scale distributed systems is critical to achieving high system utilization and throughput. Most current job management systems (JMS) have a centralized Master/Slaves architecture that has inherent limitations, such as scalability issues at extr ..."
Abstract

Cited by 2 (2 self)
algorithm for distributed load balancing, and distributed hash tables for managing task metadata. MATRIX supports many-task computing (MTC) workloads with or without task dependencies in the execution of complex large-scale workflows. MATRIX has shown throughput as high as 54.4K tasks/sec at 4K-core scales
ZHT: a Zero-hop DHT for High-End Computing Environment
"... One critical component of future file systems for high-end computing is metadata management. This work presents ZHT, a zero-hop distributed hash table, which has been tuned for the requirements of HEC systems. ZHT aims to be a building block for future distributed file systems to implement distribu ..."
Abstract
, ranging from a Linux cluster to an IBM BlueGene/P supercomputer. We scaled ZHT up to 16K processes and achieved 4M operations/sec throughput. Latencies have scaled similarly well, with sub-millisecond latencies at 4K-core scales. We compared ZHT against other systems and found it offers superior
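The "zero-hop" idea in the ZHT entry above is that every node holds the full membership table, so the server owning a key is computed locally in one step instead of being routed over multiple hops as in classic DHTs. The modulo mapping below is an illustrative stand-in, not ZHT's actual partitioning scheme.

```python
import hashlib

def lookup_node(key, members):
    """Zero-hop lookup sketch: with the complete member list known to
    every node, hashing the key locally identifies the responsible
    server -- no multi-hop routing is needed."""
    h = int(hashlib.sha1(key.encode("utf-8")).hexdigest(), 16)
    return members[h % len(members)]
```

Every node holding the same member list resolves a key to the same server, so a get or put costs a single network round-trip.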
Large scale networks fingerprinting and visualization using the k-core decomposition
 Advances in Neural Information Processing Systems 18
, 2006
"... decomposition ..."
MATRIX: MAny-Task computing execution fabRIc at eXascales. http://datasys.cs.iit.edu/projects/MATRIX/index.html
Figure 4: Resource consumption comparison among SimMatrix, SimGrid, and GridSim
, 2013
"... Efficiently scheduling large numbers of jobs over large-scale distributed systems is critical to achieving high system utilization and throughput. Today’s state-of-the-art job management systems have predominantly Master/Slaves architectures, which have inherent limitations, such as scalability issue ..."
Abstract

Cited by 9 (9 self)
as task dependencies in the execution of complex large-scale workflows. We have evaluated it using synthetic workloads up to 4K cores on an IBM Blue Gene/P supercomputer, and have shown high efficiency rates (e.g. 85%+) are possible with certain workloads with task granularities as low as 64 ms. MATRIX has
Improved Localization of Cortical Activity by Combining EEG and MEG with MRI Cortical Surface Reconstruction: A Linear Approach
 J. Cogn. Neurosci
, 1993
"... We describe a comprehensive linear approach to the problem of imaging brain activity with high temporal as well as spatial resolution based on combining EEG and MEG data with anatomical constraints derived from MRI images. The "inverse problem" of estimating the distribution of dipole st ..."
Abstract

Cited by 263 (19 self)
is recursively flood-filled to determine the topology of the gray/white matter border, and (3) the resulting continuous surface is refined by relaxing it against the original 3D grayscale image using a deformable template method, which is also used to computationally flatten the cortex for easier viewing
k-core decomposition: A tool for the analysis of large scale Internet graphs. arXiv:cs.NI/0511007
"... Abstract. We use the k-core decomposition, based on a recursive pruning of the least connected vertices, to study large scale Internet graphs at the Autonomous System level. This approach allows the characterization of progressively central cores of networks, conveniently uncovering hierarchical and ..."
Abstract

Cited by 19 (2 self)