Results 1 – 9 of 9
Distributed Learning on Very Large Data Sets
In Proceedings of the Sixth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2000
Abstract

Cited by 11 (4 self)
One approach to learning from intractably large data sets is to utilize all the training data by learning models on tractably sized subsets of the data. The subsets of data may be disjoint or partially overlapping. The individual learned models may be combined into a single model, or a voting approach may be used to combine the classifications of a set of models. An approach to learning models in parallel from arbitrarily large training data sets and combining them into a classifier is described. The training sets are disjoint in the work described here. A parallel implementation on the DOE's ASCI Red parallel supercomputer is described. Results with data sets small enough to be handled by a single processor show that data sets can be divided into a moderate number of distinct subsets without degrading classifier accuracy. Speedup results are shown for a parallel implementation on the ASCI Red with data sets too large to be handled on a single processor. Training sets of size 3 to 50 millio...
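The subset-then-vote scheme the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's ASCI Red implementation: the "base learner" below (a majority-class predictor) and all function names are placeholders for whatever model is trained on each disjoint subset.

```python
# Sketch of distributed learning on disjoint subsets with voting.
# train_on_subset, partition, and vote are illustrative names, not from the paper.
from collections import Counter

def partition(data, k):
    """Split the training data into k disjoint subsets (round-robin)."""
    return [data[i::k] for i in range(k)]

def train_on_subset(subset):
    """Toy base learner: 'learns' the majority label of its subset.
    Stands in for any real classifier trained on a tractable subset."""
    labels = [label for _, label in subset]
    return Counter(labels).most_common(1)[0][0]

def vote(model_predictions):
    """Combine the per-subset models' outputs by plurality vote."""
    return Counter(model_predictions).most_common(1)[0][0]

# 60 examples labeled 'a', 40 labeled 'b'; 5 disjoint subsets.
data = [(x, 'a') for x in range(60)] + [(x, 'b') for x in range(40)]
models = [train_on_subset(s) for s in partition(data, 5)]
print(vote(models))  # each subset preserves the 60/40 mix, so 'a' wins
```

Because the subsets are disjoint, each model trains on 1/k of the data, and only the small per-subset models (not the data) need to be gathered for voting.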
De Novo Ultrascale Atomistic Simulations on High-End Parallel Supercomputers
 International Journal of High Performance Computing Applications
Abstract

Cited by 6 (3 self)
We present a de novo hierarchical simulation framework for first-principles based predictive simulations of materials and their validation on high-end parallel supercomputers and geographically distributed clusters. In this framework, high-end chemically reactive and non-reactive molecular dynamics (MD) simulations explore a wide solution space to discover microscopic mechanisms that govern macroscopic material properties, into which highly accurate quantum mechanical (QM) simulations are embedded to validate the discovered mechanisms and quantify the uncertainty of the solution. The framework includes an embedded divide-and-conquer (EDC) algorithmic framework for the design of linear-scaling simulation algorithms with minimal bandwidth complexity and tight error control. The EDC framework also enables adaptive hierarchical simulation with automated ...
Coherent Culling and Shading for Large Molecular Dynamics Visualization
In Eurographics/IEEE Symposium on Visualization, 2010
Abstract

Cited by 5 (3 self)
Molecular dynamics simulations are a principal tool for studying molecular systems. Such simulations are used to investigate molecular structure, dynamics, and thermodynamic properties, as well as a replacement for, or complement to, costly and dangerous experiments. With the increasing availability of computational power the resulting data sets are becoming ever larger, and benchmarks indicate that interactive visualization on desktop computers poses a challenge when rendering substantially more than a million glyphs. Trading visual quality for rendering performance is a common approach when interactivity has to be guaranteed. In this paper we address both problems and present a method for high-quality visualization of massive molecular dynamics data sets. We employ several optimization strategies at different levels of granularity, such as data quantization, data caching in video memory, and a two-level occlusion culling strategy: coarse culling via hardware occlusion queries and vertex-level culling using maximum depth mipmaps. To ensure optimal image quality we employ GPU ray casting and deferred shading with smooth normal vector generation. We demonstrate that our method allows us to interactively render data sets containing tens of millions of high-quality glyphs.
Approximate covering detection among content-based subscriptions using space filling curves
In IEEE International Conference on Distributed Computing Systems, 2007
Abstract

Cited by 4 (0 self)
We consider a problem that arises during the propagation of subscriptions in a content-based publish-subscribe system. Subscription covering is a promising optimization that reduces the number of subscriptions propagated, and hence the size of routing tables, in a content-based publish-subscribe system. However, detecting covering relationships among subscriptions can be an expensive computational task that potentially reduces the utility of covering as an optimization. We introduce an alternate approach, approximate subscription covering, which provides much of the benefit of subscription covering at a fraction of its cost. By forgoing an exhaustive search for covering subscriptions in favor of an approximate search, we show that the time complexity of covering detection can be dramatically reduced. The trade-off between the efficiency of covering detection and the approximation error is demonstrated through the analysis of indexes for multi-attribute subscriptions based on space filling curves.
A formal analysis of space filling curves for parallel domain decomposition
In Proc. IEEE International Conference on Parallel Processing, 2006
Abstract

Cited by 1 (1 self)
Space filling curves (SFCs) are widely used for parallel domain decomposition in scientific computing applications. The proximity preserving properties of SFCs are expected to keep most accesses local in applications that require efficient access to spatial neighborhoods. While experimental results are used to confirm this behavior, a rigorous mathematical analysis of SFCs turns out to be rather hard and is rarely attempted. In this paper, we analyze SFC-based parallel domain decomposition for a uniform random spatial distribution in three dimensions. Let n denote the expected number of points and P denote the number of processors. We show that the expected distance along an SFC to a nearest neighbor is O(n^{2/3}). We then consider the problem of answering nearest-neighbor and spherical region queries for each point. For P = n^α (0 < α ≤ 1) processors, we show that the total number of remote accesses grows as O(n^{3/4 + α/4}). This analysis shows that the expected number of total remote accesses is sublinear for any sublinear number of processors. We view the analysis presented here as a step towards the goal of understanding the utility of SFCs in scientific applications and the analysis of more complex spatial distributions. Key words: domain decomposition, parallel algorithms, probabilistic analysis, space filling curves.
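The decomposition scheme being analyzed, ordering 3D points along an SFC and cutting the curve into P contiguous pieces, can be sketched concretely. The sketch below uses the Morton (Z-order) curve, one common SFC for this purpose; the paper analyzes SFCs generally, and the function names here are illustrative, not from the paper.

```python
# Illustrative SFC-based domain decomposition using a Morton (Z-order) key.
# interleave3 and decompose are hypothetical helper names for this sketch.

def interleave3(x, y, z, bits=10):
    """Interleave the low `bits` bits of integer coords x, y, z into a Morton key."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (3 * i)       # bit i of x -> position 3i
        key |= ((y >> i) & 1) << (3 * i + 1)   # bit i of y -> position 3i+1
        key |= ((z >> i) & 1) << (3 * i + 2)   # bit i of z -> position 3i+2
    return key

def decompose(points, P, bits=10):
    """Sort points along the Z-order curve, then cut it into P contiguous blocks."""
    ordered = sorted(points, key=lambda p: interleave3(*p, bits=bits))
    n = len(ordered)
    return [ordered[p * n // P:(p + 1) * n // P] for p in range(P)]

# A 4x4x4 grid of points, decomposed over P = 4 "processors".
pts = [(x, y, z) for x in range(4) for y in range(4) for z in range(4)]
parts = decompose(pts, 4)
print([len(b) for b in parts])  # -> [16, 16, 16, 16]
```

Because the Morton key interleaves coordinate bits, nearby points tend to receive nearby keys, which is exactly the proximity-preservation that keeps most neighborhood accesses local under such a decomposition.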
A Comprehensive Integration and Analysis of Dynamic Load Balancing Architectures Within Molecular Dynamics, 2009
unknown title
Abstract
Engineering mechanics provides excellent theoretical descriptions for the rational design of materials and accurate lifetime prediction of mechanical structures. This approach deals with continuous quantities, such as the strain field, that are functions of both space and time. Constitutive relations such as Hooke's law for deformation and Coulomb's law for friction describe the relationships between these macroscopic fields. These constitutive equations contain material-specific parameters such as elastic moduli and friction coefficients, which are often size dependent. For example, the mechanical strength of materials is inversely proportional to the square root of the grain size, according to the Hall-Petch relationship. Such scaling laws are usually validated experimentally at length scales above a micron, but interest is growing in extending constitutive relations and scaling laws down to a few nanometers. This is because many experts believe that by reducing the structural scale (such as grain sizes) to the nanometer range, we can extend material properties such as strength and toughness beyond the current engineering-materials limit [1]. In addition, widespread use of nanoelectromechanical systems (NEMS) is making their durability a critical issue, to which scaling down engineering-mechanics concepts is essential. Because of the large surface-to-volume ratios in these nanoscale systems, new engineering-mechanics concepts reflecting the enhanced role of interfacial processes might even be necessary. Atomistic simulations will likely play an important role in scaling down engineering-mechanics concepts to nanometer scales. Recent advances in computational methodologies and massively parallel computers have let researchers carry out 10- to 100-million-atom atomistic simulations (the typical linear dimen...
Hardware-Accelerated Glyphs for Mono- and Dipoles in Molecular Dynamics Visualization
Abstract
We present a novel visualization method for mono- and dipolar molecular simulations from thermodynamics that takes advantage of modern graphics hardware to interactively render specifically tailored glyphs. Our approach allows domain experts to visualize the results of molecular dynamics simulations with a higher number of particles than before and furthermore offers much better visual quality. We achieve this by transferring only visualization parameters to the GPU and by generating implicit surfaces directly in the fragment program. As a result, we can render up to 500,000 glyphs at about 10 fps, displaying all the simulation results as geometrical properties that resemble the classical abstract representation used in this research area. Thus we enable researchers to visually assess the results of simulations of greater scale than before. We believe that the proposed method can be generalized to create other kinds of parametrized surfaces directly on graphics hardware to overcome the bandwidth bottleneck that exists between CPU and GPU. Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism; I.3.8 [Computer Graphics]: Applications
unknown title, 2005
Abstract
Since the early 1990s, complex phenomena such as ductile fracture, fracture of non-homogeneous materials, and fracture interaction with other physical and chemical effects have increasingly been investigated by atomistic simulation. This paper ...