Results 1–10 of 16
Terrain simplification simplified: A general framework for view-dependent out-of-core visualization
 IEEE TVCG
Global Static Indexing for Real-time Exploration of Very Large Regular Grids
, 2001
Cited by 49 (7 self)
In this paper we introduce a new indexing scheme for progressive traversal and visualization of large regular grids. We demonstrate the potential of our approach by providing a tool that displays at interactive rates planar slices of scalar field data with very modest computing resources. We obtain unprecedented results both in terms of absolute performance and, more importantly, in terms of scalability. On a laptop computer we provide real-time interaction with a 2048³ grid (8 giga-nodes) using only 20 MB of memory. On an SGI Onyx we slice interactively an 8192³ grid (teranodes) using only 60 MB of memory. The scheme relies simply on the determination of an appropriate reordering of the rectilinear grid data and a progressive construction of the output slice. The reordering minimizes the amount of I/O performed during the out-of-core computation. The progressive and asynchronous computation of the output provides flexible quality/speed trade-offs and a time-critical and interruptible user interface.
Using Space-filling Curves for Multidimensional Indexing
 Lecture Notes in Computer Science
, 2000
Cited by 37 (1 self)
This paper presents and discusses a radically different approach to multidimensional indexing based on the concept of the space-filling curve. It reports the novel algorithms which had to be developed to create the first actual implementation of a system based on this approach, on some comparative performance tests, and on its actual use within the TriStarp Group at Birkbeck to provide a Triple Store repository. An important result that goes beyond this requirement, however, is that the performance improvement over the Grid File is greater the higher the dimension.

1 Introduction. Underlying any dbms is some form of repository management system or data store. The classic and dominant model for such repositories is that of some form of logical record or data aggregate type with a collection of instances conforming to that type, usually termed a file. Such file systems are, of course, also used directly in many applications. The data model of a dbms may be radically different f...
Querying multidimensional data indexed using the Hilbert space-filling curve
 SIGMOD Record
, 2001
Cited by 27 (0 self)
Mapping to one-dimensional values and then using a one-dimensional indexing method has been proposed as a way of indexing multidimensional data. Most previous related work uses the Z-Order Curve, but more recently the Hilbert Curve has been considered since it has superior clustering properties. Any approach, however, can only be of practical value if there are effective methods for executing range and partial match queries. This paper describes such a method for the Hilbert Curve.
Calculation of mappings between one and n-dimensional values using the Hilbert space-filling curve
, 2000
Cited by 17 (1 self)
Abstract. This report reproduces and briefly discusses an algorithm proposed by Butz [2] for calculating a mapping between one-dimensional values and n-dimensional values regarded as being the coordinates of points lying on Hilbert Curves. It suggests some practical improvements to the algorithm and presents an algorithm for calculating the inverse of the mapping, from n-dimensional values to one-dimensional values.
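The report's Butz-based n-dimensional algorithm is not reproduced in this listing, but the kind of mapping it computes can be illustrated with the well-known iterative 2-D Hilbert conversion (a sketch only; the function names and the restriction to two dimensions are our assumptions, not the report's algorithm):

```python
def d2xy(n, d):
    """Map a 1-D Hilbert index d to (x, y) on an n x n grid (n a power of two)."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        # Rotate/reflect the quadrant so the sub-curve is oriented correctly.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def xy2d(n, x, y):
    """Inverse mapping: (x, y) on an n x n grid to its 1-D Hilbert index."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Apply the same rotation/reflection as the forward mapping.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s //= 2
    return d
```

Consecutive 1-D indices always map to neighboring grid cells, which is the clustering property that motivates using the Hilbert Curve for indexing in the papers above.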
Feature statistical retrieval applied to content-based copy identification
 In International Conference on Image Processing
, 2004
Cited by 15 (7 self)
In many image or video retrieval systems, the search for similar objects in the database includes a spatial access method to a multidimensional feature space. This step is generally considered as a problem independent of the features and the similarity type. The well-known multidimensional nearest neighbor search was also widely studied by the database community as a generic method. In this paper, we propose a novel strategy dedicated to pseudo-invariant feature retrieval and more specifically applied to content-based copy identification. The range of a query is computed during the search according to deviation statistics between original and observed features. Furthermore, this approximate search range is directly mapped onto a Hilbert space-filling curve, allowing efficient access to the database. Experimental results give excellent response times for very large databases, both on synthetic and real data. This work is used in a TV monitoring system including more than 13,000 hours of video in the reference database.
Statistical similarity search applied to content-based video copy detection
 in ICDE Workshops
, 2005
Cited by 8 (1 self)
Abstract—Content-based copy detection (CBCD) is one of the emerging multimedia applications for which there is a need for a concerted effort from the database community and the computer vision community. Recent methods based on interest points and local fingerprints have been proposed to perform robust CBCD of images and video. They include two steps: the search for similar fingerprints in the database and a voting strategy that merges all the local results in order to make a global decision. In most image or video retrieval systems, the search for similar features in the database is performed by a geometrical query in a multidimensional index structure. Recently, the paradigm of approximate k-nearest neighbors queries has shown that trading quality for time can be widely profitable in that context. In this paper, we propose a new approximate search paradigm dedicated to local fingerprints, and we describe the original indexing structure we have developed to compute the corresponding queries efficiently. We consider that the distribution of the relevant fingerprints around a query can be modeled by the distribution of the distortion vector between a referenced fingerprint and a candidate one. Experimental results show that these statistical queries allow high performance gains compared to classical ε-range queries. By studying the influence of this approximate search on a complete CBCD scheme based on local video fingerprints, we also show that trading quality for time during the search does not seriously degrade the global robustness of the system, even with very large databases including more than 10,000 hours of video.
Multiresolution Indexing for Hierarchical Out-of-core Traversal of Rectilinear Grids
, 2000
Cited by 3 (1 self)
In this paper I introduce a new static indexing scheme that induces a data layout satisfying both requirements (i) and (ii) for the hierarchical traversal of n-dimensional regular grids. In one particular implementation the scheme exploits in a new way the recursive construction of the Z-order space-filling curve. The standard indexing that maps the input nD data onto a 1D sequence for the Z-order curve is based on a simple bit-interleaving operation that merges the input indices into one index n times longer. This helps in grouping the data for geometric proximity, but only for a specific level of detail. In this paper I show how this indexing can be transformed into an alternative index that groups the data first per level of resolution and then, within each level, per geometric proximity. This yields a data layout that is appropriate for hierarchical out-of-core processing of large grids.
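The bit-interleaving operation described in this abstract can be sketched in a few lines (a minimal illustration for the 3-D case; the function names and the 10-bit-per-axis limit are our assumptions, and this is the plain Z-order index, not the paper's level-reordered variant):

```python
def morton3_encode(x, y, z, bits=10):
    """Interleave the bits of (x, y, z) into one Z-order index three times longer."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (3 * i)      # bit i of x goes to position 3i
        code |= ((y >> i) & 1) << (3 * i + 1)  # bit i of y goes to position 3i+1
        code |= ((z >> i) & 1) << (3 * i + 2)  # bit i of z goes to position 3i+2
    return code

def morton3_decode(code, bits=10):
    """Invert the interleaving to recover the original (x, y, z) indices."""
    x = y = z = 0
    for i in range(bits):
        x |= ((code >> (3 * i)) & 1) << i
        y |= ((code >> (3 * i + 1)) & 1) << i
        z |= ((code >> (3 * i + 2)) & 1) << i
    return x, y, z
```

Nearby grid points share their high-order interleaved bits, so sorting samples by this index groups them on disk by geometric proximity; the abstract's contribution is to transform this index further so that the data is grouped per level of resolution first.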
Multiresolution Modeling, Visualization and Compression of Volumetric Data
 Notes for the 3rd Tutorial of the IEEE Conference on Visualization
, 2003
Cited by 1 (0 self)
Due to the increasing complexity of volume meshes produced in many applications, multiresolution representations, simplification and compression have become key technologies for achieving efficient storage together with interactive modeling and visualization performance. View-dependent adaptive approximations can be extracted from multiresolution representations, enabling real-time isosurfacing and direct volume rendering. This course covers the foundations in the construction of multiresolution volume meshes, simplification and real-time extraction of adaptive approximations for visualization. Advanced topics include subdivision methods, compression, progressive transmission and out-of-core processing. The course is intended for programmers or researchers interested in developing efficient, interactive modeling and visualization of 3D volumetric models.
Component-based Data Layout for Efficient Slicing of Very Large Multidimensional Volumetric Data
Cited by 1 (0 self)
Abstract—In this paper, we introduce a new data layout scheme to efficiently handle out-of-core axis-aligned slicing queries of very large multidimensional volumetric data. Slicing is a very useful dimension-reduction tool that removes or reduces occlusion problems in visualizing 3D/4D volumetric data sets and enables fast visual exploration of such data sets. We show that data layouts based on typical space-filling curves are not optimal for out-of-core slicing queries, and we present a novel component-based data layout scheme for a specialized problem domain in which it is only required to provide fast slicing at every kth value, for any k > 1. Our component-based data layout scheme provides much faster processing time for any axis-aligned slicing direction at every kth value, k > 1, requiring a smaller cache and no replication of data. In addition, the data layout can be generalized to any high dimension.