Results 1 – 6 of 6
On the construction of some capacity-approaching coding schemes
, 2000
Abstract

Cited by 56 (2 self)
This thesis proposes two constructive methods of approaching the Shannon limit very closely. Interestingly, these two methods operate in opposite regions: one has a block length of one and the other has a block length approaching infinity. The first approach is based on novel memoryless joint source-channel coding schemes. We first show some examples of sources and channels where no coding is optimal for all values of the signal-to-noise ratio (SNR). When the source bandwidth is greater than the channel bandwidth, joint coding schemes based on space-filling curves and other families of curves are proposed. For uniform sources and modulo channels, our coding scheme based on space-filling curves operates within 1.1 dB of Shannon’s rate-distortion bound. For Gaussian sources and additive white Gaussian noise (AWGN) channels, we can achieve within 0.9 dB of the rate-distortion bound. The second scheme is based on low-density parity-check (LDPC) codes. We first demonstrate that we can translate threshold values of an LDPC code between channels accurately using a simple mapping. We develop some models for density evolution
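The bandwidth-reduction idea in the first scheme can be sketched in a few lines. Below, a pair of source samples is sent as the scalar parameter of the nearest point on a double-armed Archimedean spiral; the spiral, its constants, and the brute-force search are illustrative assumptions standing in for the space-filling curves the thesis actually constructs, not the thesis's own mapping.

```python
import numpy as np

def spiral_point(t, c=0.1):
    """Point on a double-armed Archimedean spiral r = c*|t|;
    negative t traces the mirrored arm."""
    r = c * abs(t)
    return np.array([r * np.cos(t), r * np.sin(t)])

def encode(x, y, c=0.1, t_max=60.0, n_grid=40000):
    """2:1 bandwidth reduction: map the source pair (x, y) to the scalar
    parameter of the nearest curve point (brute-force grid search)."""
    ts = np.linspace(-t_max, t_max, n_grid)
    pts = np.array([spiral_point(t, c) for t in ts])
    d2 = (pts[:, 0] - x) ** 2 + (pts[:, 1] - y) ** 2
    return ts[int(np.argmin(d2))]

def decode(t, c=0.1):
    """The receiver simply re-evaluates the curve at the received parameter."""
    return spiral_point(t, c)
```

Because the curve nearly fills the plane, moderate noise on the transmitted parameter moves the decoded point only a short way along the curve, which is why locality of the curve matters for distortion.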
On the Metric Properties of Discrete Space-Filling Curves
, 1996
Abstract

Cited by 35 (1 self)
A space-filling curve is a linear traversal of a discrete finite multidimensional space. In order that this traversal be useful in many applications, the curve should preserve "locality". We quantify "locality" and bound the locality of multidimensional space-filling curves. Classic Hilbert space-filling curves come close to achieving optimal locality. EDICS: IP 3.1. Corresponding author: Craig Gotsman, Dept. of Computer Science, Technion, Haifa 32000, Israel. Tel: +9724294336. Fax: +9724294353. Email: gotsman@cs.technion.ac.il. A preliminary version of this work was presented at the IEEE International Conference on Pattern Recognition, Jerusalem, 1994.
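The locality the authors quantify is easy to probe numerically: on a discrete Hilbert curve, consecutive indices always land on adjacent cells. A minimal sketch using the standard bit-manipulation construction of the curve (the function name is ours):

```python
def d2xy(n, d):
    """Map index d in [0, n*n) to (x, y) on an n x n Hilbert curve,
    n a power of two; standard quadrant-rotation construction."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:           # rotate the quadrant so sub-curves connect
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y
```

Walking d from 0 to n*n − 1 visits every cell exactly once, and each step moves to a 4-neighbour, which is the adjacency property underlying the locality bounds discussed in the abstract.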
Towards Optimal Locality in Mesh-Indexings
, 1997
Abstract

Cited by 31 (4 self)
The efficiency of many data structures and algorithms relies on "locality-preserving" indexing schemes for meshes. We concentrate on the case in which the maximal distance between two mesh nodes indexed i and j shall be a slow-growing function of |i − j|. We present a new 2D indexing scheme we call H-indexing, which has superior (possibly optimal) locality in comparison with the well-known Hilbert indexings. H-indexings form a Hamiltonian cycle, and we prove that they are optimally locality-preserving among all cyclic indexings. We provide fairly tight lower bounds for indexings without any restriction. Finally, illustrated by investigations concerning 2D and 3D Hilbert indexings, we present a framework for mechanizing upper-bound proofs for locality.
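The figure of merit here, how far apart nodes indexed i and j can lie as a function of |i − j|, can be probed by brute force on a small mesh. The sketch below compares row-major indexing with a snake (boustrophedon) indexing under the squared-Euclidean-distance-per-index-gap ratio; the metric choice and helper names are our assumptions, not the paper's definitions.

```python
from itertools import combinations

def locality_constant(index_to_xy, n):
    """Worst-case ratio of squared Euclidean mesh distance to index
    distance over all node pairs of an n x n mesh (smaller is better)."""
    pts = [index_to_xy(i, n) for i in range(n * n)]
    worst = 0.0
    for i, j in combinations(range(n * n), 2):
        (x1, y1), (x2, y2) = pts[i], pts[j]
        worst = max(worst, ((x1 - x2) ** 2 + (y1 - y2) ** 2) / (j - i))
    return worst

def row_major(i, n):
    """Plain raster order: jumps a full row width at each row boundary."""
    return i % n, i // n

def snake(i, n):
    """Boustrophedon order: odd rows reversed, so successive indices
    are always mesh neighbours."""
    x, y = i % n, i // n
    return (n - 1 - x, y) if y % 2 else (x, y)
```

On an 8 x 8 mesh the row-major constant is dominated by the row-boundary jump, while the snake's is bounded by the row length; curves like the Hilbert or H-indexing push this constant down to a small fixed value independent of n.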
Algorithms for Rendering Realistic Terrain Image Sequences and Their Parallel Implementation
 The Visual Computer
, 1995
Abstract

Cited by 4 (0 self)
We present algorithms for rendering realistic images of large terrains and their implementation on a parallel computer for rapid production of terrain animation sequences. By "large" terrains, we mean datasets too large to be contained in RAM, a fact which significantly influences the design of rendering algorithms and has not been addressed in most other works on this subject. To achieve good speed without quality degradation, we use a hybrid ray-casting and projection technique, incorporating quadtree subdivision techniques and filtering with precomputed bit masks. Hilbert space-filling curves determine the image pixel rendering order, fully exploiting spatial coherence. A parallel version of the algorithm is presented, based on an architecture implemented on a Meiko parallel computer. The architecture is designed to relieve dataflow bottlenecks caused by slow interprocessor communications and to exploit temporal image coherence. Using our parallel system, incorporating 26 processors, we are able to generate a full-color terrain image at video resolution, without noticeable aliasing artifacts, every 2 seconds, including I/O and communication overheads. The speedup of the system is linear in the number of processors.
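Rendering pixels along a space-filling curve keeps successive rays pointing at nearby terrain, so out-of-core data pages stay hot. A minimal sketch of such a traversal, using Morton (Z-order) as a simpler stand-in for the Hilbert order the paper uses (function names and the bit width are our assumptions):

```python
def morton_decode(d, bits=8):
    """De-interleave the bits of rank d into pixel coordinates (x, y):
    even bits form x, odd bits form y."""
    x = y = 0
    for b in range(bits):
        x |= ((d >> (2 * b)) & 1) << b
        y |= ((d >> (2 * b + 1)) & 1) << b
    return x, y

def zorder_pixels(width, height, bits=8):
    """Yield the pixels of a width x height image in Z-order; nearby
    ranks touch nearby pixels, improving terrain-data cache coherence."""
    for d in range(1 << (2 * bits)):
        x, y = morton_decode(d, bits)
        if x < width and y < height:
            yield x, y
```

The renderer would pull pixels from this generator instead of scanning row by row; a Hilbert traversal gives the same coverage with strictly adjacent steps.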
Texture Mixing via Universal Simulation
Abstract

Cited by 1 (0 self)
A framework for studying texture in general, and texture mixing in particular, is presented in this paper. The work follows concepts from universal type classes and universal simulation. Based on the well-known Lempel and Ziv (LZ) universal compression scheme, the universal type class of a one-dimensional sequence is defined as the set of possible sequences of the same length which produce the same dictionary (or parsing tree) under the classical LZ incremental parsing algorithm. Universal simulation is realized by sampling uniformly from the universal type class, which can be efficiently implemented. Starting with a source texture image, we use universal simulation to synthesize new textures that have, asymptotically, the same statistics as the source texture, yet have as much uncertainty as possible, in the sense that they are sampled from the broadest pool of possible sequences that comply with the statistical constraint. When considering two or more textures, a parsing tree is constructed for each one, and samples from the trees are randomly interleaved according to predefined proportions, thus obtaining a mixed texture. As with single-texture synthesis, the kth-order statistics of this mixture, for any k, approach, asymptotically, the weighted mixture of the kth-order statistics of each individual texture used in the mixing. We present the underlying principles of universal types, universal simulation, and their extensions and application to mixing two or more textures with predefined proportions.
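The incremental parsing that defines the type class is short to sketch: each phrase is a previously seen phrase extended by one symbol, and two equal-length sequences belong to the same universal type class exactly when they produce the same phrase tree. A minimal LZ78-style illustration (helper names are ours; the uniform sampler over the class, the actual simulation step, is more involved and omitted):

```python
def phrase_trie(seq):
    """LZ incremental parsing: return the phrase tree of seq as a
    nested dict keyed by symbol (the root is the empty phrase)."""
    root = {}
    node = root
    for s in seq:
        if s in node:
            node = node[s]    # keep extending the current match
        else:
            node[s] = {}      # new phrase = current match + s
            node = root       # restart matching at the root
    return root

def same_type_class(s1, s2):
    """Equal-length sequences are in the same universal type class
    iff their parsing trees coincide."""
    return len(s1) == len(s2) and phrase_trie(s1) == phrase_trie(s2)
```

For example, "aaab" parses into the phrases a, aa, b, giving the trie {'a': {'a': {}}, 'b': {}}; "ab" and "ba" yield the same trie {a, b} and hence sit in one type class, which is the pool the texture synthesizer samples from.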
Texture Mixing via Universal Simulation ∗
 …and Synthesis, pp. 65–70, 2005
Abstract
A framework for studying texture in general, and texture mixing in particular, is presented in this paper. The work follows concepts from universal type classes and universal simulation. Based on the well-known Lempel and Ziv (LZ) universal compression scheme, the universal type class of a one-dimensional sequence is defined as the set of possible sequences of the same length which produce the same dictionary (or parsing tree) under the classical LZ incremental parsing algorithm. Universal simulation is realized by sampling uniformly from the universal type class, which can be efficiently implemented. Starting with a source texture image, we use universal simulation to synthesize new textures that have, asymptotically, the same statistics of any order as the source texture, yet have as much uncertainty as possible, in the sense that they are sampled from the broadest pool of possible sequences that comply with the statistical constraint. When considering two or more textures, a parsing tree is constructed for each one, and samples from the trees are randomly interleaved according to predefined proportions, thus obtaining a mixed texture. As with single-texture synthesis, the kth-order statistics of this mixture, for any k, asymptotically approach the weighted mixture of the kth-order statistics of each individual texture used in the mixing. We present the underlying principles of universal types, universal simulation, and their extensions and application to mixing two or more textures with predefined proportions.