Results 1–10 of 35
High performance scalable image compression with EBCOT
IEEE Trans. Image Processing, 2000
Cited by 582 (11 self)
A new image compression algorithm is proposed, based on independent Embedded Block Coding with Optimized Truncation of the embedded bitstreams (EBCOT). The algorithm exhibits state-of-the-art compression performance while producing a bitstream with a rich feature set, including resolution and SNR scalability together with a random access property. The algorithm has modest complexity and is extremely well suited to applications involving remote browsing of large compressed images. The algorithm lends itself to explicit optimization with respect to MSE as well as more realistic psychovisual metrics, capable of modeling the spatially varying visual masking phenomenon.
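The "optimized truncation" in the abstract refers to EBCOT's post-compression rate-distortion step: each code-block is coded to a sequence of candidate truncation points, and a global Lagrangian slope threshold picks how far each block is kept so the total rate meets a budget. A minimal, hypothetical sketch of that idea (the block data and numbers are illustrative, not from the paper):

```python
def choose_truncation(points, d0, lam):
    """points: (cumulative_rate, residual_distortion) pairs with rate
    increasing and distortion decreasing; d0 is the distortion before any
    coding pass.  Returns the index of the last truncation point kept,
    or -1 to code nothing for this block."""
    best, r_prev, d_prev = -1, 0.0, d0
    for k, (r, d) in enumerate(points):
        # keep this pass only if its distortion-rate slope beats lam;
        # rejected points are simply skipped, mimicking convex-hull pruning
        if r > r_prev and (d_prev - d) / (r - r_prev) >= lam:
            best, r_prev, d_prev = k, r, d
    return best

def allocate(blocks, budget):
    """blocks: list of (d0, points).  Binary-search the Lagrangian slope
    so the summed rate of all chosen truncation points fits the budget."""
    lo, hi = 0.0, 1e9
    for _ in range(60):
        lam = 0.5 * (lo + hi)
        total = sum(pts[k][0] for d0, pts in blocks
                    if (k := choose_truncation(pts, d0, lam)) >= 0)
        if total > budget:
            lo = lam          # too much rate: require a steeper slope
        else:
            hi = lam
    return hi
```

The returned slope can then be reused to truncate every block independently, which is what makes the optimization compatible with the independent block coding the abstract emphasizes.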
Wavelet Filter Evaluation for Image Compression
IEEE Transactions on Image Processing, 1995
Cited by 177 (4 self)
Abstract—Choice of filter bank in wavelet compression is a critical issue that affects image quality as well as system design. Although regularity is sometimes used in filter evaluation, its success at predicting compression performance is only partial. A more reliable evaluation can be obtained by considering an L-level synthesis/analysis system as a single-input, ...
Image Compression by Linear Splines over Adaptive Triangulations
Cited by 42 (9 self)
This paper proposes a new method for image compression. The method is based on the approximation of an image, regarded as a function, by a linear spline over an adapted triangulation, D(Y ), which is the Delaunay triangulation of a small set Y of significant pixels. The linear spline minimizes the distance to the image, measured by the mean square error, among all linear splines over D(Y ). The significant pixels in Y are selected by an adaptive thinning algorithm, which recursively removes less significant pixels in a greedy way, using a sophisticated criterion for measuring the significance of a pixel. The proposed compression method combines the approximation scheme with a customized scattered data coding scheme. We demonstrate that our compression method outperforms JPEG2000 on two geometric images and performs competitively with JPEG2000 on three popular test cases of real images.
Adaptive multivariate approximation using binary space partitions and geometric wavelets
SIAM Journal on Numerical Analysis, to appear, 2005
Cited by 24 (1 self)
Abstract. The Binary Space Partition (BSP) technique is a simple and efficient method to adaptively partition an initial given domain to match the geometry of a given input function. As such, the BSP technique has been widely used by practitioners, but up until now no rigorous mathematical justification for it has been offered. Here we attempt to put the technique on sound mathematical foundations, and we offer an enhancement of the BSP algorithm in the spirit of what we are going to call geometric wavelets. This new approach to sparse geometric representation is based on recent developments in the theory of multivariate nonlinear piecewise polynomial approximation. We provide numerical examples of n-term geometric wavelet approximations of known test images and compare them with dyadic wavelet approximation. We also discuss applications to image denoising and compression.
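A toy one-dimensional version of the adaptive BSP idea conveys the mechanism (the paper treats multivariate domains, general hyperplane cuts, and piecewise polynomial fits; here regions are intervals approximated by their means, and the cut position is an assumption of this sketch):

```python
def sse(seg):
    """Sum of squared errors of a segment against its mean."""
    m = sum(seg) / len(seg)
    return sum((v - m) ** 2 for v in seg)

def bsp(signal, lo, hi, tol, leaves):
    """Recursively cut [lo, hi) at the position minimising the combined
    error of the two halves, until a leaf's error drops below tol.
    Appends (lo, hi, mean) leaf records to `leaves`."""
    seg = signal[lo:hi]
    if sse(seg) <= tol or hi - lo < 2:
        leaves.append((lo, hi, sum(seg) / len(seg)))
        return
    cut = min(range(lo + 1, hi),
              key=lambda c: sse(signal[lo:c]) + sse(signal[c:hi]))
    bsp(signal, lo, cut, tol, leaves)
    bsp(signal, cut, hi, tol, leaves)
```

The partition adapts to the geometry of the input: a step edge is found exactly, whereas a uniform dyadic split would straddle it at every level.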
Complex, Linear-Phase Filters for Efficient Image Coding
1995
Cited by 17 (0 self)
With the exception of the Haar basis, real-valued orthogonal wavelet filter banks with compact support lack symmetry and therefore do not possess linear phase. This has led to the use of biorthogonal filters for coding of images and other multidimensional data. There are, however, complex solutions permitting the construction of compactly supported, orthogonal, linear-phase QMF filter banks. By explicitly seeking solutions in which the imaginary part of the filter coefficients is small enough to be approximated to zero, real symmetric filters can be obtained that achieve excellent compression performance.
Image coding with geometric wavelets
Cited by 11 (0 self)
This paper describes a new and efficient method for low bit-rate image coding which is based on recent developments in the theory of multivariate nonlinear piecewise polynomial approximation. It combines a Binary Space Partition (BSP) scheme with Geometric Wavelet (GW) tree approximation so as to efficiently capture curve singularities and provide a sparse representation of the image. The GW method successfully competes with state-of-the-art wavelet methods such as the EZW, SPIHT and EBCOT algorithms. We report a gain of about 0.4 dB over the SPIHT and EBCOT algorithms at the bit rate of 0.0625 bits per pixel (bpp). It also outperforms other recent methods that are based on ‘sparse geometric representation’. For example, we report a gain of 0.27 dB over the Bandelets algorithm at 0.1 bpp. Although the algorithm is computationally intensive, its time complexity can be significantly reduced by collecting a ‘global’ GW n-term approximation to the image from a collection of GW trees, each constructed separately over tiles of the image.
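The tiling speed-up in the last sentence — pooling candidate terms from per-tile trees into one global n-term approximation — reduces to a selection problem. A minimal sketch, in which the per-term significance scores and the flat tree representation are assumptions of this illustration:

```python
import heapq

def global_n_term(trees, n):
    """trees: iterable of per-tile term lists, each term a
    (significance_score, payload) pair.  Pools every candidate and keeps
    the n most significant to form one global approximation."""
    pooled = (item for tree in trees for item in tree)
    return heapq.nlargest(n, pooled, key=lambda t: t[0])
```

Because each tile's tree is built independently, the expensive construction parallelises trivially, while the final selection remains a single cheap pass.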
Advanced Techniques for High Quality Multiresolution Volume Rendering
Computers & Graphics, 2004
Cited by 9 (0 self)
We present several improvements for compression-based multiresolution rendering of very large volume data sets at interactive to real-time frame rates on standard PC hardware. The algorithm accepts scalar or multivariate data sampled on a regular grid as input. The input data is converted into a compressed hierarchical wavelet representation in a preprocessing step. During rendering, the wavelet representation is decompressed on the fly and rendered using hardware texture mapping. The level of detail used for rendering is adapted to the estimated screen-space error. To increase the rendering performance, additional visibility tests, such as empty-space skipping and occlusion culling, are applied. Furthermore, we discuss how to render the remaining multiresolution blocks efficiently using modern graphics hardware. Using a prototype implementation of this algorithm, we are able to perform high-quality interactive rendering of large data sets on a single off-the-shelf PC.
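The screen-space-error-driven level-of-detail choice can be sketched as follows; the simple error-over-distance projection model and the per-level error table are assumptions of this illustration, not the paper's actual estimator:

```python
def pick_level(block_errors, distance, tolerance):
    """block_errors[l]: precomputed object-space error of the block at
    wavelet level l (level 0 = coarsest; finer levels have smaller error).
    Returns the coarsest level whose projected screen-space error is
    within the pixel tolerance."""
    for level, err in enumerate(block_errors):
        if err / distance <= tolerance:   # crude screen-space projection
            return level                  # coarsest acceptable level
    return len(block_errors) - 1          # fall back to the finest level
```

Distant blocks thus stay at coarse levels and need no decompression of finer wavelet coefficients, which is what makes the on-the-fly scheme interactive.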
AUTOMATIC COMPRESSION FOR IMAGE SETS USING A GRAPH THEORETICAL FRAMEWORK
Cited by 8 (2 self)
A new automatic compression scheme that adapts to any image set is presented in this thesis. The proposed scheme requires no a priori knowledge of the properties of the image set. This scheme is obtained using a unified graph-theoretical framework that allows for compression strategies to be compared both theoretically and experimentally. This strategy achieves optimal lossless compression by computing a minimum spanning tree of a graph constructed from the image set. For lossy compression, this scheme is near-optimal and a performance guarantee relative to the optimal one is provided. Experimental results demonstrate that this compression strategy compares favorably to previously proposed strategies, with improvements of up to 7% in the case of lossless compression and 72% in the case of lossy compression. This thesis also shows that the choice of underlying compression algorithm is important for compressing image sets using the proposed scheme.
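The lossless strategy above reduces to a minimum-spanning-tree computation. A minimal sketch under simplifying assumptions — a symmetric, precomputed cost matrix and a virtual root node 0 standing for "compress independently"; the thesis's actual cost model is not reproduced here:

```python
def mst(w):
    """Prim's algorithm.  w: (n+1)x(n+1) symmetric cost matrix where node 0
    is the virtual root and w[i][j] is the cost of coding image j given i.
    Returns (parent, total): the spanning tree and its total cost."""
    n = len(w)
    in_tree = [False] * n
    best = [float("inf")] * n     # cheapest known edge into each node
    parent = [-1] * n
    best[0] = 0.0                 # start from the virtual root
    total = 0.0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]),
                key=lambda i: best[i])
        in_tree[u] = True
        total += best[u]
        for v in range(n):
            if not in_tree[v] and w[u][v] < best[v]:
                best[v], parent[v] = w[u][v], u
    return parent, total
```

Each image is then coded relative to its parent in the tree (or from scratch, if its parent is the root), giving the cheapest overall set of predictions.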
CREW Lossless/Lossy Medical Image Compression
Ricoh California Research Center, 1995
Cited by 6 (0 self)
... This document describes the CREW technology in detail in accordance with the evaluation criteria established by the ACR-NEMA WG IV committee. More information and Ricoh California Research Center papers relating to CREW can be found on the World Wide Web at http://www.crc.ricoh.com/CREW
Optimal erasure protection assignment for scalable compressed data with small channel packets and short channel codewords
Under review for EURASIP JASP Special Issue: Multimedia over IP and Wireless Networks, 2004
Cited by 6 (0 self)
This paper is concerned with the efficient transmission of scalable compressed images with complex dependency structures over lossy communication channels. Our recent work proposed a strategy for allocating source elements into clusters of packets and finding their optimal code rates. However, the previous work assumes that source elements form a simple chain of dependencies. The present paper proposes a modification to the earlier strategy to exploit the properties of scalable sources which have tree-structured dependency. The source elements are allocated to clusters of packets according to their dependency structure, subject to constraints on packet size and channel codeword length. Given a packet cluster arrangement, the proposed strategy assigns optimal code rates to the source elements, subject to a constraint on transmission length. Our experimental results suggest that the proposed strategy can outperform the earlier strategy by exploiting the dependency structure.
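As a loose illustration of the underlying unequal-error-protection idea — not the paper's algorithm; the discrete rate set, the lengths, and the greedy order are invented for this example — elements earlier in a dependency chain matter more, so protection may only weaken along the chain, and stronger codes are granted while a transmission-length budget allows:

```python
def assign_rates(src_lens, rates, budget):
    """src_lens: source element lengths in dependency order (most
    important first).  rates: available channel code rates, strongest
    (smallest) first.  Returns one rate index per element; indices are
    non-decreasing along the chain (protection never strengthens later)."""
    idx = [len(rates) - 1] * len(src_lens)        # start weakest everywhere

    def total():
        # transmitted length: source length divided by code rate
        return sum(l / rates[i] for l, i in zip(src_lens, idx))

    for e in range(len(src_lens)):                # most important first
        # strengthen this element while the chain constraint and the
        # transmission budget both permit it
        while idx[e] > 0 and (idx[e] - 1) >= (idx[e - 1] if e else 0):
            idx[e] -= 1
            if total() > budget:
                idx[e] += 1                       # undo: over budget
                break
    return idx
```

The monotone-protection constraint mirrors the chain-dependency assumption of the earlier work; the paper's contribution is precisely to relax that chain into tree-structured dependencies, which this sketch does not attempt.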