Results 1–10 of 28
High performance scalable image compression with EBCOT
IEEE Trans. Image Processing, 2000
Cited by 460 (8 self)
A new image compression algorithm is proposed, based on independent Embedded Block Coding with Optimized Truncation of the embedded bitstreams (EBCOT). The algorithm exhibits state-of-the-art compression performance while producing a bitstream with a rich feature set, including resolution and SNR scalability together with a random access property. The algorithm has modest complexity and is extremely well suited to applications involving remote browsing of large compressed images. The algorithm lends itself to explicit optimization with respect to MSE as well as more realistic psychovisual metrics, capable of modeling the spatially varying visual masking phenomenon.
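The rate-distortion-optimal truncation at the heart of EBCOT can be illustrated with a toy sketch: each code-block carries candidate truncation points with known (rate, distortion) pairs, and for a fixed Lagrange multiplier the best point of each block is chosen independently. The function names and the toy numbers below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of EBCOT-style optimal truncation. Each code-block has
# candidate truncation points (rate, distortion); for a Lagrange multiplier
# lam we independently pick, per block, the point minimizing D + lam * R.

def best_truncation(points, lam):
    """points: list of (rate, distortion); return the point minimizing D + lam*R."""
    return min(points, key=lambda rd: rd[1] + lam * rd[0])

def allocate(blocks, lam):
    """Independently choose a truncation point for every block."""
    return [best_truncation(pts, lam) for pts in blocks]

blocks = [
    [(0, 100.0), (10, 40.0), (25, 15.0), (50, 5.0)],   # block 1 truncation points
    [(0, 80.0),  (8, 50.0),  (20, 20.0), (45, 8.0)],   # block 2 truncation points
]
choice = allocate(blocks, lam=1.0)
```

Sweeping the multiplier trades total rate against total distortion; the per-block independence is what makes this selection cheap.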
Wavelet Filter Evaluation for Image Compression
IEEE Transactions on Image Processing, 1995
Cited by 151 (4 self)
Abstract—Choice of filter bank in wavelet compression is a critical issue that affects image quality as well as system design. Although regularity is sometimes used in filter evaluation, its success at predicting compression performance is only partial. A more reliable evaluation can be obtained by considering an L-level synthesis/analysis system as a single-input,
Image Compression by Linear Splines over Adaptive Triangulations
Cited by 37 (8 self)
This paper proposes a new method for image compression. The method is based on the approximation of an image, regarded as a function, by a linear spline over an adapted triangulation, D(Y), which is the Delaunay triangulation of a small set Y of significant pixels. The linear spline minimizes the distance to the image, measured by the mean square error, among all linear splines over D(Y). The significant pixels in Y are selected by an adaptive thinning algorithm, which recursively removes less significant pixels in a greedy way, using a sophisticated criterion for measuring the significance of a pixel. The proposed compression method combines the approximation scheme with a customized scattered data coding scheme. We demonstrate that our compression method outperforms JPEG2000 on two geometric images and performs competitively with JPEG2000 on three popular test cases of real images.
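The greedy adaptive thinning described in this abstract can be sketched in one dimension: repeatedly remove the interior sample whose removal least perturbs the piecewise-linear interpolant. The paper's 2D method works analogously on Delaunay triangulations; the significance criterion and names below are simplified illustrative assumptions.

```python
# Illustrative 1D sketch of greedy adaptive thinning (not the paper's 2D code).

def significance(xs, ys, i):
    """Error at xs[i] if it is dropped and its neighbors are linearly interpolated."""
    x0, x1, x2 = xs[i - 1], xs[i], xs[i + 1]
    y_interp = ys[i - 1] + (ys[i + 1] - ys[i - 1]) * (x1 - x0) / (x2 - x0)
    return abs(ys[i] - y_interp)

def adaptive_thinning(xs, ys, keep):
    """Greedily remove the least significant interior points until `keep` remain."""
    xs, ys = list(xs), list(ys)
    while len(xs) > keep:
        i = min(range(1, len(xs) - 1), key=lambda i: significance(xs, ys, i))
        del xs[i]
        del ys[i]
    return xs, ys

# points on a straight line are removed first; the jump at x=4 is preserved
xs_kept, ys_kept = adaptive_thinning([0, 1, 2, 3, 4], [0, 1, 2, 3, 10], keep=3)
```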
Adaptive multivariate approximation using binary space partitions and geometric wavelets
SIAM Journal on Numerical Analysis (to appear), 2005
Cited by 17 (1 self)
Abstract. The Binary Space Partition (BSP) technique is a simple and efficient method to adaptively partition an initial given domain to match the geometry of a given input function. As such, the BSP technique has been widely used by practitioners, but up until now no rigorous mathematical justification for it has been offered. Here we attempt to put the technique on sound mathematical foundations, and we offer an enhancement of the BSP algorithm in the spirit of what we are going to call geometric wavelets. This new approach to sparse geometric representation is based on recent developments in the theory of multivariate nonlinear piecewise polynomial approximation. We provide numerical examples of n-term geometric wavelet approximations of known test images and compare them with dyadic wavelet approximation. We also discuss applications to image denoising and compression.
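The adaptive partitioning idea can be shown in a much-simplified 1D form: split an interval recursively whenever a constant fit leaves too much squared error, choosing the cut that minimizes the error of the two halves. The paper optimizes oblique 2D cuts with polynomial fits; everything below is an illustrative reduction with invented names.

```python
# Toy 1D binary space partition: constant fits, error-driven recursive splits.

def sse(vals):
    """Squared error of approximating vals by their mean."""
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals)

def bsp(vals, lo, hi, tol):
    """Return a list of (lo, hi) leaf intervals covering vals[lo:hi]."""
    if hi - lo <= 1 or sse(vals[lo:hi]) <= tol:
        return [(lo, hi)]
    # choose the cut minimizing the total error of the two halves
    cut = min(range(lo + 1, hi),
              key=lambda c: sse(vals[lo:c]) + sse(vals[c:hi]))
    return bsp(vals, lo, cut, tol) + bsp(vals, cut, hi, tol)

leaves = bsp([1, 1, 1, 9, 9, 9], 0, 6, tol=0.1)
```

The partition adapts to the data: a single cut lands exactly on the step in the input, so two leaves suffice.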
Complex, Linear-Phase Filters for Efficient Image Coding
, 1995
Cited by 17 (0 self)
With the exception of the Haar basis, real-valued orthogonal wavelet filter banks with compact support lack symmetry and therefore do not possess linear phase. This has led to the use of biorthogonal filters for coding of images and other multidimensional data. There are, however, complex solutions permitting the construction of compactly supported, orthogonal, linear-phase QMF filter banks. By explicitly seeking solutions in which the imaginary part of the filter coefficients is small enough to be approximated to zero, real symmetric filters can be obtained that achieve excellent compression performance.
Advanced Techniques for High Quality Multiresolution Volume Rendering
Computers & Graphics, 2004
Cited by 9 (0 self)
We present several improvements for compression-based multiresolution rendering of very large volume data sets at interactive to real-time frame rates on standard PC hardware. The algorithm accepts scalar or multivariate data sampled on a regular grid as input. The input data is converted into a compressed hierarchical wavelet representation in a preprocessing step. During rendering, the wavelet representation is decompressed on the fly and rendered using hardware texture mapping. The level of detail used for rendering is adapted to the estimated screen-space error. To increase the rendering performance, additional visibility tests, such as empty-space skipping and occlusion culling, are applied. Furthermore, we discuss how to render the remaining multiresolution blocks efficiently using modern graphics hardware. Using a prototype implementation of this algorithm, we are able to perform high-quality interactive rendering of large data sets on a single off-the-shelf PC.
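The screen-space-error-driven level-of-detail choice this abstract mentions can be sketched as follows: for each block we know an object-space error per hierarchy level, the projected error shrinks with viewing distance, and we render the coarsest level that stays under a pixel tolerance. The 1/distance error model and all names here are illustrative assumptions, not the paper's formula.

```python
# Hedged sketch of level-of-detail selection by projected screen-space error.

def screen_space_error(object_error, distance, scale=100.0):
    # simple perspective model: projected size falls off with 1/distance
    return scale * object_error / distance

def select_lod(level_errors, distance, tolerance):
    """level_errors[0] is the coarsest level; return the first acceptable level."""
    for level, err in enumerate(level_errors):
        if screen_space_error(err, distance) <= tolerance:
            return level
    return len(level_errors) - 1  # fall back to the finest level

# a distant block tolerates a coarser level than a nearby one
near = select_lod([8.0, 4.0, 1.0, 0.0], distance=50.0, tolerance=1.0)
far = select_lod([8.0, 4.0, 1.0, 0.0], distance=200.0, tolerance=1.0)
```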
Optimal erasure protection assignment for scalable compressed data with small channel packets and short channel codewords
EURASIP JASP Special Issue: Multimedia over IP and Wireless Networks (under review), 2004
Cited by 7 (0 self)
This paper is concerned with the efficient transmission of scalable compressed images with complex dependency structures over lossy communication channels. Our recent work proposed a strategy for allocating source elements into clusters of packets and finding their optimal code rates. However, the previous work assumes that source elements form a simple chain of dependencies. The present paper proposes a modification to the earlier strategy to exploit the properties of scalable sources which have tree-structured dependency. The source elements are allocated to clusters of packets according to their dependency structure, subject to constraints on packet size and channel codeword length. Given a packet cluster arrangement, the proposed strategy assigns optimal code rates to the source elements, subject to a constraint on transmission length. Our experimental results suggest that the proposed strategy can outperform the earlier strategy by exploiting the dependency structure.
CREW Lossless/Lossy Medical Image Compression
RICOH California Research Center, 1995
Cited by 6 (0 self)
... This document describes the CREW technology in detail in accordance with the evaluation criteria established by the ACR-NEMA WG IV committee. More information and RICOH California Research Center papers relating to CREW can be found on the World Wide Web at http://www.crc.ricoh.com/CREW
Image coding with geometric wavelets
Cited by 5 (0 self)
This paper describes a new and efficient method for low bit-rate image coding which is based on recent developments in the theory of multivariate nonlinear piecewise polynomial approximation. It combines a Binary Space Partition (BSP) scheme with Geometric Wavelet (GW) tree approximation so as to efficiently capture curve singularities and provide a sparse representation of the image. The GW method successfully competes with state-of-the-art wavelet methods such as the EZW, SPIHT and EBCOT algorithms. We report a gain of about 0.4 dB over the SPIHT and EBCOT algorithms at the bit rate 0.0625 bits per pixel (bpp). It also outperforms other recent methods that are based on 'sparse geometric representation'. For example, we report a gain of 0.27 dB over the Bandelets algorithm at 0.1 bpp. Although the algorithm is computationally intensive, its time complexity can be significantly reduced by collecting a 'global' GW n-term approximation to the image from a collection of GW trees, each constructed separately over tiles of the image.
Task-Oriented Lossy Compression of Magnetic Resonance Images
, 1996
Cited by 4 (0 self)
A new task-oriented image quality metric is used to quantify the effects of distortion introduced into magnetic resonance images by lossy compression. This metric measures the similarity between a radiologist's manual segmentation of pathological features in the original images and the automated segmentations performed on the original and compressed images. The images are compressed using a general wavelet-based lossy image compression technique, embedded zerotree coding, and segmented using a three-dimensional stochastic model-based tissue segmentation algorithm. The performance of the compression system is then enhanced by compressing different regions of the image volume at different bit rates, guided by prior knowledge about the location of important anatomical regions in the image. Application of the new system to magnetic resonance images is shown to produce compression results superior to conventional methods, both subjectively and with respect to the segmentation similarity metric.
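The region-dependent bit allocation described above, where anatomically important regions get a higher target rate than background, can be sketched as a simple budget computation. The mask, the rates, and the function name are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of region-weighted bit allocation: pixels inside a
# clinically important region of interest get a higher per-pixel bit rate.

def allocate_bits(mask, roi_rate, bg_rate):
    """mask: 2D 0/1 list marking the region of interest; returns the total
    bit budget given per-pixel rates roi_rate / bg_rate (bits per pixel)."""
    total = 0.0
    for row in mask:
        for m in row:
            total += roi_rate if m else bg_rate
    return total

# one ROI pixel at 2.0 bpp, three background pixels at 0.25 bpp
budget = allocate_bits([[1, 0], [0, 0]], roi_rate=2.0, bg_rate=0.25)
```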