Results 1 – 6 of 6
Lossless image compression by block matching on a mesh of trees
 in Proceedings IEEE Data Compression Conference, Poster Session
Cited by 3 (1 self)
Abstract. Work-optimal O(log M log n) time implementations of lossless image compression by block matching are shown on the EREW PRAM, where n is the size of the image and M is the maximum size of the match; these can be implemented on practical architectures such as meshes of trees, pyramids and multigrids. The work-optimal implementations on pyramids and multigrids are possible under some realistic assumptions. Decompression on these architectures is also possible with the same parallel computational complexity.
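To make the underlying sequential scheme concrete, here is a minimal toy sketch of greedy 2D block matching. This is not the paper's parallel algorithm, and all names are illustrative: at each uncoded pixel in raster order, the encoder looks for the largest square among the already-scanned rows that matches the block anchored there, and emits either a copy pointer or a literal.

```python
# Toy sequential sketch of 2D block matching (illustrative only; the paper's
# contribution is the parallel implementation, which is not reproduced here).

def compress(img):
    """Greedy raster-scan block matching for a binary image.

    Emits ('lit', pixel) or ('copy', dy, dx, s): the s x s square at the
    current position repeats the square whose top-left corner lies dy rows
    up and dx columns left, entirely inside already-scanned rows.
    """
    h, w = len(img), len(img[0])
    coded = [[False] * w for _ in range(h)]
    tokens = []
    for y in range(h):
        for x in range(w):
            if coded[y][x]:
                continue
            best = None  # (size, dy, dx)
            max_s = min(h - y, w - x)
            for sy in range(y):
                for sx in range(w):
                    s = 0
                    while (s < max_s and sy + s < y and sx + s < w and
                           all(img[sy + i][sx + j] == img[y + i][x + j]
                               for i in range(s + 1) for j in range(s + 1))):
                        s += 1
                    if s >= 2 and (best is None or s > best[0]):
                        best = (s, y - sy, x - sx)
            if best:
                s, dy, dx = best
                tokens.append(('copy', dy, dx, s))
                for i in range(s):
                    for j in range(s):
                        coded[y + i][x + j] = True
            else:
                tokens.append(('lit', img[y][x]))
                coded[y][x] = True
    return tokens

def decompress(tokens, h, w):
    """Replays the encoder's scan; every copy source is already decoded."""
    img = [[0] * w for _ in range(h)]
    coded = [[False] * w for _ in range(h)]
    it = iter(tokens)
    for y in range(h):
        for x in range(w):
            if coded[y][x]:
                continue
            t = next(it)
            if t[0] == 'lit':
                img[y][x] = t[1]
                coded[y][x] = True
            else:
                _, dy, dx, s = t
                for i in range(s):
                    for j in range(s):
                        img[y + i][x + j] = img[y - dy + i][x - dx + j]
                        coded[y + i][x + j] = True
    return img
```

Restricting copy sources to fully scanned rows is what lets the decoder resolve every pointer from its own prefix; the paper's parallel versions distribute this matching work across a mesh of trees.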
A compression-boosting transform for two-dimensional data
Cited by 1 (0 self)
Abstract. We introduce a novel invertible transform for two-dimensional data whose objective is to reorder the matrix so as to improve its (lossless) compression at later stages. The transform requires solving a computationally hard problem, for which a randomized algorithm is used. The inverse transform is fast and can be implemented in linear time in the size of the matrix. Preliminary experimental results show that the reordering improves the compressibility of digital images.
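As a hedged illustration of the general idea only (reorder the matrix to boost a later compression stage, with a cheap exact inverse), here is a toy row-reordering transform. The greedy nearest-neighbour heuristic merely stands in for the paper's randomized algorithm on its hard reordering problem, and `reorder_rows` and `invert` are invented names.

```python
# Illustrative sketch only: a simple invertible reordering transform.
# The greedy nearest-neighbour ordering below is a stand-in for the
# randomized algorithm the paper applies to its hard reordering problem.

def hamming(a, b):
    """Number of positions where two equal-length rows differ."""
    return sum(x != y for x, y in zip(a, b))

def reorder_rows(matrix):
    """Order rows so consecutive rows are similar (helping later RLE/LZ
    stages); returns the permuted matrix plus the permutation that makes
    the transform invertible."""
    remaining = list(range(1, len(matrix)))
    order = [0]
    while remaining:
        last = matrix[order[-1]]
        nxt = min(remaining, key=lambda i: hamming(matrix[i], last))
        remaining.remove(nxt)
        order.append(nxt)
    return [matrix[i] for i in order], order

def invert(permuted, order):
    """Linear-time inverse: scatter each row back to its original index."""
    out = [None] * len(order)
    for pos, i in enumerate(order):
        out[i] = permuted[pos]
    return out
```

Storing the permutation alongside the permuted matrix is what keeps the transform invertible, and the inverse is a single linear scatter pass, matching the linear-time inverse claimed in the abstract.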
Speeding up Lossless Image Compression: Experimental Results on a Parallel Machine
Cited by 1 (1 self)
Abstract. Arithmetic encoders enable the best compressors both for bi-level images (JBIG) and for grey-scale and color images (CALIC), but they are often ruled out as too complex. The compression gap between simpler techniques and state-of-the-art compressors can be significant. Storer extended dictionary text compression to bi-level images to avoid arithmetic encoders (BLOCK MATCHING), achieving 70 percent of the compression of JBIG1 on the CCITT bi-level image test set. We were able to partition an image into up to a hundred areas and to apply the BLOCK MATCHING heuristic independently to each area with no loss of compression effectiveness. On the other hand, we presented in [5] a simple lossless compression heuristic for grey-scale and color images (PALIC), which provides a highly parallelizable compressor and decompressor. In fact, it can be applied independently to each block of 8x8 pixels, achieving 80 percent of the compression obtained with LOCO-I (JPEG-LS), the current lossless standard in low-complexity applications. We experimented with the BLOCK MATCHING and PALIC heuristics on up to 32 processors of a 256-processor Intel Xeon 3.06 GHz machine in Italy (avogadro.cilea.it), on a test set of large topographic bi-level images and color images in RGB format. We obtained the expected speedup of the compression and decompression times, achieving parallel running times about twenty-five times faster than the sequential ones.
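The key structural point in this abstract is that a codec with no inter-block state parallelizes trivially. The following sketch shows only that structure; a trivial run-length coder stands in for PALIC (which is not reproduced here), and all function names are illustrative.

```python
# Sketch of the parallel structure only: any per-block codec with no
# inter-block state can be dispatched to workers independently. A trivial
# run-length coder stands in for PALIC here (PALIC itself is not shown).
from concurrent.futures import ThreadPoolExecutor

def tiles(img, b=8):
    """Yield (y, x, block) for each b x b tile (edge tiles may be smaller)."""
    h, w = len(img), len(img[0])
    for y in range(0, h, b):
        for x in range(0, w, b):
            yield y, x, [row[x:x + b] for row in img[y:y + b]]

def rle(block):
    """Run-length code the tile's pixels in raster order as [value, count]."""
    runs = []
    for p in (p for row in block for p in row):
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1
        else:
            runs.append([p, 1])
    return runs

def unrle(runs, bh, bw):
    """Rebuild a bh x bw tile from its run-length pairs."""
    flat = [v for v, n in runs for _ in range(n)]
    return [flat[i * bw:(i + 1) * bw] for i in range(bh)]

def compress_parallel(img, b=8):
    """Each tile is coded by an independent task; no communication needed."""
    work = list(tiles(img, b))
    with ThreadPoolExecutor() as ex:
        return list(ex.map(
            lambda t: (t[0], t[1], len(t[2]), len(t[2][0]), rle(t[2])), work))

def decompress_parallel(coded, h, w):
    """Tiles can likewise be decoded in any order or in parallel."""
    img = [[0] * w for _ in range(h)]
    for y, x, bh, bw, runs in coded:
        tile = unrle(runs, bh, bw)
        for i in range(bh):
            for j in range(bw):
                img[y + i][x + j] = tile[i][j]
    return img
```

Because each task touches only its own tile, the same structure maps directly onto processors with distributed memory, which is why the abstract's near-linear speedups are the expected outcome.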
Binary Image Compression via Monochromatic Pattern Substitution: Effectiveness and Scalability
Cited by 1 (1 self)
Abstract. We present a method for compressing binary images via monochromatic pattern substitution. Monochromatic rectangles inside the image are compressed with a variable-length code. The method suffers no relevant loss of compression effectiveness if the image is partitioned into up to a thousand blocks and each block is compressed independently. Therefore, it can be implemented in parallel on both small- and large-scale arrays of processors with distributed memory and no interconnections. We experimented with the procedure on up to 32 processors of a 256-processor Intel Xeon 3.06 GHz machine (avogadro.cilea.it), on a test set of large topographic bi-level images. We obtained the expected speedup of the compression and decompression times, achieving parallel running times about twenty times faster than the sequential ones. In the theoretical context of unbounded parallelism, we show experimentally that inter-processor communication is needed when we scale up the distributed system. As a result, compression effectiveness has a bell-shaped behaviour which becomes competitive again with the sequential performance when the highest degree of parallelism is reached.
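A rough toy sketch of the monochromatic-rectangle idea, assuming a greedy raster-scan cover: at each uncoded pixel, grow the widest same-colour run, then extend it downwards into the tallest rectangle. The paper encodes each rectangle with a variable-length code; this sketch keeps the tokens as symbolic (colour, width, height) triples, and all names are invented.

```python
# Toy sketch of monochromatic pattern substitution (illustrative only).
# The paper's variable-length coding of the rectangles is omitted; tokens
# stay symbolic as (colour, width, height) triples.

def compress_mono(img):
    """Greedy raster-scan cover of a binary image by monochromatic rectangles."""
    h, w = len(img), len(img[0])
    coded = [[False] * w for _ in range(h)]
    tokens = []
    for y in range(h):
        for x in range(w):
            if coded[y][x]:
                continue
            c = img[y][x]
            rw = 0  # widest uncoded same-colour run starting here
            while x + rw < w and not coded[y][x + rw] and img[y][x + rw] == c:
                rw += 1
            rh = 1  # extend downwards while the whole width stays monochromatic
            while y + rh < h and all(not coded[y + rh][x + j]
                                     and img[y + rh][x + j] == c
                                     for j in range(rw)):
                rh += 1
            tokens.append((c, rw, rh))
            for i in range(rh):
                for j in range(rw):
                    coded[y + i][x + j] = True
    return tokens

def decompress_mono(tokens, h, w):
    """Replays the encoder's scan, painting one rectangle per token."""
    img = [[0] * w for _ in range(h)]
    coded = [[False] * w for _ in range(h)]
    it = iter(tokens)
    for y in range(h):
        for x in range(w):
            if coded[y][x]:
                continue
            c, rw, rh = next(it)
            for i in range(rh):
                for j in range(rw):
                    img[y + i][x + j] = c
                    coded[y + i][x + j] = True
    return img
```

Since the decoder replays the exact scan order of the encoder, no coordinates need to be transmitted, only the rectangle dimensions; and because the scheme keeps no state across image regions, each block of a partitioned image can be coded independently, as the abstract exploits.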
Lempel–Ziv Data Compression on Parallel and Distributed Systems, 2011
Compressing Bi-Level Images by Block Matching on a Tree Architecture
Abstract. A work-optimal O(log M log n) time parallel implementation of lossless image compression by block matching of bi-level images is shown on a full binary tree architecture under some realistic assumptions, where n is the size of the image and M is the maximum size of the match. Decompression on this architecture is also possible with the same parallel computational complexity. Such implementations have no scalability issues.