Results 1-10 of 49
Progressive Geometry Compression
2000
Cited by 244 (16 self)
Abstract: We propose a new progressive compression scheme for arbitrary topology, highly detailed and densely sampled meshes arising from geometry scanning. We observe that meshes consist of three distinct components: geometry, parameter, and connectivity information. The latter two do not contribute to the reduction of error in a compression setting. Using semiregular meshes, parameter and connectivity information can be virtually eliminated. Coupled with semiregular wavelet transforms, zerotree coding, and subdivision-based reconstruction, we see improvements in error by a factor of four (12 dB) compared to other progressive coding schemes. CR Categories and Subject Descriptors: I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling - hierarchy and geometric transformations; G.1.2 [Numerical Analysis]: Approximation - approximation of surfaces and contours, wavelets and fractals; I.4.2 [Image Processing and Computer Vision]: Compression (Coding) - approximate methods. Additional K...
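The coder above pairs a wavelet transform with zerotree coding so that every prefix of the bitstream refines the model. A minimal sketch of the underlying bit-plane idea (illustrative only, not the paper's codec; real zerotree coders such as EZW and SPIHT also exploit significance structure across scales):

```python
# Toy bit-plane coder: transmit coefficient magnitudes most significant
# bit first, so any prefix of the stream yields a progressively refined
# reconstruction.

def encode_bitplanes(coeffs, num_planes=8):
    """Yield (plane, index, bit) triples, highest bit plane first."""
    for plane in range(num_planes - 1, -1, -1):
        for i, c in enumerate(coeffs):
            yield plane, i, (abs(int(c)) >> plane) & 1

def decode_bitplanes(bits, n):
    """Rebuild magnitudes from a (possibly truncated) bit stream."""
    recon = [0] * n
    for plane, i, bit in bits:
        recon[i] |= bit << plane
    return recon
```

Decoding only the high bit planes gives a coarse approximation; the full stream reproduces the magnitudes exactly, which is what makes the transmission progressive.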
ForWaRD: Fourier-Wavelet Regularized Deconvolution for Ill-Conditioned Systems
IEEE Trans. on Signal Processing, 2002
Cited by 112 (2 self)
Abstract: We propose an efficient, hybrid Fourier-Wavelet Regularized Deconvolution (ForWaRD) algorithm that performs noise regularization via scalar shrinkage in both the Fourier and wavelet domains. The Fourier shrinkage exploits the Fourier transform's sparse representation of the colored noise inherent in deconvolution, while the wavelet shrinkage exploits the wavelet domain's sparse representation of piecewise smooth signals and images. We derive the optimal balance between the amount of Fourier and wavelet regularization by optimizing an approximate mean-squared-error (MSE) metric and find that signals with sparser wavelet representations require less Fourier shrinkage. ForWaRD is applicable to all ill-conditioned deconvolution problems, unlike the purely wavelet-based Wavelet-Vaguelette Deconvolution (WVD), and its estimate features minimal ringing, unlike purely Fourier-based Wiener deconvolution. We analyze ForWaRD's MSE decay rate as the number of samples increases and demonstrate its improved performance compared to the optimal WVD over a wide range of practical sample lengths.
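The two scalar shrinkage stages the abstract describes can be sketched as follows (a simplified illustration of the idea, not the actual ForWaRD implementation; the function names and the regularization weight alpha are ours):

```python
import math

def fourier_shrink(Y, H, sigma2, alpha=1.0):
    """Regularized Fourier inversion: divide by the blur response H, then
    shrink each frequency by |H|^2 / (|H|^2 + alpha*sigma2), which tames
    the colored noise amplified where H is nearly zero."""
    out = []
    for y, h in zip(Y, H):
        inv = y / h if h != 0 else 0.0
        gain = abs(h) ** 2 / (abs(h) ** 2 + alpha * sigma2)
        out.append(inv * gain)
    return out

def soft_threshold(w, t):
    """Wavelet-domain scalar shrinkage (soft thresholding): coefficients
    below t, mostly leaked noise, are zeroed; large ones shrink toward 0."""
    return [math.copysign(max(abs(x) - t, 0.0), x) for x in w]
```

ForWaRD's contribution is how the two are balanced: signals with sparser wavelet representations call for lighter Fourier shrinkage, per the approximate-MSE criterion derived in the paper.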
Automatic Writer Identification Using Connected-Component Contours And . . .
2004
Cited by 45 (9 self)
Abstract: In this paper, a new technique for off-line writer identification is presented, using connected-component contours (COCOCOs or CO³s) in uppercase handwritten samples. In our model, the writer is considered to be characterized by a stochastic pattern generator, producing a family of connected components for the uppercase character set. Using a codebook of CO³s from an independent training set of 100 writers, the probability-density function (PDF) of CO³s was computed for an independent test set containing 150 unseen writers. Results revealed a high sensitivity of the CO³ PDF for identifying individual writers on the basis of a single sentence of uppercase characters. The proposed automatic approach bridges the gap between image-statistics approaches on one end and manually measured allograph features of individual characters on the other end. Combining the CO³ PDF with an independent edge-based orientation and curvature PDF yielded very high correct identification rates.
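The codebook-PDF idea lends itself to a small sketch: estimate each writer's distribution over codebook shape indices and match a query by a histogram distance. The helper names and the chi-square distance are illustrative assumptions; the paper's exact distance measure may differ.

```python
from collections import Counter

def codebook_pdf(shape_codes, codebook_size):
    """Normalized histogram of codebook indices: a toy stand-in for the
    CO3 probability-density function described above."""
    counts = Counter(shape_codes)
    total = len(shape_codes)
    return [counts[i] / total for i in range(codebook_size)]

def chi_square(p, q, eps=1e-12):
    """Chi-square distance between two discrete PDFs, a common choice
    for histogram matching."""
    return sum((a - b) ** 2 / (a + b + eps) for a, b in zip(p, q))

def identify(query_pdf, enrolled):
    """Return the enrolled writer whose PDF is closest to the query."""
    return min(enrolled, key=lambda w: chi_square(query_pdf, enrolled[w]))
```

In the paper this PDF is further combined with an independent edge-based orientation/curvature PDF, which is what yields the highest identification rates.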
Image Compression by Linear Splines over Adaptive Triangulations
"... This paper proposes a new method for image compression. The method is based on the approximation of an image, regarded as a function, by a linear spline over an adapted triangulation, D(Y ), which is the Delaunay triangulation of a small set Y of significant pixels. The linear spline minimizes the d ..."
Abstract

Cited by 42 (9 self)
 Add to MetaCart
(Show Context)
This paper proposes a new method for image compression. The method is based on the approximation of an image, regarded as a function, by a linear spline over an adapted triangulation, D(Y ), which is the Delaunay triangulation of a small set Y of significant pixels. The linear spline minimizes the distance to the image, measured by the mean square error, among all linear splines over D(Y ). The significant pixels in Y are selected by an adaptive thinning algorithm, which recursively removes less significant pixels in a greedy way, using a sophisticated criterion for measuring the significance of a pixel. The proposed compression method combines the approximation scheme with a customized scattered data coding scheme. We demonstrate that our compression method outperforms JPEG2000 on two geometric images and performs competitively with JPEG2000 on three popular test cases of real images.
Progressive Transmission of Vector Map Data over the World Wide Web
2001
Cited by 27 (2 self)
Abstract: Within distributed computing environments, access to very large geospatial datasets often suffers from slow or unreliable network connections. To allow users to start working with a partially delivered dataset, progressive transmission methods are a viable solution. While incremental and progressive methods have been applied successfully to the transmission of raster images over the World Wide Web and, in the form of prototypes, of triangular meshes, the transmission of vector map datasets has lacked similar attention. This paper introduces a solution to the progressive transmission of vector map data that allows users to apply analytical GIS methods to partially transmitted datasets. The architecture follows a client-server model with multiple map representations at the server side, and a thin client that compiles transmitted increments into a topologically consistent format. This paper describes the concepts, develops an architecture, and discusses implementation concerns.
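The server/thin-client split can be sketched for a single polyline: the server ranks interior vertices by significance and streams them coarse-to-fine; the client splices each increment into a consistent line. This is a minimal sketch under our own assumptions (baseline-offset ranking as a crude stand-in for a proper map-generalization ordering such as Douglas-Peucker), not the paper's architecture.

```python
def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * (px - ax) - (bx - ax) * (py - ay))
    den = ((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5
    return num / den if den else ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5

def refinement_stream(polyline):
    """Server side: rank interior vertices by offset from the coarse
    baseline, most significant first."""
    a, b = polyline[0], polyline[-1]
    order = sorted(range(1, len(polyline) - 1),
                   key=lambda i: -point_line_dist(polyline[i], a, b))
    return [(i, polyline[i]) for i in order]

def client_rebuild(first, last, n, increments):
    """Client side: splice received increments into a consistent polyline;
    any prefix of the stream yields a usable coarse map."""
    verts = {0: first, n - 1: last}
    verts.update(dict(increments))
    return [verts[i] for i in sorted(verts)]
```

Because every prefix of the stream reconstructs a valid polyline, the client can run analysis on partial data, which is the paper's central requirement.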
A Panorama on Multiscale Geometric Representations, Intertwining Spatial, Directional and Frequency Selectivity
2011
Cited by 21 (8 self)
Abstract: The richness of natural images makes the quest for optimal representations in image processing and computer vision challenging. This observation has not prevented the design of image representations that trade off between efficiency and complexity while achieving accurate rendering of smooth regions as well as reproducing faithful contours and textures. The most recent ones, proposed in the past decade, share a hybrid heritage highlighting the multiscale and oriented nature of edges and patterns in images. This paper presents a panorama of the aforementioned literature on decompositions in multiscale, multi-orientation bases or dictionaries. They typically exhibit redundancy to improve sparsity in the transformed domain, and sometimes invariance with respect to simple geometric deformations (translation, rotation). Oriented multiscale dictionaries extend traditional wavelet processing and may offer rotation invariance. Highly redundant dictionaries require specific algorithms to simplify the search for an efficient (sparse) representation. We also discuss the extension of multiscale geometric decompositions to non-Euclidean domains such as the sphere or arbitrary meshed surfaces. The etymology of "panorama" suggests an overview based on a choice of partially overlapping "pictures".
Enhancement of JPEG-Compressed Images by Re-application of JPEG
Journal of VLSI Signal Processing
Cited by 20 (0 self)
Abstract: A novel method is proposed for post-processing of JPEG-encoded images, in order to reduce coding artifacts and enhance visual quality. Our method simply re-applies JPEG to shifted versions of the already-compressed image, and forms an average. This approach, despite its simplicity, offers better performance than other known methods, including those based on nonlinear filtering, POCS, and redundant wavelets.
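The shift, recompress, and average idea can be demonstrated on a 1D toy: block quantization stands in for the JPEG codec (the real method operates on 8x8 DCT blocks of images), and averaging over shifted re-codings smooths the block-boundary artifacts. All names and parameters here are illustrative assumptions.

```python
def block_quantize(sig, block=4, q=8):
    """Crude stand-in for JPEG: replace each length-`block` chunk by its
    quantized mean, creating blocking artifacts at chunk boundaries."""
    out = []
    for i in range(0, len(sig), block):
        chunk = sig[i:i + block]
        mean = sum(chunk) / len(chunk)
        out.extend([round(mean / q) * q] * len(chunk))
    return out

def shift_reapply_average(sig, shifts=(0, 1, 2, 3), block=4, q=8):
    """Re-apply the coder to cyclically shifted copies, undo each shift,
    and average: block boundaries fall in different places per shift, so
    the artifacts partially cancel."""
    acc = [0.0] * len(sig)
    for s in shifts:
        shifted = sig[s:] + sig[:s]
        coded = block_quantize(shifted, block, q)
        restored = coded[-s:] + coded[:-s] if s else coded
        for i, v in enumerate(restored):
            acc[i] += v
    return [v / len(shifts) for v in acc]
```

On a simple ramp signal, the averaged reconstruction has lower mean squared error than a single pass of the block quantizer, mirroring the paper's observation that re-application plus averaging beats more elaborate post-processing.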
Adaptive thinning for terrain modelling and image compression
In: Advances in Multiresolution for Geometric Modelling, 2004
Cited by 15 (8 self)
Abstract: Adaptive thinning algorithms are greedy point removal schemes for bivariate scattered data sets with corresponding function values, where the points are recursively removed according to some data-dependent criterion. Each subset of points, together with its function values, defines a linear spline over its Delaunay triangulation. The basic criterion for the removal of the next point is to minimize the error between the resulting linear spline at the bivariate data points and the original function values. This leads to a hierarchy of linear splines of coarser and coarser resolutions. This paper surveys the various removal strategies developed in our earlier papers, and the application of adaptive thinning to terrain modelling and to image compression. In our image test examples, we found that our thinning scheme, adapted to diminish the least-squares error, combined with a post-processing least-squares optimization and a customized coding scheme, often gives results better than or comparable to those of the wavelet-based scheme SPIHT.
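The greedy removal loop is easy to illustrate in one dimension, where the Delaunay triangulation degenerates to sorted knots and the linear spline is plain piecewise-linear interpolation. A minimal sketch (max-error criterion; the papers' criteria are more sophisticated and bivariate):

```python
def interp_error(xs, ys, keep):
    """Max abs error of the linear spline through the kept points,
    evaluated at all data sites."""
    err = 0.0
    kept = sorted(keep)
    for j in range(len(kept) - 1):
        a, b = kept[j], kept[j + 1]
        for i in range(a, b + 1):
            t = (xs[i] - xs[a]) / (xs[b] - xs[a])
            approx = (1 - t) * ys[a] + t * ys[b]
            err = max(err, abs(ys[i] - approx))
    return err

def adaptive_thin(xs, ys, target):
    """Greedily remove the interior point whose removal yields the
    smallest interpolation error, until `target` points remain."""
    keep = set(range(len(xs)))
    while len(keep) > target:
        best = min((p for p in keep if p not in (0, len(xs) - 1)),
                   key=lambda p: interp_error(xs, ys, keep - {p}))
        keep.discard(best)
    return sorted(keep)
```

Note how the scheme keeps points at kinks and discards points on nearly linear runs, which is exactly why it suits terrain data and geometric images.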
Global Optimization For Constrained Nonlinear Programming
2001
Cited by 13 (2 self)
Abstract: In this thesis, we develop constrained simulated annealing (CSA), a global optimization algorithm that asymptotically converges to constrained global minima (CGM_dn) with probability one, for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for constrained local minima (CLM_dn) in the theory of discrete constrained optimization using Lagrange multipliers developed in our group. The theory proves the equivalence between the set of discrete saddle points and the set of CLM_dn, leading to the first-order necessary and sufficient condition for CLM_dn. To find ...
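The saddle-point search behind CSA alternates probabilistic descent on a Lagrangian in the variables with probabilistic ascent in the multipliers. The following is a toy sketch of that idea on a small discrete domain, not the thesis algorithm; the cooling schedule, move probabilities, and step sizes are our own assumptions.

```python
import math, random

def csa(f, h, xs, iters=20000, t0=10.0, seed=0):
    """Toy constrained simulated annealing: minimize f over xs subject to
    h(x) == 0 (h is a nonnegative constraint violation). Descends the
    Lagrangian L(x, lam) = f(x) + lam*h(x) in x and ascends it in lam."""
    rng = random.Random(seed)
    x, lam = rng.choice(xs), 0.0
    best = None
    L = lambda xv, lv: f(xv) + lv * h(xv)
    for k in range(iters):
        T = t0 / (1 + 0.01 * k)               # cooling schedule
        if rng.random() < 0.8:                # trial move in x (descent)
            xn = rng.choice(xs)
            d = L(xn, lam) - L(x, lam)
            if d <= 0 or rng.random() < math.exp(-d / T):
                x = xn
        else:                                 # trial move in lam (ascent)
            ln = max(0.0, lam + rng.choice([-0.5, 0.5]))
            d = L(x, lam) - L(x, ln)          # accept increases in L
            if d <= 0 or rng.random() < math.exp(-d / T):
                lam = ln
        if h(x) == 0 and (best is None or f(x) < f(best)):
            best = x                          # best feasible point seen
    return best
```

For example, minimizing f(x) = x^2 subject to x >= 3 on the grid {0, ..., 9} (so h(x) = max(0, 3 - x)) settles on the constrained minimum x = 3; growing lam makes infeasible points increasingly unattractive as the temperature drops.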
Effective image compression using evolved wavelets
In: GECCO '05: Proceedings of the 2005 Conference on Genetic and Evolutionary Computation, 2005
Cited by 13 (1 self)
Abstract: Wavelet-based image coders like the JPEG2000 standard are the state of the art in image compression. Unlike traditional image coders, however, their performance depends to a large degree on the choice of a good wavelet. Most wavelet-based image coders use standard wavelets that are known to perform well on photographic images. However, these wavelets do not perform as well on other common image classes, like scanned documents or fingerprints. In this paper, a method based on the coevolutionary genetic algorithm introduced in [11] is used to evolve specialized wavelets for fingerprint images. These wavelets are compared to the hand-designed wavelet currently used by the FBI to compress fingerprints. The results show that the evolved wavelets consistently outperform the hand-designed wavelet. Using evolution to adapt wavelets to classes of images can therefore significantly increase the quality of compressed images.
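The evolutionary loop can be illustrated on the smallest possible version of the problem: evolving the angle of a 2-tap orthonormal filter pair so that a training signal's detail-band energy is minimized (a common proxy for coding efficiency, since energy packed into the coarse band compresses better). This is a toy stand-in for the paper's coevolutionary GA over full wavelet filter banks; all names and parameters are our own.

```python
import math, random

def detail_energy(theta, signal):
    """Energy in the detail band of a one-level transform with analysis
    filters h = (cos t, sin t) and g = (sin t, -cos t)."""
    e = 0.0
    for i in range(0, len(signal) - 1, 2):
        d = math.sin(theta) * signal[i] - math.cos(theta) * signal[i + 1]
        e += d * d
    return e

def evolve_filter(signal, pop=30, gens=60, seed=1):
    """Minimal GA: tournament-of-two selection plus Gaussian mutation
    over the filter angle."""
    rng = random.Random(seed)
    population = [rng.uniform(0, math.pi / 2) for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a, b = rng.sample(population, 2)         # tournament of two
            parent = a if detail_energy(a, signal) < detail_energy(b, signal) else b
            nxt.append(parent + rng.gauss(0, 0.05))  # mutate
        population = nxt
    return min(population, key=lambda t: detail_energy(t, signal))
```

On a constant training signal the optimum of this toy fitness is theta = pi/4, i.e. the Haar filter; evolving against a class of training images instead of a single signal is what lets the paper's GA specialize wavelets for fingerprints.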