## Statistical Models for Images: Compression, Restoration and Synthesis (1997)

Venue: 31st Asilomar Conference on Signals, Systems and Computers

Citations: 138 (33 self)

### BibTeX

@INPROCEEDINGS{Simoncelli97statisticalmodels,
  author = {Eero Simoncelli},
  title = {Statistical Models for Images: Compression, Restoration and Synthesis},
  booktitle = {31st Asilomar Conf on Signals, Systems and Computers},
  year = {1997},
  pages = {673--678},
  publisher = {IEEE Computer Society}
}

### Abstract

In this paper, we examine the problem of decomposing digitized images, through linear and/or nonlinear transformations, into statistically independent components. The classical approach to such a problem is Principal Components Analysis (PCA), also known as the Karhunen-Loeve (KL) or Hotelling transform. This is a linear transform that removes second-order dependencies between input pixels. The best-known description of image statistics is that their power spectra take the form of a power law [e.g., 20, 11, 24]. Coupled with a constraint of translation-invariance, this suggests that the Fourier transform is an appropriate PCA representation. Fourier and related representations are widely used in image processing applications.
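
The PCA (Karhunen-Loeve) decorrelation described above can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's implementation; the patch size and sample count are arbitrary choices.

```python
import numpy as np

# PCA / Karhunen-Loeve sketch: project "image patches" onto the
# eigenvectors of their sample covariance, which removes all
# second-order (covariance) dependencies between pixels.
rng = np.random.default_rng(0)
patches = rng.normal(size=(1000, 16))        # 1000 synthetic 4x4 patches
patches -= patches.mean(axis=0)              # center each pixel dimension
cov = patches.T @ patches / len(patches)     # 16x16 sample covariance
eigvals, eigvecs = np.linalg.eigh(cov)       # KL basis = eigenvectors
coeffs = patches @ eigvecs                   # transform coefficients
# The coefficient covariance is now diagonal, with the eigenvalues
# (component variances) on the diagonal.
coeff_cov = coeffs.T @ coeffs / len(coeffs)
```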

### Citations

3719 | Stochastic Relaxation, Gibbs Distributions and the Bayesian Restoration of Images - Geman, Geman - 1984 |

2373 | A Theory of Multiresolution Signal Decomposition: The Wavelet Representation - Mallat - 1989 |

Citation Context: ...ted kurtoses of all of the subbands are significantly larger than the value of three that characterizes a Gaussian distribution. Also shown in figure 1 are two-parameter density functions of the form [17, 28]: P(c) ∝ e^(−|c/s|^p) (1). The density parameters {s, p} are estimated by minimizing the relative entropy (the Kullback-Leibler divergence) between a discretized model distribution and the 256-bin...
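
The two-parameter density fit quoted in this context can be sketched as below: discretize the model P(c) ∝ exp(−|c/s|^p) onto histogram bins and choose {s, p} to minimize the KL divergence to a 256-bin histogram. The grid search and the synthetic Laplacian "coefficients" are illustrative choices, not the paper's estimator.

```python
import numpy as np

# Fit P(c) ∝ exp(-|c/s|^p) to a 256-bin histogram by minimizing the
# discretized Kullback-Leibler divergence over a grid of {s, p}.
rng = np.random.default_rng(1)
data = rng.laplace(scale=2.0, size=50000)    # stand-in "wavelet coefficients"

hist, edges = np.histogram(data, bins=256)
p_data = hist / hist.sum()                   # empirical bin probabilities
centers = 0.5 * (edges[:-1] + edges[1:])

def kl_to_model(s, p):
    logq = -np.abs(centers / s) ** p
    q = np.exp(logq)
    q /= q.sum()                             # discretized, normalized model
    mask = p_data > 0
    return np.sum(p_data[mask] * np.log(p_data[mask] / q[mask]))

kl, s_hat, p_hat = min(((kl_to_model(s, p), s, p)
                        for s in np.linspace(0.5, 4.0, 36)
                        for p in np.linspace(0.4, 2.0, 33)),
                       key=lambda t: t[0])
# For Laplacian data the fit should recover p ≈ 1, s ≈ 2.
```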

1248 | Embedded image coding using zerotrees of wavelet coefficients - Shapiro - 1993 |

Citation Context: ...r image compression. Although many image coders do not incorporate an explicit probability model, a number of recent algorithms make use of joint statistical regularities between wavelet coefficients [19, 27, 22, 26, 15, 25, 6, 32, 16]. We have constructed two coders called EPWIC [4, 29, 3] based directly on the probability models described in sections 1 and 2. In both coders, subband coefficients are encoded one bitplane at a time...
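
The bitplane traversal mentioned in this context (coefficients encoded one bitplane at a time) can be illustrated as below. The entropy coder itself is omitted, and the coefficient values are toy data.

```python
import numpy as np

# Minimal bitplane decomposition: visit quantized coefficient
# magnitudes one bitplane at a time, most significant plane first.
coeffs = np.array([13, -6, 2, 0, 9])           # toy quantized subband
signs = np.sign(coeffs)                        # signs are coded separately
mags = np.abs(coeffs)
nplanes = int(mags.max()).bit_length()         # 4 planes for max = 13
planes = [((mags >> b) & 1).tolist()           # bit b of every coefficient
          for b in range(nplanes - 1, -1, -1)] # MSB plane first
# planes[0] is the most significant bitplane.
```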

605 | Relations between the statistics of natural images and the response properties of cortical cells - Field - 1987 |

Citation Context: ...tain smooth areas interspersed with occasional sharp transitions (e.g., edges). The smooth regions produce small-amplitude coefficients, and the transitions produce sparse large-amplitude coefficients [11]. To quantify this, we give the sample kurtosis (fourth moment divided by squared second moment) below each histogram. The estimated kurtoses of all of the subbands are significantly larger than the v...
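
The sample kurtosis used in this context (fourth moment over squared second moment) is easy to compute directly; the synthetic Gaussian and Laplacian draws below merely illustrate the contrast with the Gaussian value of three.

```python
import numpy as np

# Sample kurtosis: fourth central moment divided by the squared
# second moment. A Gaussian gives 3; sparse, heavy-tailed subband
# coefficients give much larger values.
def kurtosis(x):
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.mean(x**4) / np.mean(x**2)**2

rng = np.random.default_rng(2)
k_gauss = kurtosis(rng.normal(size=200000))    # close to 3
k_sparse = kurtosis(rng.laplace(size=200000))  # close to 6: heavier tails
```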

386 | Pyramid-based texture analysis/synthesis - Heeger, Bergen - 1995 |

Citation Context: ...elds [e.g., 8, 12]. Recent techniques have been developed to synthesize images with the same wavelet coefficient marginal statistics as those of an example image. In particular, Heeger and Bergen [13] used an overcomplete basis, and iteratively alternated between matching the subband histograms, and matching the pixel histogram. Zhu et al. [33] used Gibbs sampling to draw from the maximal-entropy...
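
The core operation of the histogram-matching iteration attributed to Heeger and Bergen can be sketched as a rank-order remapping. This shows only that single step, not the full alternating algorithm over subbands and pixels.

```python
import numpy as np

# Histogram matching by rank-order substitution: monotonically remap
# the source samples so their empirical distribution exactly matches
# the target's, while preserving the source's rank order.
def match_histogram(source, target):
    order = np.argsort(source)
    matched = np.empty_like(source, dtype=float)
    matched[order] = np.sort(target)           # rank-order substitution
    return matched

rng = np.random.default_rng(3)
src = rng.normal(size=4096)
tgt = rng.laplace(size=4096)
out = match_histogram(src, tgt)
# `out` contains exactly the target's values, ordered like the source.
```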

278 | Markov Random Field Texture Models - Cross, Jain - 1983 |

232 | The steerable pyramid: A flexible architecture for multiscale derivative computation - Simoncelli, Freeman - 1995 |

Citation Context: ...'ve been developing a semi-blind noise-removal algorithm based on the joint statistical model of section 2. We use an overcomplete tight frame representation with four oriented subbands at each scale [30]. We use a causal neighborhood of size 18 and bootstrap to estimate model parameters. For each subband, from coarse to fine, we: 1. Adjust subband variances: C′ = √((E(C²) − σ²)/E(C²)) · C ...
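
The variance-adjustment step in this context can be illustrated as below, assuming the reconstruction C′ = √((E(C²) − σ²)/E(C²)) · C of the garbled formula; the noise level and the Laplacian "subband" are synthetic stand-ins.

```python
import numpy as np

# Variance adjustment: rescale noisy coefficients C by
# sqrt((E[C^2] - sigma^2) / E[C^2]) so their second moment matches
# the estimated clean-signal variance.
rng = np.random.default_rng(4)
sigma = 0.5                                    # assumed known noise std
clean = rng.laplace(scale=1.0, size=100000)    # stand-in subband, var = 2
noisy = clean + rng.normal(scale=sigma, size=clean.shape)

e_c2 = np.mean(noisy**2)                       # ≈ 2 + sigma^2
scale = np.sqrt(max(e_c2 - sigma**2, 0.0) / e_c2)
adjusted = scale * noisy                       # second moment ≈ 2
```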

203 | Noise removal via Bayesian wavelet coring - Simoncelli, Adelson - 1996 |

Citation Context: ...ted kurtoses of all of the subbands are significantly larger than the value of three that characterizes a Gaussian distribution. Also shown in figure 1 are two-parameter density functions of the form [17, 28]: P(c) ∝ e^(−|c/s|^p) (1). The density parameters {s, p} are estimated by minimizing the relative entropy (the Kullback-Leibler divergence) between a discretized model distribution and the 256-bin...

195 | Filters, random fields and maximum entropy (FRAME): Towards a unified theory for texture modeling - Zhu, Wu, et al. - 1998 |

Citation Context: ...f an example image. In particular, Heeger and Bergen [13] used an overcomplete basis, and iteratively alternated between matching the subband histograms, and matching the pixel histogram. Zhu et al. [33] used Gibbs sampling to draw from the maximal-entropy distribution with the same marginals as the example image. This technique may be applied to the marginals of linear or non-linear operators. But i...

194 | Image compression via joint statistical characterization in the wavelet domain - Buccigrossi, Simoncelli - 1999 |

Citation Context: ...this region. The form of the histogram shown in figure 3A also holds for pairs of coefficients at adjacent spatial locations and orientations, and is surprisingly robust across a wide range of images [3]. Given the linear relationship between large-amplitude coefficients and the difficulty of characterizing the full density of a coefficient conditioned on its neighbors, we've examined a linear predic...
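
A linear magnitude predictor of the kind this context alludes to can be sketched as an ordinary least-squares regression of a coefficient's magnitude on the magnitudes of a neighbor set. The neighborhood, weights, and data here are synthetic stand-ins, not the paper's model.

```python
import numpy as np

# Least-squares linear prediction of a coefficient's magnitude from
# the magnitudes of (hypothetical) neighboring coefficients.
rng = np.random.default_rng(5)
neighbors = np.abs(rng.laplace(size=(5000, 4)))          # |neighbor| features
true_w = np.array([0.5, 0.3, 0.1, 0.1])                  # illustrative weights
mag = neighbors @ true_w + 0.05 * rng.normal(size=5000)  # noisy linear target
w_hat, *_ = np.linalg.lstsq(neighbors, mag, rcond=None)  # recovers true_w
```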

172 | An Image Multiresolution Representation for Lossless and Lossy Compression - Said, Pearlman - 1996 |

Citation Context: ...r image compression. Although many image coders do not incorporate an explicit probability model, a number of recent algorithms make use of joint statistical regularities between wavelet coefficients [19, 27, 22, 26, 15, 25, 6, 32, 16]. We have constructed two coders called EPWIC [4, 29, 3] based directly on the probability models described in sections 1 and 2. In both coders, subband coefficients are encoded one bitplane at a time...

159 | Statistics of natural images: Scaling in the woods - Ruderman, Bialek - 1994 |

81 | Implementation of compression with reversible embedded wavelets - Schwartz, Zandi, et al. - 1995 |

Citation Context: ...r image compression. Although many image coders do not incorporate an explicit probability model, a number of recent algorithms make use of joint statistical regularities between wavelet coefficients [19, 27, 22, 26, 15, 25, 6, 32, 16]. We have constructed two coders called EPWIC [4, 29, 3] based directly on the probability models described in sections 1 and 2. In both coders, subband coefficients are encoded one bitplane at a time...

70 | Drift-balanced random stimuli: A general basis for studying non-Fourier motion perception. J Opt Soc Am A 5:1986–2007 - Chubb, Sperling - 1988 |

70 | A non-parametric multi-scale statistical model for natural images - De Bonet, Viola - 1997 |

Citation Context: ...pat and Picard [21] have developed a probability model for densities of local coefficient clusters (including those at different scales), and used it to synthesize texture examples. De Bonet and Viola [9] describe a fast heuristic synthesis technique which captures joint relationships across scale. We've been working to develop a synthesis algorithm based on the joint statistical observations of sec...

69 | Early vision and texture perception - Bergen, Adelson - 1988 |

49 | Efficient Context-Based Entropy Coding for Lossy Wavelet Image Compression - Chrysafis, Ortega - 1997 |

49 | Cluster-based Probability Model and its Application to Image and Texture Processing - Popat, Picard - 1997 |

Citation Context: ...nalysis appears often in the human vision literature in the form of "second-order" texture models [e.g., 2, 7]. Recent nonlinear joint models have given impressive synthesis results. Popat and Picard [21] have developed a probability model for densities of local coefficient clusters (including those at different scales), and used it to synthesize texture examples. De Bonet and Viola [9] describe a fast...

44 | Combining frequency and spatial domain information for fast interactive image noise removal - Hirani, Totsuka - 1996 |

41 | A model of neuronal responses in visual area MT - Simoncelli, Heeger - 1997 |

31 | Image coding by block prediction of multiresolution subimages - Rinaldo, Calvagno - 1995 |

28 | Pyramid-based texture analysis/synthesis - Heeger, Bergen - 1995 |

Citation Context: ...m Fields [e.g., 8, 12]. Recent techniques have been developed to synthesize images with the same wavelet coefficient marginal statistics as those of an example image. In particular, Heeger and Bergen [13] used an overcomplete basis, and iteratively alternated between matching the subband histograms, and matching the pixel histogram. Zhu et al. [33] used Gibbs sampling to draw from the maximal-entropy...

24 | A practical approach to fractal-based image compression - Pentland, Horowitz - 1993 |

24 | Fractal based Description of Natural Scenes - Pentland - 1984 |

22 | Progressive wavelet image coding based on a conditional probability model - Buccigrossi, Simoncelli - 1997 |

Citation Context: ...obability model, a number of recent algorithms make use of joint statistical regularities between wavelet coefficients [19, 27, 22, 26, 15, 25, 6, 32, 16]. We have constructed two coders called EPWIC [4, 29, 3] based directly on the probability models described in sections 1 and 2. In both coders, subband coefficients are encoded one bitplane at a time using a non-adaptive arithmetic encoder that utilizes p...

14 | Embedded wavelet image compression based on a joint probability model - Simoncelli, Buccigrossi |

Citation Context: ...obability model, a number of recent algorithms make use of joint statistical regularities between wavelet coefficients [19, 27, 22, 26, 15, 25, 6, 32, 16]. We have constructed two coders called EPWIC [4, 29, 3] based directly on the probability models described in sections 1 and 2. In both coders, subband coefficients are encoded one bitplane at a time using a non-adaptive arithmetic encoder that utilizes p...

13 | A method for the digital enhancement of unsharp, grainy photographic images - Bayer, Powell - 1986 |

13 | Context modeling and entropy coding of wavelet coefficients for image compression - Wu, Chen - 1997 |

10 | Wavelet image coding based on a new generalized Gaussian mixture model - LoPresto, Ramchandran, et al. - 1997 |

4 | A Jointly Optimized Subband Coder - Kossentini, Chung, et al. - 1996 |

4 | Computer simulations of oriented multiple spatial frequency band coring - Ogden, Adelson - 1985 |

3 | Improved system for coring an image representing signal - Carlson, Adelson, et al. - 1985 |

2 | Adapting to unknown smoothness via wavelet shrinkage - Donoho, Johnstone - 1995 |