## Complexity Distortion Theory (1997)

### Download Links

- [www.ee.columbia.edu]
- [www.ctr.columbia.edu]
- DBLP

Venue: Proceedings 1997 IEEE International Symposium on Information Theory

Citations: 21 (2 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Sow97complexitydistortion,
  author    = {Daby Sow and Alexandros Eleftheriadis},
  title     = {Complexity Distortion Theory},
  booktitle = {Proceedings 1997 IEEE International Symposium on Information Theory},
  year      = {1997},
  pages     = {604--608}
}
```


### Abstract

We investigate the efficiency of lossy algorithmic representations of information and show that "Complexity Distortion" is asymptotically equivalent to Rate Distortion for stationary ergodic sources.
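Stated in standard rate-distortion notation (the symbols below are common usage, not necessarily the paper's exact notation), the claimed equivalence reads:

```latex
% R(D): Shannon's rate distortion function; C(D): complexity distortion
% function, defined via Kolmogorov complexity K. Notation assumed here.
\[
R(D) \;=\; \min_{p(\hat{x}\mid x)\,:\,\mathbb{E}[d(X,\hat{X})]\le D} I(X;\hat{X}),
\qquad
C(D) \;=\; \lim_{n\to\infty} \mathbb{E}\!\left[\frac{K_D(x^n)}{n}\right],
\]
% where K_D(x^n) denotes the length of a shortest program whose output
% \hat{x}^n satisfies d(x^n,\hat{x}^n) \le D. The asymptotic equivalence
% asserts C(D) = R(D) for stationary ergodic sources.
```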

### Citations

8564 | Elements of Information Theory - Cover, Thomas - 2006
Citation Context: "...proved that it is a lower bound for rates of ...-admissible codes. He also proved the existence of codes achieving this lower bound. These proofs revolve around the probabilistic concept of typicality [2]. In practice, stationary and ergodic assumptions are made on the source probability measure to design coding algorithms with performances close to the theoretical limits defined by Shannon. Stationar..."

6052 | The Mathematical Theory of Communication - Shannon, Weaver - 1964
Citation Context: "...essing such representation problems was established in 1948, when Shannon introduced information and rate distortion theories. These theories ignore the meaning of the message, considered "irrelevant" [13]. They are based on the measure-theoretical concept of probability that was proposed by Kolmogorov in 1929. In this setting, source observations are produced by a source ... in a probability space ... wit..."

521 | Three approaches to the quantitative definition of information - Kolmogorov - 1965
Citation Context: "...y and ergodic assumptions are difficult to justify and it becomes difficult to attach a physical meaning to the measure-theoretical concept of probability [6]. Interestingly, Kolmogorov introduced in [8] the notion of Kolmogorov complexity to measure the amount of randomness in individual objects. He found the need to measure it from..."

498 | Stochastic Complexity - Rissanen - 1989
Citation Context: "...pproximations are obtained by adding computational resource bounds on the decoding UTM [16]. Similarly, we are not aware of any systematic way to estimate source distributions from finite observations [12]. Jeffrey [6] takes this observation even further and asserts that the concept of probability has no physical meaning. These observations did not prevent the field of source coding to blossom with the..."

330 | A theory of program size formally identical to information theory - Chaitin
Citation Context: "...s to provide a much broader and unifying perspective on media representation. The key component of this theory is the substitution of Shannon's classical communication system model by Chaitin's model [1], where the decoder is a universal Turing machine and the codewords are programs for such a computer. In this paper, we focus on optimal representations using the Algorithmic-Kolmogorov complexity and..."

314 | Rate Distortion Theory: A Mathematical Basis For Data Compression - Berger - 1971
Citation Context: "...probability [6]. Interestingly, Kolmogorov introduced in [8] the notion of Kolmogorov complexity to measure the amount of randomness in individual objects. He found the need to measure it from lengths of descriptions of objects on a universal Turing machine... See [1] for precise definitions and statements."

227 | On the length of programs for computing finite binary sequences - Chaitin

184 | The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms - Zvonkin, Levin - 1970
Citation Context: "...ion are equivalent for a very wide class of information sources. In the lossless setting, Zvonkin and Levin were the first to propose equivalences between Kolmogorov complexity and the entropy in [21]. The following theorem is formulated without a proof in [21]. Theorem 1: For a stationary ergodic source with probability measure μ, lim_{n→∞} K(x^n)/n = H, μ-a.s., where H denotes the entropy rate of the source."
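The theorem in this context says the per-symbol Kolmogorov complexity of a stationary ergodic source converges to its entropy rate. A rough, hedged illustration: using zlib's DEFLATE as a crude computable stand-in (an upper bound) for Kolmogorov complexity, the compressed rate of an i.i.d. Bernoulli source lands near the entropy rate. The parameters, seed, and packing scheme below are my own choices, not the paper's.

```python
import math
import random
import zlib

# Illustrative sketch (not from the paper): per-symbol compressed length of a
# universal code approaches the entropy rate, mirroring lim K(x^n)/n = H.
random.seed(0)
p = 0.1                                   # Bernoulli(p) i.i.d. source
n = 200_000
bits = "".join("1" if random.random() < p else "0" for _ in range(n))

# Pack the 0/1 symbols into bytes before compressing.
packed = int(bits, 2).to_bytes((n + 7) // 8, "big")
compressed_bits = 8 * len(zlib.compress(packed, 9))

entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # H ~ 0.469 bits
rate = compressed_bits / n                # achieved bits per source symbol

print(f"entropy rate H = {entropy:.3f} bits/symbol")
print(f"zlib rate      = {rate:.3f} bits/symbol")
```

The achieved rate sits somewhat above H (DEFLATE is far from an optimal universal code for a binary alphabet), but well below the 1 bit/symbol of the raw representation.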

114 | Computability and Unsolvability - Davis - 1958
Citation Context: "...viously, if ..., .... But in practice it is more common to have .... From now on, we always assume that such a real number exists and that .... If ..., and if ..., the code is noiseless or faithful. Let ... be a union of ...-balls covering .... ... is called a ...-cover of .... Let ... denote the minimum number of ...-balls needed to cover .... By definition, the operational RD... See [3] for a precise definition of computable or recursive functions."

66 | Minimum description length induction, Bayesianism, and Kolmogorov complexity - Vitányi, Li
Citation Context: "...he approximation of the source distribution from time averages obtained from a single infinite source observation. Unfortunately, in many practical situations, models are not inherently probabilistic [17] and source observations are finite. For example, in image representation, source observations usually contain a significant amount of spatial regularities that escape all probabilistic models. Furthe..."

46 | A formal theory of inductive inference, part 1 and part 2 - Solomonoff - 1964

30 | Some equivalences between Shannon entropy and Kolmogorov complexity - Leung-Yan-Cheong, Cover
Citation Context: "..., namely, the complexity distortion function (CDF), and state the conditions for the equivalence between the CDF and the RDF. General equivalences have been proposed in the lossless and lossy cases in [9], [11], [19], and [20]. In this paper, we simplify these results significantly and extend them with pragmatic considerations for the coding of finite objects. A formal proof of the equivalence is prop..."

26 | Sample converses in source coding theory - Kieffer - 1991

16 | Flavor: A Language for Media Representation - Eleftheriadis - 1997

14 | Distortion program-size complexity with respect to a fidelity criterion and rate-distortion function - Yang, Shen
Citation Context: "...storted objects. We denote this function D. Definition 1: The complexity distortion function is C(D) = lim_{n→∞} E[K(D(x^n))/n] = lim_{n→∞} E[(K(x^n) − K(x^n | D(x^n)))/n]. (Similar results are given in [3], but in a less constructive way.) III. Equivalence between R(D) and C(D). Theorem 1: For ergodic sources, R(D) = C(D) = lim_{n→∞} (K(x^n) − K(x^n | D(x^n)))/n with probability 1, R(D) being the classical r..."

11 | A Syntactic Framework for Bitstream-Level Representation of Audio-Visual Objects - Fang, Eleftheriadis - 1996

11 | Distortion-complexity and rate-distortion function - Muramatsu, Kanaya
Citation Context: "...ely, the complexity distortion function (CDF), and state the conditions for the equivalence between the CDF and the RDF. General equivalences have been proposed in the lossless and lossy cases in [9], [11], [19], and [20]. In this paper, we simplify these results significantly and extend them with pragmatic considerations for the coding of finite objects. A formal proof of the equivalence is proposed i..."

10 | Structured audio, Kolmogorov complexity, and generalized audio coding - Scheirer - 2001
Citation Context: "...of practical media representation systems. In practice, programmatic representation techniques are already starting to gain momentum in audio representation with the MPEG-4 Structured Audio standard [4]. It is shown in [4] that such programmatic representations of sound outperform today's probabilistic audio representation schemes. The last point that we would like to emphasize is that Shannon's com..."

8 | Nonblock source coding with a fidelity criterion - Gray, Neuhoff, et al. - Annals of Probability 3(3) - 1975
Citation Context: "...er than .... The intuition behind this statement is that each element of ... can be represented "...-semifaithfully" by the index of the ...-ball containing the element. Gray, Neuhoff, and Ornstein have shown in [5] that this definition of the operational RDF is equivalent to the definition of the RDF [1]. From now on, we drop the subscript on it and represent it with .... The CDF is introduced in a similar manner..."

6 | Back from Infinity: A Constrained Resources Approach to Information Theory - Ziv - 1998

4 | Universal almost sure data compression using Markov types - Shields - Probl. ... - 1990
Citation Context: "...ble by a UTM or not, can have a rate below the RDF. To accomplish the first step, we prove the following lemma. Lemma 1: ... μ-a.s. Proof: We use the concept of Markov types as proposed by Shields in [14] to design universal codes that can be decoded by a UTM. Let .... Following [14], the Markov ...-type is defined by sliding a window of length ... along ... and counting frequencies. These frequencies are then used..."
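The sliding-window construction described in this context can be sketched as follows. This is a minimal illustration with a function name of my own choosing; Shields' actual scheme builds universal codes on top of these empirical window frequencies.

```python
from collections import Counter

def markov_type(x: str, k: int) -> dict[str, float]:
    """Empirical distribution of the length-k windows of x (the 'Markov
    k-type'): slide a window of length k along x and count frequencies."""
    counts = Counter(x[i:i + k] for i in range(len(x) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# Frequencies of the 2-windows of an alternating sequence.
print(markov_type("0101010", 2))
```

For a stationary ergodic source, these window frequencies converge to the true block probabilities, which is what lets a type-based code approach the optimal rate.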

4 | The proof of Levin's conjecture - Yang
Citation Context: "...he complexity distortion function (CDF), and state the conditions for the equivalence between the CDF and the RDF. General equivalences have been proposed in the lossless and lossy cases in [9], [11], [19], and [20]. In this paper, we simplify these results significantly and extend them with pragmatic considerations for the coding of finite objects. A formal proof of the equivalence is proposed in Sect..."

2 | An Introduction to Kolmogorov Complexity and its Applications, 2nd ed. - Li, Vitányi - 1997
Citation Context: "...We end this paper with three key points on the scope of CDT. First, restricting the decoding function to be computable does not reduce the performance of the system. In fact, the Church–Turing thesis [10] guarantees that any coding technique belongs to the set of computable functions, from traditional entropy approaches to modern approaches like fractal and model-based coding. These modern techniques..."

2 | Visual Communication and Image Processing, chapter MPEG-4: An Object-Based Multimedia Coding Standard - Puri, Eleftheriadis - 1998

2 | Algorithmic representation of visual information - Sow, Eleftheriadis - 1997

2 | Mises redux - Jeffrey - in Basic Problems in Methodology and ... - 1977
Citation Context: "...averages. In these situations, the stationary and ergodic assumptions are difficult to justify and it becomes difficult to attach a physical meaning to the measure-theoretical concept of probability [6]. Interestingly, Kolmogorov introduced in [8] the notion of Kolmogorov complexity to measure the amount of randomness in individual objects. He found the need to measure it from... See [1] for precise..."