Results 1 – 10 of 23
Minimum Description Length Induction, Bayesianism, and Kolmogorov Complexity
 IEEE Transactions on Information Theory
, 1998
Abstract

Cited by 67 (7 self)
The relationship between the Bayesian approach and the minimum description length approach is established. We sharpen and clarify the general modeling principles MDL and MML, abstracted as the ideal MDL principle and defined from Bayes's rule by means of Kolmogorov complexity. The basic condition under which the ideal principle should be applied is encapsulated as the Fundamental Inequality, which in broad terms states that the principle is valid when the data are random relative to every contemplated hypothesis, and these hypotheses are in turn random relative to the (universal) prior. Basically, the ideal principle states that the prior probability associated with the hypothesis should be given by the algorithmic universal probability, and that the sum of the negative log universal probability of the model and the negative log probability of the data given the model should be minimized. If we restrict the model class to the finite sets, application of the ideal principle turns into Kolmogorov's mi...
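When the model class is restricted to finite sets, the two-part code the abstract alludes to can be made concrete in a toy setting: among all candidate sets containing the data, pick the one minimizing L(model) + log2|model|. The sketch below uses a hypothetical model class of our own choosing (binary strings whose ones all lie in the first m positions); the actual ideal MDL principle is defined via the uncomputable algorithmic universal probability, which this does not implement.

```python
from math import log2

def two_part_bits(x: str, m: int) -> float:
    """Two-part code length for x under the finite-set model
    S_m = {binary strings of length len(x) whose ones lie in the first m positions}:
    L(model) ~ log2(n + 1) bits to specify m, plus
    L(x | model) = log2 |S_m| = m bits to index x inside S_m."""
    n = len(x)
    assert x.rfind("1") + 1 <= m <= n, "the model must contain x"
    return log2(n + 1) + m

x = "1011000000000000"
feasible = range(x.rfind("1") + 1, len(x) + 1)
best_m = min(feasible, key=lambda m: two_part_bits(x, m))
# the MDL choice is the smallest set containing x: here m = 4
```

Because the code length grows with m, the minimizing model is the smallest set that still contains the data, echoing the connection to Kolmogorov's minimal sufficient statistic mentioned at the end of the abstract.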
Next-Generation Content Representation, Creation and Searching for New Media Applications in Education
, 1998
Abstract

Cited by 25 (1 self)
Content creation, editing, and searching are extremely time-consuming tasks that often require substantial training and experience, especially when high-quality audio and video are involved. "New media" represents a new paradigm for multimedia information representation and processing, in which the emphasis is placed on the actual content. It thus brings the tasks of content creation and searching much closer to actual users and enables them to be active producers of audiovisual information rather than passive recipients. We discuss the state of the art and present next-generation techniques for content representation, searching, creation, and editing. We discuss our experiences in developing a Web-based distributed compressed-video editing and searching system (WebClip), a media representation language (Flavor) and an object-based video authoring system (Zest) based on it, and large image/video search engines for the World-Wide Web (WebSEEk and VideoQ). We also present a case study of new media applications based on specific planned multimedia education experiments with the above systems in several K-12 schools in Manhattan.
Rate-Distortion-Complexity Modeling for Network and Receiver Aware Adaptation
 IEEE Transactions on Multimedia
, 2005
Abstract

Cited by 17 (2 self)
Existing research on Universal Multimedia Access has mainly focused on adapting multimedia to the network characteristics while overlooking the receiver capabilities. Alternatively, part 7 of the MPEG-21 standard, entitled Digital Item Adaptation (DIA), defines description tools to guide the multimedia adaptation process based on both the network conditions and the available receiver resources. In this paper, we propose a new and generic rate-distortion-complexity model that can generate such DIA descriptions for image and video decoding algorithms running on various hardware architectures. The novelty of our approach is in virtualizing complexity, i.e., we explicitly model the complexity involved in decoding a bitstream by a generic receiver. This generic complexity is translated dynamically into "real" complexity, which is architecture-specific. The receivers can then negotiate with the media server/proxy the transmission of a bitstream having a desired complexity level based on their resource constraints. Hence, unlike in previous streaming systems, multimedia transmission can be optimized in an integrated rate-distortion-complexity setting by minimizing the incurred distortion under joint rate-complexity constraints.
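The negotiation described above can be sketched as a small feasibility search: the receiver converts the generic complexity of each advertised operating point into architecture-specific cycles and picks the least-distorted point within its rate and complexity budgets. The operating points, units, and conversion factor below are all hypothetical.

```python
# Hypothetical operating points advertised by the server:
# (rate in kbps, distortion as MSE, generic complexity units)
points = [
    (200, 40.0, 1.0e6),
    (400, 25.0, 2.5e6),
    (800, 15.0, 6.0e6),
    (1600, 9.0, 1.4e7),
]

def select(points, rate_budget, cycle_budget, cycles_per_unit):
    """Minimize distortion subject to joint rate and complexity constraints;
    generic complexity is translated into 'real', architecture-specific cycles."""
    feasible = [p for p in points
                if p[0] <= rate_budget and p[2] * cycles_per_unit <= cycle_budget]
    return min(feasible, key=lambda p: p[1], default=None)

choice = select(points, rate_budget=1000, cycle_budget=5e7, cycles_per_unit=8.0)
```

With these numbers the 1600 kbps point exceeds the rate budget, so the receiver settles for the 800 kbps point, the least-distorted feasible one.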
Rate distortion and denoising of individual data using Kolmogorov complexity
 IEEE Trans. Inform. Theory
, 2010
Abstract

Cited by 6 (1 self)
Abstract—We examine the structure of families of distortion balls from the perspective of Kolmogorov complexity. Special attention is paid to the canonical rate-distortion function of a source word, which returns the minimal Kolmogorov complexity of all distortion balls containing that word subject to a bound on their cardinality. This canonical rate-distortion function is related to the more standard algorithmic rate-distortion function for the given distortion measure. Examples are given of list distortion, Hamming distortion, and Euclidean distortion. The algorithmic rate-distortion function can behave differently from Shannon's rate-distortion function. To this end, we show that the canonical rate-distortion function can and does assume a wide class of shapes (unlike Shannon's); we relate low algorithmic mutual information to low Kolmogorov complexity (and consequently suggest that certain aspects of the mutual information formulation of Shannon's rate-distortion function behave differently than would an analogous formulation using algorithmic mutual information); and we explore the notion that low-Kolmogorov-complexity distortion balls containing a given word capture the interesting properties of that word (which is hard to formalize in Shannon's theory), which suggests an approach to denoising.

Index Terms—Algorithmic rate distortion, characterization, denoising, distortion families, fitness of destination words, individual data, Kolmogorov complexity, rate distortion, shapes of curves.

The distortion measure determines which aspects of the source word are relevant in the setting at hand and which aspects are irrelevant (such as noise). For example, applied to lossy compression of a sound file this results in a compressed file where, among others, the very high and very low inaudible frequencies have been suppressed. The distortion measure is chosen such that it penalizes the deletion of the inaudible frequencies only lightly, because they are not relevant for the auditory experience.

We study rate distortion of individual source words using Kolmogorov complexity and show how it is related to denoising. The classical probabilistic theory is reviewed in Appendix A. Computability notions are reviewed ...
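For very short strings, the Hamming-distortion case can be imitated by brute force, with compressed length standing in for Kolmogorov complexity: for each allowed distortion d, take the cheapest string in the Hamming ball of radius d around the source word. Everything here, including using zlib as a (crude, machine-dependent upper-bound) complexity proxy, is an illustrative assumption, not the paper's construction.

```python
import zlib
from itertools import combinations

def k_approx(s: str) -> int:
    """Compressed length in bits: a crude upper-bound proxy for K(s)."""
    return 8 * len(zlib.compress(s.encode(), 9))

def rate_at_distortion(x: str, d: int) -> int:
    """Minimal proxy complexity over the Hamming ball of radius d around x."""
    best = k_approx(x)                       # d = 0: the word itself
    for k in range(1, d + 1):
        for idx in combinations(range(len(x)), k):
            y = list(x)
            for i in idx:                    # flip the chosen k bits
                y[i] = "1" if y[i] == "0" else "0"
            best = min(best, k_approx("".join(y)))
    return best

x = "0101100101010110"
curve = [rate_at_distortion(x, d) for d in range(4)]
# allowing more distortion can only lower the required rate
```

Since each larger ball contains all smaller ones, the resulting curve is non-increasing by construction, mirroring the monotonicity of the algorithmic rate-distortion function.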
A Kolmogorov Complexity-based Genetic Programming Tool for String Compression
 in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2000), Darrell Whitley, David Goldberg, Erick Cantu-Paz, Lee Spector, Ian Parmee, and Hans-Georg Beyer, Eds., Las Vegas
, 2000
Abstract

Cited by 3 (0 self)
Following the guidelines set in one of our previous papers, in this paper we address the problem of estimating the Kolmogorov complexity of binary strings by means of a Genetic Programming approach. This consists of evolving a population of Lisp programs in search of the "optimal" program that generates a given string. By taking into account several target binary strings belonging to different formal languages, we show the effectiveness of our approach in obtaining an approximation from above of the Kolmogorov complexity function. Moreover, an adequate choice of "similar" target strings allows our system to exhibit very interesting computational strategies. Experimental results indicate that our tool achieves promising compression rates for binary strings belonging to formal languages. Furthermore, our method can work even for more complicated strings, provided that some degree of loss is accepted. These results constitute a first step in using Kolmogorov complexit...
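The search described above can be imitated, far more crudely, with plain random search over a toy program representation (literal-and-repeat pairs instead of evolved Lisp programs): any program that reproduces the target gives an upper bound on its complexity, exactly the "approximation from above" role the GP system plays. The program format, length accounting, and search budget below are all hypothetical.

```python
import random

random.seed(0)

def run(prog):
    """Execute a toy program: a list of (literal, repeat-count) pairs."""
    return "".join(lit * rep for lit, rep in prog)

def prog_len(prog):
    """Description length in symbols: literal characters plus a flat
    2-symbol charge per repeat count (an arbitrary accounting choice)."""
    return sum(len(lit) + 2 for lit, rep in prog)

def random_prog(alphabet="01", max_pairs=3, max_lit=4, max_rep=16):
    """Sample a random toy program."""
    return [("".join(random.choice(alphabet)
                     for _ in range(random.randint(1, max_lit))),
             random.randint(1, max_rep))
            for _ in range(random.randint(1, max_pairs))]

def estimate_k(target, trials=20_000):
    """Random search (standing in for the GP search) for the shortest toy
    program generating `target`; the result is an upper bound only."""
    best = len(target) + 2          # fallback: quote the string literally
    for _ in range(trials):
        p = random_prog()
        if run(p) == target:
            best = min(best, prog_len(p))
    return best
```

A highly regular target such as "01" * 8 admits a short generating program, e.g. ("01", 8), so the search typically returns a bound well below the literal-quoting fallback, whereas an irregular string of the same length usually does not.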
Satellite Image Artifacts Detection Based on Complexity Distortion Theory
 IEEE International Geoscience and Remote Sensing Symposium – IGARSS 2011, Vancouver
, 2011
Abstract

Cited by 2 (2 self)
Artifact detection is a step of the data-cleaning process. The classical approach is to predict or determine the existence of defects, to model them, and then to design a method to detect and correct them; such an approach is tailored to specific artifacts. The approach presented in this work uses Complexity Distortion Theory to implement a more generic method; thus, this work aims at developing parameter-free methods able to automatically detect artifacts in EO images. We use the Kolmogorov Structure Function as an approximation to the rate-distortion curve and examine how the artifacts can have the same structure.
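One way to sketch the idea: approximate a structure-function-like curve by compressing the signal under progressively coarser quantization; a line carrying an artifact remains complex at every quantization level, while a regular line collapses quickly. The synthetic "image lines", the use of zlib as a complexity proxy, and the quantization schedule are assumptions for illustration only, not the paper's method.

```python
import random
import zlib

def complexity_curve(values, steps=(1, 2, 4, 8, 16)):
    """Compressed size (bytes) of the signal quantized with step q, for each q:
    a crude stand-in for a Kolmogorov-structure-function / rate-distortion curve."""
    return [len(zlib.compress(bytes(v // q for v in values), 9)) for q in steps]

# Synthetic clean line: smooth periodic intensities in 0..31.
clean = [(i * 7) % 32 for i in range(512)]

# The same line with a 64-pixel random burst (a simulated artifact).
random.seed(1)
corrupt = clean[:256] + [random.randrange(256) for _ in range(64)] + clean[320:]

c_clean, c_corrupt = complexity_curve(clean), complexity_curve(corrupt)
# the corrupted line stays complex even under coarse quantization
```

A simple detector could flag lines whose curve stays above a reference curve at every quantization step, which is parameter-light in the spirit of the abstract.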
Representing Information with Computational Resource Bounds
 in Proc. Asilomar Conf. Signals, Systems, Computers, Asilomar, CA
, 1998
Abstract

Cited by 2 (1 self)
A general framework for data compression, in which computational resource bounds are introduced at both the encoding and the decoding end, is presented. We move away from Shannon's traditional communication system by introducing some structure at the decoder, modeling it as a Turing machine with finite computational resources. Information is measured using resource-bounded Kolmogorov complexity. In this setting, we investigate the design of efficient lossy encoders.

1 Introduction

The problems of quantifying, representing, and transmitting information were addressed in 1948 by C. E. Shannon [10] in a pure communication setting. The result of this work is Information Theory (IT), a mathematical basis formalizing the communication problem between a sender and a receiver. In this framework, the meaning of the message is irrelevant and completely ignored. The main question is to find an efficient representation of the output of a stochastic information source. This representation mus...
Rate-distortion theory for individual data
, 2004
Abstract

Cited by 2 (1 self)
Abstract — We develop rate-distortion theory for individual data with respect to general distortion measures, that is, a theory of lossy compression of individual data. This is applied to Euclidean distortion, Hamming distortion, Kolmogorov distortion, and Shannon-Fano distortion. We show that in all these cases, for every function satisfying the obvious constraints there are data that have this function as their individual rate-distortion function. Shannon's distortion-rate function over a random source is shown to be the pointwise asymptotic expectation of the individual distortion-rate functions we have defined. The great differences we establish in the distortion-rate functions of individual nonrandom data (that is, the aspects important to lossy compression) were previously invisible and obliterated in the Shannon theory. The techniques are based on Kolmogorov complexity.
Algorithmic representation of visual information
, 2000
Abstract

Cited by 2 (1 self)
This thesis presents new perspectives on media representation and addresses fundamental source coding problems outside the umbrella of traditional information theory, namely, the representation of finite individual objects with a finite amount of computational resources. We start by proposing a new theory, Complexity Distortion Theory, which uses programmatic descriptions to provide a mathematical framework where these problems can be addressed. The key component of this theory is the substitution of the decoder in Shannon's communication system by a computer. The mathematical framework for examining issues of efficiency is then Kolmogorov Complexity Theory. Complexity Distortion Theory extends this framework to include distortion by defining the complexity distortion function, the analogue of the rate-distortion function in this algorithmic setting. We show that this information measure predicts asymptotically the same results as the classical probabilistic information measures for stationary and ergodic sources. These equivalences ...
Entropy and Quantum Kolmogorov Complexity: A Quantum Brudno’s Theorem
 Communications in Mathematical Physics
, 2006
Abstract

Cited by 1 (1 self)
In classical information theory, entropy rate and algorithmic complexity per symbol are related by a theorem of Brudno. In this paper, we prove a quantum version of this theorem, connecting the von Neumann entropy rate and two notions of quantum Kolmogorov complexity, both based on the shortest qubit descriptions of qubit strings that, run by a universal quantum Turing machine, reproduce them as outputs.
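Brudno-type statements are easy to probe empirically in the classical case: the per-symbol compressed length of a long sample (a computable upper-bound proxy for complexity per symbol) tracks the entropy rate of the source. The sketch below uses an i.i.d. Bernoulli source and zlib; both choices are illustrative assumptions, and zlib stays some way above the entropy since it is far from an optimal coder.

```python
import random
import zlib
from math import log2

random.seed(42)

def entropy(p):
    """Binary entropy h(p) in bits per symbol."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bits_per_symbol(p, n=200_000):
    """Compressed length per symbol of an n-symbol Bernoulli(p) sample,
    stored one symbol per byte: a computable proxy for complexity per symbol."""
    sample = bytes(int(random.random() < p) for _ in range(n))
    return 8 * len(zlib.compress(sample, 9)) / n

rates = [bits_per_symbol(p) for p in (0.02, 0.2, 0.5)]
# the per-symbol rate grows with the source entropy h(p)
```

For comparison, h(0.02) ≈ 0.14, h(0.2) ≈ 0.72, and h(0.5) = 1 bit per symbol; the compressed rates preserve this ordering even though the coder is suboptimal.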