Results 1–4 of 4
The Information Lost in Erasures, 2008
We consider sources and channels with memory observed through erasure channels. In particular, we examine the impact of sporadic erasures on the fundamental limits of lossless data compression, lossy data compression, channel coding, and denoising. We define the erasure entropy of a collection of random variables as the sum of entropies of the individual variables conditioned on all the rest. The erasure entropy measures the information content carried by each symbol given its context. The erasure entropy rate is shown to be the minimum number of bits per erasure required to recover the lost information in the limit of small erasure probability. When we allow recovery of the erased symbols within a prescribed degree of distortion, the fundamental tradeoff is described by the erasure rate–distortion function, which we characterize. We show that in the regime of sporadic erasures, knowledge at the encoder of the erasure locations does not lower the rate required to achieve a given distortion. When no additional encoded information is available, the erased information is reconstructed solely on the basis of its context by a denoiser. Connections between erasure entropy and discrete denoising are developed. The decrease of the capacity of channels with memory due to sporadic memoryless erasures is also characterized in wide generality.
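As a toy illustration of the definition above (our own sketch, not from the paper): for a stationary binary symmetric Markov chain with flip probability p, the entropy rate is H(X0 | X-1), while the erasure entropy rate conditions on both neighbors, H(X0 | X-1, X1), and is never larger — conditioning on the future as well as the past can only reduce uncertainty.

```python
from math import log2

def h2(q):
    """Binary entropy in bits."""
    return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

def rates(p):
    """Entropy rate and erasure entropy rate of a stationary binary
    symmetric Markov chain with flip probability p (uniform marginals).
    For a first-order chain, conditioning on the two neighbors is
    equivalent to conditioning on the entire past and future."""
    entropy_rate = h2(p)                # H(X0 | X-1)
    s = (1 - p) ** 2 + p ** 2           # P(X-1 == X1)
    # If the neighbors agree, X0 matches them w.p. (1-p)^2 / s;
    # if they disagree, X0 is uniform (exactly one flip on either side).
    erasure_rate = s * h2((1 - p) ** 2 / s) + (1 - s) * 1.0
    return entropy_rate, erasure_rate

er, eer = rates(0.1)
print(er, eer)  # the erasure entropy rate is the smaller of the two
```

For p = 1/2 the chain is i.i.d. uniform and the two rates coincide at 1 bit; for any p with memory, the erasure entropy rate is strictly smaller, reflecting how much the context helps in recovering an erased symbol.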
On the information rates of the plenoptic function, in ICIP, 2006
The plenoptic function describes the visual information available to an observer at any point in space and time. Samples of the plenoptic function (POF) are seen in video and in general visual content (images, mosaics, panoramic scenes, etc.), and represent large amounts of information. In this paper we propose a stochastic model to study the compression limits of a simplified version of the plenoptic function. In the proposed framework, we isolate the two fundamental sources of information in the POF: one representing the camera motion and the other representing the information complexity of the “reality” being acquired and transmitted. The sources of information are combined, generating a stochastic process that we study in detail. We first propose a model for ensembles of realities that do not change over time. The proposed model is simple in that it enables us to derive precise coding bounds in the information-theoretic sense that are sharp in a number of cases of practical interest. For this simple case of static realities and camera motion, our results indicate that coding practice is in accordance with optimal coding from an information-theoretic standpoint. The model is further extended to account for visual realities that change over time. We derive bounds
Directed Information, Causal Estimation, and Communication in Continuous Time
Abstract—The notion of directed information is introduced for stochastic processes in continuous time. Properties and operational interpretations are presented for this notion of directed information, which generalizes mutual information between stochastic processes in a similar manner as Massey’s original notion of directed information generalizes Shannon’s mutual information in the discrete-time setting. As a key application, Duncan’s theorem is generalized to estimation problems in which the evolution of the target signal is affected by the past channel noise, and the causal minimum mean squared error estimation is related to directed information from the target signal to the observation corrupted by additive white Gaussian noise. An analogous relationship holds for the Poisson channel. The role of directed information in characterizing the fundamental limit on reliable communication for a wide class of continuous-time channels with feedback is also discussed.
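For reference (our own gloss, in standard notation rather than the paper’s), Massey’s discrete-time directed information replaces the non-causal conditioning on the full input block in the chain-rule expansion of mutual information with causal conditioning on the inputs seen so far:

```latex
% Mutual information, expanded by the chain rule over output symbols:
I(X^n; Y^n) = \sum_{i=1}^{n} I(X^n;\, Y_i \mid Y^{i-1})
% Massey's directed information: the i-th term conditions only on the
% causally available inputs X^i, not on the whole block X^n:
I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i;\, Y_i \mid Y^{i-1})
% Duncan's theorem (unit-SNR AWGN channel, no feedback) relates mutual
% information to the causal estimation error:
I(X_0^T; Y_0^T) = \tfrac{1}{2}\int_0^T \mathbb{E}\!\left[\bigl(X_t - \mathbb{E}[X_t \mid Y_0^t]\bigr)^2\right] dt
```

Without feedback the two discrete-time quantities coincide; with feedback, directed information is the one that survives as the operationally meaningful measure, which is what the generalization of Duncan’s theorem described in the abstract exploits.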
Directed Information, Causal Estimation, and Communication in Continuous Time
Abstract—A notion of directed information between two continuous-time processes is proposed. A key component in the definition is taking an infimum over all possible partitions of the time interval, which plays a role no less significant than the supremum over “space” partitions inherent in the definition of mutual information. Properties and operational interpretations in estimation and communication are then established for the proposed notion of directed information. For the continuous-time additive white Gaussian noise channel, it is shown that Duncan’s classical relationship between causal estimation error and mutual information continues to hold in the presence of feedback upon replacing mutual information by directed information. A parallel result is established for the Poisson channel. The utility of this relationship is demonstrated in computing the directed information rate between the input and output processes of a continuous-time Poisson channel with feedback, where the channel input process is constrained to be constant between events at the channel output. Finally, the capacity of a wide class of continuous-time channels with feedback is established via directed information, characterizing the fundamental limit on reliable communication. Index Terms—Causal estimation, conditional mutual information, continuous time, directed information, Duncan’s theorem, feedback capacity, Gaussian channel, Poisson channel, time partition.
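Under our reading of the abstract (notation ours, a sketch rather than the paper’s exact definition), the partition-based construction can be written as an infimum over time partitions 0 = t_0 < t_1 < … < t_n = T, with each term a causally conditioned mutual information over one subinterval:

```latex
I\bigl(X_0^T \to Y_0^T\bigr)
  = \inf_{0 = t_0 < t_1 < \cdots < t_n = T}
    \sum_{i=1}^{n} I\bigl(X_0^{t_i};\; Y_{t_{i-1}}^{t_i} \,\bigm|\, Y_0^{t_{i-1}}\bigr)
```

The infimum over time partitions mirrors, in the causal direction, the supremum over quantizations of the sample space that underlies the usual definition of mutual information for continuous alphabets, which is the contrast the abstract draws.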