Results 1–10 of 110
Error Control and Concealment for Video Communication: A Review
Proceedings of the IEEE, 1998
"... The problem of error control and concealment in video communication is becoming increasingly important because of the growing interest in video delivery over unreliable channels such as wireless networks and the Internet. This paper reviews the techniques that have been developed for error control a ..."
Abstract

Cited by 436 (13 self)
 Add to MetaCart
The problem of error control and concealment in video communication is becoming increasingly important because of the growing interest in video delivery over unreliable channels such as wireless networks and the Internet. This paper reviews the techniques that have been developed for error control and concealment in the past ten to fifteen years. These techniques are described in three categories according to the roles that the encoder and decoder play in the underlying approaches. Forward error concealment includes methods that add redundancy at the source end to enhance error resilience of the coded bit streams. Error concealment by postprocessing refers to operations at the decoder to recover the damaged areas based on characteristics of image and video signals. Finally, interactive error concealment covers techniques that are dependent on a dialog between the source and destination. Both current research activities and practice in international standards are covered.
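The error-concealment-by-postprocessing category above can be illustrated with a minimal spatial-interpolation sketch. All names, the distance weights, and the 8x8 block size are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of error concealment by postprocessing: a lost 8x8
# block is estimated from the boundary pixels of its intact neighbours.
# Each interior pixel is a distance-weighted average of the four boundary
# pixels in its row and column (a common spatial-interpolation rule).

def conceal_block(left, right, top, bottom, size=8):
    """Fill a lost size x size block from its four neighbours.

    left, right: lists of `size` pixels from the adjacent columns.
    top, bottom: lists of `size` pixels from the adjacent rows.
    """
    block = [[0.0] * size for _ in range(size)]
    for i in range(size):            # row index
        for j in range(size):        # column index
            wl, wr = size - j, j + 1  # weights favour the nearer boundary
            wt, wb = size - i, i + 1
            block[i][j] = (wl * left[i] + wr * right[i] +
                           wt * top[j] + wb * bottom[j]) / (wl + wr + wt + wb)
    return block
```

A flat neighbourhood reproduces itself exactly; edges and gradients crossing the block are interpolated smoothly, which is why such rules work well for small damaged regions.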
Lossy Source Coding
IEEE Trans. Inform. Theory, 1998
"... Lossy coding of speech, highquality audio, still images, and video is commonplace today. However, in 1948, few lossy compression systems were in service. Shannon introduced and developed the theory of source coding with a fidelity criterion, also called ratedistortion theory. For the first 25 year ..."
Abstract

Cited by 104 (1 self)
 Add to MetaCart
(Show Context)
Lossy coding of speech, high-quality audio, still images, and video is commonplace today. However, in 1948, few lossy compression systems were in service. Shannon introduced and developed the theory of source coding with a fidelity criterion, also called rate-distortion theory. For the first 25 years of its existence, rate-distortion theory had relatively little impact on the methods and systems actually used to compress real sources. Today, however, rate-distortion theoretic concepts are an important component of many lossy compression techniques and standards. We chronicle the development of rate-distortion theory and provide an overview of its influence on the practice of lossy source coding. Index Terms: Data compression, image coding, speech coding, rate-distortion theory, signal coding, source coding with a fidelity criterion, video coding.
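For context, the central object of the theory surveyed above can be stated in one line; this is the standard textbook form, not a formula taken from the paper:

```latex
% Rate-distortion function: least rate achieving expected distortion <= D
R(D) = \min_{p(\hat{x}\mid x)\,:\;\mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X})

% Shannon's closed form for a Gaussian source X ~ N(0, \sigma^2)
% under squared-error distortion:
R(D) = \tfrac{1}{2}\log_2\frac{\sigma^2}{D}, \qquad 0 < D \le \sigma^2
```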
Multiple Description Coding Using Pairwise Correlating Transforms
IEEE Trans. Image Processing, 1999
"... The objective of multiple description coding (MDC) is to encode a source into two (or more) bitstreams supporting two quality levels of decoding. A highquality reconstruction should be decodable from the two bitstreams together, while lower, but still acceptable, quality reconstructions should b ..."
Abstract

Cited by 93 (1 self)
 Add to MetaCart
(Show Context)
The objective of multiple description coding (MDC) is to encode a source into two (or more) bitstreams supporting two quality levels of decoding. A high-quality reconstruction should be decodable from the two bitstreams together, while lower, but still acceptable, quality reconstructions should be decodable from either of the two individual bitstreams. This paper describes techniques for meeting MDC objectives in the framework of standard transform-based image coding through the design of pairwise transforms.
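The two-description idea can be sketched with the simplest pairwise correlating transform, a 45-degree rotation of a coefficient pair. This is an illustrative assumption for exposition; the paper designs a whole family of such transforms:

```python
import math

# Sketch: C = (A + B)/sqrt(2) goes on channel 1, D = (A - B)/sqrt(2) on
# channel 2. Both descriptions together invert exactly; one description
# alone still carries information about both A and B, because C and D
# are each correlated with the pair.

def encode_pair(a, b):
    s = 1.0 / math.sqrt(2.0)
    return s * (a + b), s * (a - b)     # description 1, description 2

def decode_both(c, d):
    s = 1.0 / math.sqrt(2.0)
    return s * (c + d), s * (c - d)     # exact central reconstruction

def decode_one(c):
    # Side decoder: assuming A and B are independent, zero-mean, and of
    # equal variance, the MMSE estimate of each given C alone is C/sqrt(2).
    s = 1.0 / math.sqrt(2.0)
    return s * c, s * c
```

The design trade-off the paper studies is exactly this: the rotation injects controlled correlation between the two descriptions, paying some rate to buy graceful degradation when one channel fails.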
An extremal inequality motivated by multiterminal information theoretic problems
2006
"... We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel and the distributed source coding with a single quadratic distortion constraint problem. As a corollary, this inequality yields a generalization of the classical vector entropypower inequality (EPI). As another c ..."
Abstract

Cited by 82 (4 self)
 Add to MetaCart
(Show Context)
We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel and the problem of distributed source coding with a single quadratic distortion constraint. As a corollary, this inequality yields a generalization of the classical vector entropy-power inequality (EPI). As another corollary, it sheds light on maximizing the differential entropy of the sum of two jointly distributed random variables.
Video coding for streaming media delivery on the Internet
 IEEE Transactions on Circuits and Systems for Video Technology
"... ..."
(Show Context)
Multiple Description Wavelet Based Image Coding
1998
"... We consider the problem of coding images for transmission over errorprone channels. The impairments we target are transient channel shutdowns, as would occur in a packet network when a packet is lost, or in a wireless system during a deep fade: when data is delivered it is assumed to be errorfree, ..."
Abstract

Cited by 80 (8 self)
 Add to MetaCart
We consider the problem of coding images for transmission over error-prone channels. The impairments we target are transient channel shutdowns, as would occur in a packet network when a packet is lost, or in a wireless system during a deep fade: when data is delivered it is assumed to be error-free, but some of the data may never reach the receiver. The proposed algorithms are based on a combination of multiple description scalar quantizers with techniques successfully applied to the construction of some of the most efficient subband coders. A given image is encoded into multiple independent packets of roughly equal length. When packets are lost, the quality of the approximation computed at the receiver depends only on the number of packets received, but does not depend on exactly which packets are actually received. When compared with previously reported results on the performance of robust image coders based on multiple descriptions, on standard test images, our coders attain s...
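The "quality depends only on the number of packets received" property can be illustrated with a toy round-robin packetizer. The helper names are hypothetical; the actual coder combines multiple description scalar quantizers with subband coding rather than simple interleaving:

```python
# Toy sketch of the balanced-packet idea: transform coefficients are spread
# round-robin across n packets, so any lost packet removes an evenly
# distributed subset, and reconstruction quality degrades with the number
# of losses rather than with which packets were lost.

def packetize(coeffs, n_packets):
    packets = [[] for _ in range(n_packets)]
    for i, c in enumerate(coeffs):
        packets[i % n_packets].append((i, c))   # keep index for reassembly
    return packets

def reconstruct(received, length):
    out = [0.0] * length                         # missing coefficients -> 0
    for pkt in received:
        for i, c in pkt:
            out[i] = c
    return out
```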
n-channel symmetric multiple descriptions, part I: (n, k) source-channel erasure codes
IEEE Trans. Inform. Theory, 2004
"... ..."
Generalized entropy power inequalities and monotonicity properties of information
IEEE Trans. Inform. Theory, 2007
"... New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary ..."
Abstract

Cited by 53 (7 self)
 Add to MetaCart
(Show Context)
New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of i.i.d. summands as well as in the more general setting of independent summands with variance-standardized sums.
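For reference, the classical inequality being generalized, and the monotonicity statement it implies, in standard form (not the paper's notation):

```latex
% Entropy power of a random vector X in R^n:
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}

% Classical entropy-power inequality (EPI) for independent X_1, X_2:
N(X_1 + X_2) \;\ge\; N(X_1) + N(X_2)

% Monotonicity of information in the CLT (i.i.d. case),
% with S_n = \sum_{i=1}^{n} X_i:
h\!\left(S_n/\sqrt{n}\right) \;\le\; h\!\left(S_{n+1}/\sqrt{n+1}\right)
```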
Sourcechannel diversity for parallel channels
IEEE Transactions on Information Theory, 2005
"... We consider transmitting a source across a pair of independent, nonergodic channels with random states (e.g., slowfading channels) so as to minimize the average distortion. The general problem is unsolved. Hence, we focus on comparing two commonly used source and channel encoding systems which corr ..."
Abstract

Cited by 49 (5 self)
 Add to MetaCart
(Show Context)
We consider transmitting a source across a pair of independent, nonergodic channels with random states (e.g., slow-fading channels) so as to minimize the average distortion. The general problem is unsolved. Hence, we focus on comparing two commonly used source and channel encoding systems which correspond to exploiting diversity either at the physical layer through parallel channel coding or at the application layer through multiple description (MD) source coding. For on–off channel models, source coding diversity offers better performance. For channels with a continuous range of reception quality, we show the reverse is true. Specifically, we introduce a new figure of merit called the distortion exponent which measures how fast the average distortion decays with signal-to-noise ratio. For continuous-state models such as additive white Gaussian noise (AWGN) channels with multiplicative Rayleigh fading, optimal channel coding diversity at the physical layer is more efficient than source coding diversity at the application layer in that the former achieves a better distortion exponent. Finally, we consider a third decoding architecture: MD encoding with joint source–channel decoding. We show that this architecture achieves the same distortion exponent as systems with optimal channel coding diversity for continuous-state channels, and maintains the advantages of MD systems for on–off channels. Thus, the MD system with joint decoding achieves the best performance from among the three architectures considered, on both continuous-state and on–off channels.
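The figure of merit named above has a standard high-SNR definition, reproduced here for context (textbook form, not quoted from the paper):

```latex
% Distortion exponent: decay rate of the expected end-to-end distortion
% with signal-to-noise ratio; larger \Delta means faster decay.
\Delta \;=\; -\lim_{\mathrm{SNR}\to\infty}
  \frac{\log \mathbb{E}\!\left[D(\mathrm{SNR})\right]}{\log \mathrm{SNR}}
```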