Results 1–10 of 81
Error Control and Concealment for Video Communication: A Review
PROCEEDINGS OF THE IEEE, 1998
Abstract

Cited by 436 (13 self)
The problem of error control and concealment in video communication is becoming increasingly important because of the growing interest in video delivery over unreliable channels such as wireless networks and the Internet. This paper reviews the techniques that have been developed for error control and concealment in the past ten to fifteen years. These techniques are described in three categories according to the roles that the encoder and decoder play in the underlying approaches. Forward error concealment includes methods that add redundancy at the source end to enhance error resilience of the coded bit streams. Error concealment by postprocessing refers to operations at the decoder to recover the damaged areas based on characteristics of image and video signals. Finally, interactive error concealment covers techniques that are dependent on a dialog between the source and destination. Both current research activities and practice in international standards are covered.
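The postprocessing category can be illustrated with a minimal sketch: when a block of a frame is lost, the decoder can fill it by interpolating between the intact rows above and below it. The helper below is hypothetical and far simpler than the spatial and temporal concealment schemes the review surveys.

```python
import numpy as np

def conceal_block(frame, top, left, size):
    """Fill a lost size x size block by interpolating linearly between
    the intact row above and the intact row below it."""
    above = frame[top - 1, left:left + size]
    below = frame[top + size, left:left + size]
    for i in range(size):
        w = (i + 1) / (size + 1)                  # 0 near the top, 1 near bottom
        frame[top + i, left:left + size] = (1 - w) * above + w * below
    return frame

# A smooth vertical gradient with an 8x8 block wiped out by "channel errors".
img = np.tile(np.linspace(0, 255, 32), (32, 1)).T
img[12:20, 12:20] = 0
concealed = conceal_block(img.copy(), 12, 12, 8)
```

Because the test image varies linearly from top to bottom, the interpolation recovers the lost block exactly; on real video the residual error depends on how smooth the underlying content is.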
Fading Channels: Information-Theoretic and Communications Aspects
IEEE TRANSACTIONS ON INFORMATION THEORY, 1998
Abstract

Cited by 416 (3 self)
In this paper we review the most peculiar and interesting information-theoretic and communications features of fading channels. We first describe the statistical models of fading channels which are frequently used in the analysis and design of communication systems. Next, we focus on the information theory of fading channels, emphasizing capacity as the most important performance measure. Both single-user and multiuser transmission are examined. Further, we describe how the structure of fading channels impacts code design, and finally we overview equalization of fading multipath channels.
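The capacity penalty of fading is easy to check numerically: under Rayleigh fading the ergodic capacity is E[log2(1 + SNR·|h|²)], which by Jensen's inequality sits below the unfaded AWGN capacity at the same average SNR. A Monte Carlo sketch, assuming a unit-variance channel gain and channel knowledge at the receiver:

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 10.0                                    # average receive SNR (linear)

# Rayleigh fading: h ~ CN(0, 1), so the power gain |h|^2 is Exp(1).
gain = rng.exponential(1.0, size=200_000)
c_ergodic = np.mean(np.log2(1 + snr * gain))  # ergodic capacity, bits/use
c_awgn = np.log2(1 + snr)                     # same SNR without fading
```

At 10 dB SNR the gap is roughly half a bit per channel use, and it widens as the SNR grows.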
Systems with finite communication bandwidth constraints—I: State estimation problems
Stanford University, Stanford, CA, 1997
Abstract

Cited by 209 (5 self)
Abstract—In this paper a new class of feedback control problems is introduced. Unlike classical models, the systems considered here have communication channel constraints. As a result, the issue of coding and communication protocol becomes an integral part of the analysis. Since these systems cannot be asymptotically stabilized if the underlying dynamics are unstable, a weaker stability concept called containability is introduced. A key result connects containability with an inequality involving the communication data rate and the rate of change of the state. Index Terms—Asymptotic stability, containability, feedback control, Kraft inequality.
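For a scalar plant x_{k+1} = a·x_k + u_k the data-rate idea can be made concrete: each channel use splits the encoder's uncertainty interval into 2^R cells while the dynamics stretch it by |a|, so the interval contracts exactly when 2^R > |a|, i.e. R > log2|a|. A hypothetical zoom-quantizer sketch (not the paper's construction):

```python
a = 2.5      # unstable open-loop gain, |a| > 1
R = 2        # bits per channel use; 2**R = 4 > a, i.e. R > log2(a)

# Zoom quantizer: encoder and controller share an interval [lo, hi] known
# to contain the state; each step the encoder sends which of the 2**R
# subintervals the state lies in, and the control cancels its midpoint.
lo, hi = -1.0, 1.0
x = 0.7
for _ in range(30):
    width = (hi - lo) / 2 ** R
    i = min(int((x - lo) / width), 2 ** R - 1)   # index sent over the channel
    mid = lo + i * width + width / 2             # midpoint of the chosen cell
    x = a * x - a * mid                          # plant update with u = -a*mid
    half = a * width / 2                         # dynamics stretch the cell by a
    lo, hi = -half, half                         # new uncertainty interval
# Each step the interval width scales by a / 2**R = 0.625 < 1,
# so the state stays contained and is driven toward zero.
```

With R = 1 the same loop would have the interval grow by a factor 1.25 per step, matching the necessity side of the rate condition.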
To Code, or Not to Code: Lossy Source-Channel Communication Revisited
IEEE TRANS. INFORM. THEORY, 2003
Abstract

Cited by 160 (7 self)
What makes a source-channel communication system optimal? It is shown that in order to achieve an optimal cost-distortion tradeoff, the source and the channel have to be matched in a probabilistic sense. The match (or lack of it) involves the source distribution, the distortion measure, the channel conditional distribution, and the channel input cost function. Closed-form necessary and sufficient expressions relating the above entities are given. This generalizes both the separation-based approach as well as the two well-known examples of optimal uncoded communication. The condition of ...
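One of the two well-known uncoded examples can be verified by simulation: a unit-variance Gaussian source transmitted uncoded over an AWGN channel, with MMSE scaling at the receiver, already meets the Shannon limit D = 1/(1 + SNR). The sketch assumes unit noise variance:

```python
import numpy as np

rng = np.random.default_rng(1)
n, snr = 200_000, 4.0                     # channel SNR = input power / noise power

s = rng.normal(size=n)                    # unit-variance Gaussian source
x = np.sqrt(snr) * s                      # uncoded: just scale to power snr
y = x + rng.normal(size=n)                # AWGN with unit noise variance
s_hat = (np.sqrt(snr) / (1 + snr)) * y    # MMSE receiver scaling

d_uncoded = np.mean((s - s_hat) ** 2)     # achieved distortion
d_shannon = 1.0 / (1 + snr)               # best possible with any code
```

No block coding or delay is involved: the Gaussian source, squared-error distortion, AWGN channel, and power constraint are probabilistically matched in exactly the sense the paper characterizes.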
Stochastic Linear Control over a Communication Channel, 2003
Abstract

Cited by 84 (9 self)
We examine linear stochastic control systems when there is a communication channel connecting the sensor to the controller. The problem consists of designing the channel encoder and decoder as well as the controller to satisfy given control objectives. In particular, we examine the role communication plays in the classical LQG problem. We give conditions under which the classical separation property between estimation and control holds and the certainty-equivalent control law is optimal. We then present the sequential rate-distortion framework. We present bounds on the achievable performance and show the inherent tradeoffs between control and communication costs. In particular, we show that the optimal quadratic cost decomposes into two terms: a full-knowledge cost and a sequential rate-distortion cost.
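The flavor of this decomposition can be seen in a toy experiment: a certainty-equivalent LQR fed a quantized state pays roughly the full-knowledge LQG cost plus a term that grows with the channel's distortion. A sketch with a hypothetical uniform quantizer standing in for the encoder/decoder pair (not the paper's sequential rate-distortion construction):

```python
import numpy as np

rng = np.random.default_rng(2)
a, q, r = 1.2, 1.0, 1.0      # scalar plant x+ = a*x + u + w; cost q*x^2 + r*u^2

# Steady-state solution of the scalar Riccati equation, and the LQR gain.
p = 1.0
for _ in range(500):
    p = q + a * a * p - (a * p) ** 2 / (r + p)
k = a * p / (r + p)

def avg_cost(step, n=20_000):
    """Average cost of certainty-equivalent control when the sensor sends
    the state through a uniform quantizer of the given step size."""
    x, cost = 0.0, 0.0
    for _ in range(n):
        x_hat = round(x / step) * step   # what crosses the channel
        u = -k * x_hat                   # controller acts on the estimate
        cost += q * x * x + r * u * u
        x = a * x + u + rng.normal(0.0, 0.1)
    return cost / n

j_fine, j_coarse = avg_cost(0.01), avg_cost(0.5)
# Coarser quantization (a worse channel) inflates the average cost above
# the full-knowledge LQG cost; the gap plays the role of the sequential
# rate-distortion term.
```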
Hadamard-Based Soft Decoding for Vector Quantization over Noisy Channels, 1998
Abstract

Cited by 56 (12 self)
We present an estimator-based, or soft, vector quantizer decoder for communication over a noisy channel. The decoder is optimal according to the mean-square error criterion, and Hadamard-based in the sense that a Hadamard transform representation of the vector quantizer is utilized in the implementation of the decoder. An efficient algorithm for optimal decoding is derived. We furthermore investigate suboptimal versions of the decoder, providing good performance at lower complexity. The issue of joint encoder-decoder design is considered both for optimal and suboptimal decoding. Results regarding the channel distortion and the structure of a channel-robust code are also provided. Through numerical simulations, soft decoding is demonstrated to outperform hard decoding in several aspects.
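The soft-versus-hard comparison can be reproduced in miniature. The sketch below computes the posterior over codevectors directly (it omits the paper's Hadamard-transform machinery, which is about doing this efficiently) and compares the posterior-mean decoder with a hard MAP decoder over a binary symmetric channel:

```python
import numpy as np

rng = np.random.default_rng(3)
codebook = np.array([-1.5, -0.5, 0.5, 1.5])        # 2-bit scalar quantizer
bits = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # index-to-bits mapping
eps = 0.2                                          # BSC crossover probability

src = rng.normal(size=20_000)
idx = np.abs(src[:, None] - codebook[None, :]).argmin(axis=1)   # encode
flip = (rng.random((20_000, 2)) < eps).astype(int)
recv = bits[idx] ^ flip                                         # noisy bits

# Posterior P(index | received bits): BSC likelihood times empirical prior.
nflips = (bits[None, :, :] != recv[:, None, :]).sum(axis=2)
prior = np.bincount(idx, minlength=4) / len(idx)
lik = (eps ** nflips) * ((1 - eps) ** (2 - nflips)) * prior
post = lik / lik.sum(axis=1, keepdims=True)

x_soft = post @ codebook               # soft decoder: posterior mean
x_hard = codebook[lik.argmax(axis=1)]  # hard decoder: single best index
mse_soft = np.mean((src - x_soft) ** 2)
mse_hard = np.mean((src - x_hard) ** 2)
```

Averaging over the posterior instead of committing to one index hedges against bit errors, which is why the soft decoder's mean-square error is lower at this noise level.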
Finite State Channels with Time-Invariant Deterministic Feedback
Abstract

Cited by 51 (26 self)
We consider the capacity of discrete-time channels with feedback for the general case where the feedback is a time-invariant deterministic function of the output samples. Under the assumption that the channel states take values in a finite alphabet, we find a sequence of achievable rates and a sequence of upper bounds on the capacity. The achievable rates and the upper bounds are computable for any N, and the limits of the sequences exist. We show that when the probability of the initial state is positive for all the channel states, then the capacity is the limit of the achievable-rate sequence. We further show that when the channel is stationary, indecomposable and has no intersymbol interference (ISI), its capacity is given by the limit of the maximum of the (normalized) directed information between the input X^N and the output Y^N, i.e., C = lim_{N→∞} (1/N) max I(X^N → Y^N), where the maximization is taken over the causal conditioning probability Q(x^N‖z^{N−1}) defined in this paper. The main idea for obtaining the results is to add causality into Gallager's results [1] on finite state channels. The capacity results are used to show that the source-channel separation theorem holds for time-invariant deterministic feedback, and that if the state of the channel is known both at the encoder and the decoder, then feedback does not increase capacity.
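Directed information can be evaluated exactly for tiny alphabets. The sketch below assumes binary sequences and an explicit joint pmf, purely for illustration (the paper maximizes this quantity over causal conditioning distributions):

```python
import itertools
from collections import defaultdict
from math import log2

def directed_information(p, N):
    """I(X^N -> Y^N) = sum_{n=1..N} I(X^n; Y_n | Y^{n-1}) for a joint
    pmf p mapping (x_tuple, y_tuple) -> probability."""
    def marg(f):
        m = defaultdict(float)
        for (x, y), pr in p.items():
            m[f(x, y)] += pr
        return m
    di = 0.0
    for n in range(1, N + 1):
        pxy = marg(lambda x, y: (x[:n], y[:n]))      # p(x^n, y^n)
        pxp = marg(lambda x, y: (x[:n], y[:n - 1]))  # p(x^n, y^{n-1})
        py = marg(lambda x, y: y[:n])                # p(y^n)
        pyp = marg(lambda x, y: y[:n - 1])           # p(y^{n-1})
        for (x, y), pr in p.items():
            if pr > 0:                               # adds I(X^n; Y_n | Y^{n-1})
                di += pr * log2(pxy[(x[:n], y[:n])] * pyp[y[:n - 1]]
                                / (pxp[(x[:n], y[:n - 1])] * py[y[:n]]))
    return di

# Two uses of a BSC(0.1) with i.i.d. uniform inputs and no feedback.
eps, N = 0.1, 2
p = {}
for x in itertools.product((0, 1), repeat=N):
    for y in itertools.product((0, 1), repeat=N):
        pr = 0.5 ** N
        for xb, yb in zip(x, y):
            pr *= (1 - eps) if xb == yb else eps
        p[(x, y)] = pr
di = directed_information(p, N)
```

For a memoryless channel used without feedback, directed information collapses to ordinary mutual information, so the example returns N·(1 − h(ε)); the two quantities differ precisely when feedback makes the inputs depend on past outputs.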
Fifty Years of Shannon Theory, 1998
Abstract

Cited by 49 (1 self)
A brief chronicle is given of the historical development of the central problems in the theory of fundamental limits of data compression and reliable communication.
The Necessity and Sufficiency of Anytime Capacity for Control over a Noisy Communication Link: Parts I and II
Abstract

Cited by 39 (7 self)
We review how Shannon's classical notion of capacity is not enough to characterize a noisy communication channel if we intend to use that channel as a part of a feedback loop to stabilize an unstable linear system. While classical capacity is not enough, another parametric sense of capacity called "anytime capacity" is shown to be necessary for the stabilization of an unstable process. The rate required is given by the log of the system gain and the sense of reliability required comes from the sense of stability desired. A consequence of this necessity result is a sequential generalization of the Schalkwijk/Kailath scheme for communication over the AWGN channel with feedback. In cases of sufficiently...
Joint source-channel coding error exponent for discrete communication systems with Markovian memory
IEEE Trans. Info. Theory, 2007
Abstract

Cited by 33 (11 self)
Abstract—We investigate the computation of Csiszár's bounds for the joint source–channel coding (JSCC) error exponent E_J of a communication system consisting of a discrete memoryless source and a discrete memoryless channel. We provide equivalent expressions for these bounds and derive explicit formulas for the rates where the bounds are attained. These equivalent representations can be readily computed for arbitrary source–channel pairs via Arimoto's algorithm. When the channel's distribution satisfies a symmetry property, the bounds admit closed-form parametric expressions. We then use our results to provide a systematic comparison between the JSCC error exponent E_J and the tandem coding error exponent E_T, which applies if the source and channel are separately coded. It is shown that E_J ≤ 2E_T. We establish conditions for which E_J > E_T and for which E_J = 2E_T. Numerical examples indicate that E_J is close to 2E_T for many source–channel pairs. This gain translates into a power saving larger than 2 dB for a binary source transmitted over additive white Gaussian noise (AWGN) channels and Rayleigh-fading channels with finite output quantization. Finally, we study the computation of the lossy JSCC error exponent under the Hamming distortion measure. Index Terms—Discrete memoryless sources and channels, error exponent, Fenchel's duality, Hamming distortion measure, joint source–channel coding, random-coding exponent, reliability function, sphere-packing exponent, symmetric channels, tandem source and channel coding.