Results 1–10 of 43
Error Control and Concealment for Video Communication: A Review
 PROCEEDINGS OF THE IEEE
, 1998
Abstract

Cited by 316 (11 self)
The problem of error control and concealment in video communication is becoming increasingly important because of the growing interest in video delivery over unreliable channels such as wireless networks and the Internet. This paper reviews the techniques that have been developed for error control and concealment in the past ten to fifteen years. These techniques are described in three categories according to the roles that the encoder and decoder play in the underlying approaches. Forward error concealment includes methods that add redundancy at the source end to enhance error resilience of the coded bit streams. Error concealment by postprocessing refers to operations at the decoder to recover the damaged areas based on characteristics of image and video signals. Finally, interactive error concealment covers techniques that are dependent on a dialog between the source and destination. Both current research activities and practice in international standards are covered.
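The decoder-side ("error concealment by postprocessing") category described above can be sketched minimally: a lost block is rebuilt by bilinear interpolation from the undamaged pixels bordering it. The function below is an illustrative toy with hypothetical names, not code from the review:

```python
import numpy as np

def conceal_block(frame, top, left, size=8):
    """Conceal a lost size x size block by bilinear interpolation from the
    block's four undamaged boundary rows/columns. Illustrative sketch of
    decoder-side spatial concealment; hypothetical, not the paper's method."""
    above = frame[top - 1, left:left + size]       # intact row above the block
    below = frame[top + size, left:left + size]    # intact row below
    lft = frame[top:top + size, left - 1]          # intact column to the left
    rgt = frame[top:top + size, left + size]       # intact column to the right
    out = np.empty((size, size))
    for i in range(size):
        wv = (i + 1) / (size + 1)                  # vertical interpolation weight
        for j in range(size):
            wh = (j + 1) / (size + 1)              # horizontal interpolation weight
            vert = (1 - wv) * above[j] + wv * below[j]
            horz = (1 - wh) * lft[i] + wh * rgt[i]
            out[i, j] = (vert + horz) / 2          # average the two estimates
    frame[top:top + size, left:left + size] = out
    return frame
```

On smooth image regions the interpolation is nearly exact; practical decoders combine such spatial concealment with temporal replacement from previously decoded frames.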
Fading Channels: Information-Theoretic and Communications Aspects
 IEEE TRANSACTIONS ON INFORMATION THEORY
, 1998
Abstract

Cited by 289 (1 self)
In this paper we review the most peculiar and interesting information-theoretic and communications features of fading channels. We first describe the statistical models of fading channels which are frequently used in the analysis and design of communication systems. Next, we focus on the information theory of fading channels, by emphasizing capacity as the most important performance measure. Both single-user and multiuser transmission are examined. Further, we describe how the structure of fading channels impacts code design, and finally overview equalization of fading multipath channels.
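The ergodic-capacity notion emphasized in this survey is easy to check numerically. A Monte Carlo sketch for flat Rayleigh fading with channel state known at the receiver (parameters and names are illustrative assumptions, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def ergodic_capacity(snr_db, n=200_000):
    """Monte Carlo estimate of the ergodic capacity (bits/channel use) of a
    flat Rayleigh-fading channel with receiver CSI:
    C = E[log2(1 + SNR * |h|^2)], where |h|^2 ~ Exp(1).
    Illustrative sketch, not from the surveyed paper."""
    snr = 10 ** (snr_db / 10)
    gain = rng.exponential(1.0, size=n)   # |h|^2 samples for Rayleigh fading
    return np.mean(np.log2(1 + snr * gain))

def awgn_capacity(snr_db):
    """Unfaded AWGN capacity at the same average SNR, for comparison."""
    snr = 10 ** (snr_db / 10)
    return np.log2(1 + snr)
```

By Jensen's inequality the fading value is strictly smaller than log2(1 + SNR) at the same average SNR, one of the basic information-theoretic features the survey discusses.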
Systems with finite communication bandwidth constraints—I: State estimation problems
 Stanford University, Stanford, CA
, 1997
Abstract

Cited by 114 (1 self)
Abstract—In this paper a new class of feedback control problems is introduced. Unlike classical models, the systems considered here have communication channel constraints. As a result, the issue of coding and communication protocol becomes an integral part of the analysis. Since these systems cannot be asymptotically stabilized if the underlying dynamics are unstable, a weaker stability concept called containability is introduced. A key result connects containability with an inequality involving the communication data rate and the rate of change of the state. Index Terms—Asymptotic stability, containability, feedback control, Kraft inequality.
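The data-rate flavor of the containability result can be illustrated with a toy scalar simulation: a plant with gain a can be kept in a bounded set through an R-bit-per-step channel only if R exceeds log2|a|. The sketch below is hypothetical (static uniform quantizer, deadbeat control), not the paper's construction:

```python
import numpy as np

def simulate(a, bits, steps=60, x0=1.0, box=1.0):
    """Scalar plant x_{k+1} = a*x_k + u_k, with the state sent to the
    controller through a `bits`-bit uniform quantizer over [-box, box].
    Toy illustration of the data-rate idea: the state stays bounded
    only if bits > log2|a|. Hypothetical setup, not the paper's scheme."""
    levels = 2 ** bits
    x = x0
    for _ in range(steps):
        # quantize the state to one of `levels` cells; send the cell center
        idx = np.clip(np.floor((x + box) / (2 * box) * levels), 0, levels - 1)
        xhat = -box + (2 * box) * (idx + 0.5) / levels
        u = -a * xhat              # deadbeat control on the quantized estimate
        x = a * x + u              # = a*(x - xhat): error amplified by a,
        #                            but shrunk by a factor 2**bits per step
    return abs(x)
```

With a = 2.5 (log2 a ≈ 1.32), one feedback bit per step is insufficient and the state diverges, while two bits keep it contained in [-1, 1], matching the rate inequality qualitatively.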
To Code, or Not to Code: Lossy Source-Channel Communication Revisited
 IEEE TRANS. INFORM. THEORY
, 2003
Abstract

Cited by 88 (5 self)
What makes a source-channel communication system optimal? It is shown that in order to achieve an optimal cost-distortion tradeoff, the source and the channel have to be matched in a probabilistic sense. The match (or lack of it) involves the source distribution, the distortion measure, the channel conditional distribution, and the channel input cost function. Closed-form necessary and sufficient expressions relating the above entities are given. This generalizes both the separation-based approach as well as the two well-known examples of optimal uncoded communication. The condition of...
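One of the "well-known examples of optimal uncoded communication" alluded to above is a Gaussian source sent uncoded over an AWGN channel, where simple scaling plus MMSE estimation attains the Shannon-optimal distortion D = 1/(1 + SNR). A Monte Carlo sketch under assumed parameters (not the paper's derivation):

```python
import numpy as np

rng = np.random.default_rng(1)

def uncoded_gaussian_awgn(snr, n=500_000):
    """Uncoded transmission of a unit-variance Gaussian source over an AWGN
    channel: scale each sample to the channel power, receive Y = X + Z, and
    apply the MMSE estimate. The resulting distortion matches the Shannon
    optimum 1/(1 + SNR), illustrating probabilistic matching. Sketch only."""
    s = rng.normal(size=n)                 # source samples, variance 1
    x = np.sqrt(snr) * s                   # channel input, power = snr
    y = x + rng.normal(size=n)             # unit-variance channel noise
    shat = np.sqrt(snr) / (1 + snr) * y    # linear MMSE estimate of s from y
    return np.mean((s - shat) ** 2)        # empirical mean-squared distortion
```

At SNR = 9 the measured distortion is close to 1/(1 + 9) = 0.1, with no coding at all: source, distortion measure, channel, and cost function are matched in exactly the probabilistic sense the abstract describes.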
Stochastic Linear Control over a Communication Channel
, 2003
Abstract

Cited by 52 (8 self)
We examine linear stochastic control systems when there is a communication channel connecting the sensor to the controller. The problem consists of designing the channel encoder and decoder as well as the controller to satisfy some given control objectives. In particular we examine the role communication has on the classical LQG problem. We give conditions under which the classical separation property between estimation and control holds and the certainty equivalent control law is optimal. We then present the sequential rate distortion framework. We present bounds on the achievable performance and show the inherent tradeoffs between control and communication costs. In particular we show that optimal quadratic cost decomposes into two terms: a full knowledge cost and a sequential rate distortion cost.
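The two-term cost decomposition can be probed with a toy scalar example: under certainty-equivalent control, the average quadratic cost with quantized state feedback exceeds the full-knowledge cost by roughly a^2 times the estimation-error variance. The setup below is entirely hypothetical, not the paper's encoder/decoder design:

```python
import numpy as np

rng = np.random.default_rng(2)

def avg_cost(a=0.9, bits=None, steps=200_000, box=4.0):
    """Average stage cost E[x^2] for x_{k+1} = a*x_k + u_k + w_k under the
    certainty-equivalent law u_k = -a * xhat_k, where xhat_k is the state
    after an optional `bits`-bit uniform quantizer over [-box, box].
    Toy sketch of the decomposition: quantized cost is about the
    full-knowledge cost plus a^2 * E[(x - xhat)^2]. Hypothetical setup."""
    x, cost = 0.0, 0.0
    for _ in range(steps):
        if bits is None:
            xhat = x                          # controller sees the full state
        else:
            levels = 2 ** bits
            idx = min(max(int((x + box) / (2 * box) * levels), 0), levels - 1)
            xhat = -box + (2 * box) * (idx + 0.5) / levels
        u = -a * xhat                         # certainty-equivalent control
        cost += x * x
        x = a * x + u + rng.normal()          # unit-variance process noise
    return cost / steps
```

With full state knowledge the control cancels the dynamics exactly and the cost settles at the noise variance; a coarse quantizer adds a strictly positive communication-induced term, the sequential-rate-distortion-like penalty.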
The Necessity and Sufficiency of Anytime Capacity for Control over a Noisy Communication Link: Parts I and II
Abstract

Cited by 24 (6 self)
We review how Shannon's classical notion of capacity is not enough to characterize a noisy communication channel if we intend to use that channel as a part of a feedback loop to stabilize an unstable linear system. While classical capacity is not enough, another parametric sense of capacity called "anytime capacity" is shown to be necessary for the stabilization of an unstable process. The rate required is given by the log of the system gain and the sense of reliability required comes from the sense of stability desired. A consequence of this necessity result is a sequential generalization of the Schalkwijk/Kailath scheme for communication over the AWGN channel with feedback. In cases of sufficiently...
Finite State Channels with Time-Invariant Deterministic Feedback
Abstract

Cited by 23 (13 self)
We consider capacity of discrete-time channels with feedback for the general case where the feedback is a time-invariant deterministic function of the output samples. Under the assumption that the channel states take values in a finite alphabet, we find a sequence of achievable rates and a sequence of upper bounds on the capacity. The achievable rates and the upper bounds are computable for any N, and the limits of the sequences exist. We show that when the probability of the initial state is positive for all the channel states, then the capacity is the limit of the achievable-rate sequence. We further show that when the channel is stationary, indecomposable and has no intersymbol interference (ISI), its capacity is given by the limit of the maximum of the (normalized) directed information between the input X^N and the output Y^N, i.e., C = lim_{N→∞} (1/N) max I(X^N → Y^N), where the maximization is taken over the causal conditioning probability Q(x^N ‖ z^{N−1}) defined in this paper. The main idea for obtaining the results is to add causality into Gallager's results [1] on finite state channels. The capacity results are used to show that the source-channel separation theorem holds for time-invariant deterministic feedback, and that if the state of the channel is known both at the encoder and the decoder, then feedback does not increase capacity.
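The directed-information quantity in the capacity formula can be sanity-checked on a toy case: for a memoryless channel used without feedback, I(X^N → Y^N) collapses to N·I(X;Y). A brute-force sketch with hypothetical helper names (binary symmetric channel, N = 2):

```python
import itertools
import math

def bsc_joint(p_flip=0.1, p_one=0.5, n=2):
    """Joint pmf p(x^n, y^n) for n memoryless uses of a binary symmetric
    channel with i.i.d. inputs and no feedback. Toy setting only."""
    pmf = {}
    for xs in itertools.product((0, 1), repeat=n):
        for ys in itertools.product((0, 1), repeat=n):
            p = 1.0
            for x, y in zip(xs, ys):
                p *= p_one if x == 1 else 1 - p_one       # input law
                p *= p_flip if x != y else 1 - p_flip     # channel law
            pmf[(xs, ys)] = p
    return pmf

def entropy(pmf, key):
    """Entropy (bits) of the marginal obtained by grouping atoms under key."""
    marg = {}
    for (xs, ys), p in pmf.items():
        k = key(xs, ys)
        marg[k] = marg.get(k, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

def directed_information(pmf, n):
    """I(X^n -> Y^n) = sum_k [H(Y_k | Y^{k-1}) - H(Y_k | X^k, Y^{k-1})],
    each conditional entropy computed as H(A,B) - H(B) by marginalization."""
    di = 0.0
    for k in range(1, n + 1):
        h_out = (entropy(pmf, lambda xs, ys: ys[:k])
                 - entropy(pmf, lambda xs, ys: ys[:k - 1]))
        h_cond = (entropy(pmf, lambda xs, ys: (xs[:k], ys[:k]))
                  - entropy(pmf, lambda xs, ys: (xs[:k], ys[:k - 1])))
        di += h_out - h_cond
    return di
```

For a BSC with crossover 0.1 and uniform inputs, the computed value equals 2·(1 − h(0.1)) bits, agreeing with the no-feedback special case in which directed information reduces to ordinary mutual information.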
Joint source-channel coding error exponent for discrete communication systems with Markovian memory
 IEEE Trans. Info. Theory
, 2007
Abstract

Cited by 23 (9 self)
Abstract—We investigate the computation of Csiszár's bounds for the joint source–channel coding (JSCC) error exponent E_J of a communication system consisting of a discrete memoryless source and a discrete memoryless channel. We provide equivalent expressions for these bounds and derive explicit formulas for the rates where the bounds are attained. These equivalent representations can be readily computed for arbitrary source–channel pairs via Arimoto's algorithm. When the channel's distribution satisfies a symmetry property, the bounds admit closed-form parametric expressions. We then use our results to provide a systematic comparison between the JSCC error exponent E_J and the tandem coding error exponent E_T, which applies if the source and channel are separately coded. It is shown that E_J ≤ 2E_T. We establish conditions for which E_J > E_T and conditions for which E_J = 2E_T. Numerical examples indicate that E_J is close to 2E_T for many source–channel pairs. This gain translates into a power saving larger than 2 dB for a binary source transmitted over additive white Gaussian noise (AWGN) channels and Rayleigh-fading channels with finite output quantization. Finally, we study the computation of the lossy JSCC error exponent under the Hamming distortion measure. Index Terms—Discrete memoryless sources and channels, error exponent, Fenchel's duality, Hamming distortion measure, joint source–channel coding, random-coding exponent, reliability function, sphere-packing exponent, symmetric channels, tandem source and channel coding.
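The random-coding exponent listed in the index terms is one of the building blocks behind such comparisons. A minimal sketch of Gallager's E_0 function and the channel random-coding exponent E_r(R) for a binary symmetric channel, by simple grid search (illustrative only, not the paper's Arimoto or Fenchel-duality computation):

```python
import math

def gallager_E0(rho, p_flip, p_one=0.5):
    """Gallager's E_0(rho) = -log2 sum_y [sum_x Q(x) W(y|x)^{1/(1+rho)}]^{1+rho}
    for a BSC with input distribution (1 - p_one, p_one). Illustrative sketch."""
    s = 1.0 / (1.0 + rho)
    q = [1 - p_one, p_one]                           # input distribution Q(x)
    w = [[1 - p_flip, p_flip], [p_flip, 1 - p_flip]]  # channel law w[x][y]
    total = 0.0
    for y in (0, 1):
        inner = sum(q[x] * w[x][y] ** s for x in (0, 1))
        total += inner ** (1.0 + rho)
    return -math.log2(total)

def random_coding_exponent(R, p_flip, grid=10_001):
    """E_r(R) = max over 0 <= rho <= 1 of [E_0(rho) - rho*R], via grid search."""
    return max(gallager_E0(i / (grid - 1), p_flip) - (i / (grid - 1)) * R
               for i in range(grid))
```

E_r(R) is positive for rates below capacity (1 − h(p_flip) for the BSC) and zero above it; exponent comparisons like E_J versus 2E_T in the abstract are built from exactly such exponent functions.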
Source coding and channel requirements for unstable processes
 IEEE Trans. Inf. Theory, Submitted, 2006. [Online]. Available: http://www.eecs.berkeley.edu/˜sahai/Papers/anytime.pdf
Abstract

Cited by 14 (10 self)
Our understanding of information in systems has been based on the foundation of memoryless processes. Extensions to stable Markov and autoregressive processes are classical. Berger proved a source coding theorem for the marginally unstable Wiener process, but the infinite-horizon exponentially unstable case has been open since Gray's 1970 paper. There were also no theorems showing what is needed to communicate such processes across noisy channels. In this work, we give a fixed-rate source-coding theorem for the infinite-horizon problem of coding an exponentially unstable Markov process. The encoding naturally results in two distinct bitstreams that have qualitatively different QoS requirements for communicating over a noisy medium. The first stream captures the information that is accumulating within the nonstationary process and requires sufficient anytime reliability from the channel used to communicate the process. The second stream captures the historical information that dissipates within the process and is essentially classical. This historical information can also be identified with a natural stable counterpart to the unstable process. A converse demonstrating the fundamentally layered nature of unstable sources is given by means of information-embedding ideas.