Results 1–10 of 81
DETERMINANT MAXIMIZATION WITH LINEAR MATRIX INEQUALITY CONSTRAINTS
Abstract

Cited by 230 (18 self)
The problem of maximizing the determinant of a matrix subject to linear matrix inequalities arises in many fields, including computational geometry, statistics, system identification, experiment design, and information and communication theory. It can also be considered as a generalization of the semidefinite programming problem. We give an overview of the applications of the determinant maximization problem, pointing out simple cases where specialized algorithms or analytical solutions are known. We then describe an interior-point method, with a simplified analysis of the worst-case complexity and numerical results that indicate that the method is very efficient, both in theory and in practice. Compared to existing specialized algorithms (where they are available), the interior-point method will generally be slower; the advantage is that it handles a much wider variety of problems.
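Where the abstract mentions simple cases with analytical solutions, the diagonal case is the easiest to see concretely: maximizing det(diag(x)) under a trace budget is solved by equal allocation, by the AM–GM inequality. A minimal stdlib sketch of that toy case (all parameter values are illustrative; the general LMI-constrained problem needs a semidefinite solver, which this does not implement):

```python
# Toy special case of determinant maximization: maximize
# det(diag(x_1, ..., x_n)) subject to sum(x_i) = t, x_i >= 0.
# By AM-GM the optimum is x_i = t/n, giving det = (t/n)^n.
# We sanity-check against random feasible points.
import math
import random

def diag_det(x):
    """Determinant of diag(x): the product of the diagonal entries."""
    return math.prod(x)

def random_feasible(n, t, rng):
    """Random nonnegative vector summing to t."""
    w = [rng.random() for _ in range(n)]
    s = sum(w)
    return [t * wi / s for wi in w]

n, t = 4, 8.0  # illustrative sizes
rng = random.Random(0)
best_random = max(diag_det(random_feasible(n, t, rng)) for _ in range(10_000))
optimum = (t / n) ** n  # equal allocation x_i = t/n

assert best_random <= optimum + 1e-9
```

No random feasible point beats the equal-allocation value, consistent with the analytical solution for this special case.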
The capacity of channels with feedback
 IEEE Trans. Information Theory
, 2009
Abstract

Cited by 97 (4 self)
We introduce a general framework for treating channels with memory and feedback. First, we generalize Massey’s concept of directed information [23] and use it to characterize the feedback capacity of general channels. Second, we present coding results for Markov channels. This requires determining appropriate sufficient statistics at the encoder and decoder. Third, a dynamic programming framework for computing the capacity of Markov channels is presented. Fourth, it is shown that the average cost optimality equation (ACOE) can be viewed as an implicit single-letter characterization of the capacity. Fifth, scenarios ...
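As a concrete illustration of the directed-information quantity this abstract builds on, the following brute-force sketch computes I(X^N → Y^N) = Σ_i I(X^i; Y_i | Y^{i−1}) directly from a joint pmf. The channel here (i.i.d. uniform inputs through a memoryless BSC with crossover 0.1, N = 2) is a hypothetical stand-in chosen so the answer is checkable: without feedback over a memoryless channel, directed information reduces to ordinary mutual information, 2·(1 − H(0.1)) bits.

```python
# Brute-force directed information I(X^N -> Y^N) from a joint pmf.
import itertools
import math

N = 2  # block length, kept tiny so exhaustive summation is exact

def joint(x, y):
    """Hypothetical pmf: i.i.d. uniform inputs through a memoryless BSC(0.1)."""
    p = 1.0
    for xi, yi in zip(x, y):
        p *= 0.5 * (0.9 if xi == yi else 0.1)
    return p

def marginal(fixed):
    """Sum the joint pmf over all coordinates not pinned in `fixed`,
    a dict mapping ('x'|'y', index) -> value."""
    total = 0.0
    for x in itertools.product((0, 1), repeat=N):
        for y in itertools.product((0, 1), repeat=N):
            if all((x[i] if s == 'x' else y[i]) == v
                   for (s, i), v in fixed.items()):
                total += joint(x, y)
    return total

def directed_information():
    """I(X^N -> Y^N) = sum_i E[log p(Y_i|X^i, Y^{i-1}) / p(Y_i|Y^{i-1})]."""
    di = 0.0
    for x in itertools.product((0, 1), repeat=N):
        for y in itertools.product((0, 1), repeat=N):
            p = joint(x, y)
            if p == 0.0:
                continue
            for i in range(N):
                xs = {('x', j): x[j] for j in range(i + 1)}  # pin x^i
                ys = {('y', j): y[j] for j in range(i)}      # pin y^{i-1}
                num = marginal({**xs, **ys, ('y', i): y[i]}) / marginal({**xs, **ys})
                den = marginal({**ys, ('y', i): y[i]}) / marginal(ys)
                di += p * math.log2(num / den)
    return di

di = directed_information()
```

For this feedback-free memoryless example, `di` matches the mutual information I(X^2; Y^2); the quantity only departs from mutual information once feedback or memory enters.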
Finite State Channels with Time-Invariant Deterministic Feedback
Abstract

Cited by 51 (26 self)
We consider the capacity of discrete-time channels with feedback for the general case where the feedback is a time-invariant deterministic function of the output samples. Under the assumption that the channel states take values in a finite alphabet, we find a sequence of achievable rates and a sequence of upper bounds on the capacity. The achievable rates and the upper bounds are computable for any N, and the limits of the sequences exist. We show that when the probability of the initial state is positive for all the channel states, then the capacity is the limit of the achievable-rate sequence. We further show that when the channel is stationary, indecomposable and has no intersymbol interference (ISI), its capacity is given by the limit of the maximum of the (normalized) directed information between the input X^N and the output Y^N, i.e., C = lim_{N→∞} (1/N) max I(X^N → Y^N), where the maximization is taken over the causal conditioning probability Q(x^N ∥ z^{N−1}) defined in this paper. The main idea for obtaining the results is to add causality into Gallager’s results [1] on finite state channels. The capacity results are used to show that the source-channel separation theorem holds for time-invariant deterministic feedback, and that if the state of the channel is known both at the encoder and the decoder, then feedback does not increase capacity.
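The causal conditioning probability in the capacity formula factors the input distribution so that each input symbol may depend only on past inputs and past feedback, Q(x^N ∥ z^{N−1}) = Π_i p(x_i | x^{i−1}, z^{i−1}). A minimal sketch of that factorization (the per-step policy below is a hypothetical stand-in, not one from the paper):

```python
# Causal conditioning Q(x^N || z^{N-1}) = prod_i p(x_i | x^{i-1}, z^{i-1})
# for binary sequences, given a per-step feedback-dependent input policy.

def causal_conditioning(x, z, policy):
    """policy(i, x_past, z_past) -> P(x_i = 1 | x^{i-1}, z^{i-1})."""
    q = 1.0
    for i in range(len(x)):
        p1 = policy(i, x[:i], z[:i])     # only *past* feedback is visible
        q *= p1 if x[i] == 1 else 1.0 - p1
    return q

def policy(i, x_past, z_past):
    """Hypothetical illustrative policy: lean toward repeating the last
    feedback symbol with probability 0.8; uniform on the first use."""
    if not z_past:
        return 0.5
    return 0.8 if z_past[-1] == 1 else 0.2

q = causal_conditioning([1, 1, 0], [0, 1, 1], policy)  # 0.5 * 0.2 * 0.2
```

For each fixed feedback sequence z, Q(· ∥ z^{N−1}) sums to one over all input sequences, since every per-step factor is a conditional pmf.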
Fifty Years of Shannon Theory
, 1998
Abstract

Cited by 49 (1 self)
A brief chronicle is given of the historical development of the central problems in the theory of fundamental limits of data compression and reliable communication.
Feedback capacity of stationary Gaussian channels
Abstract

Cited by 48 (11 self)
The capacity of stationary additive Gaussian noise channels with feedback is characterized as the solution to a variational problem. Toward this end, it is proved that the optimal feedback coding scheme is stationary. When specialized to the first-order autoregressive moving-average noise spectrum, this variational characterization yields a closed-form expression for the feedback capacity. In particular, this result shows that the celebrated Schalkwijk–Kailath coding scheme achieves the feedback capacity for the first-order autoregressive moving-average Gaussian channel, resolving a long-standing open problem studied by Butman, Schalkwijk–Tiernan, Wolfowitz, Ozarow, Ordentlich, Yang–Kavčić–Tatikonda, and others.
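The core idea of the Schalkwijk–Kailath scheme the abstract refers to can be seen in its classical white-noise form (a simulation sketch only, not the paper's ARMA derivation): the transmitter repeatedly sends the receiver's current estimation error scaled to the power constraint, and the error variance shrinks geometrically by a factor σ²/(P + σ²) per channel use. All parameter values below are illustrative.

```python
# Schalkwijk-Kailath-style feedback over a white Gaussian noise channel.
import random

def sk_trial(theta, n, P, sigma2, rng):
    """One feedback session; returns the receiver's final estimate of theta."""
    est, alpha = 0.0, 1.0                  # estimate and its error variance
    for _ in range(n):
        x = (P / alpha) ** 0.5 * (theta - est)   # error scaled to power P
        y = x + rng.gauss(0.0, sigma2 ** 0.5)    # channel output
        est += (P * alpha) ** 0.5 / (P + sigma2) * y  # MMSE update
        alpha *= sigma2 / (P + sigma2)           # error variance recursion
    return est

rng = random.Random(1)
P, sigma2, n, trials = 1.0, 1.0, 10, 20_000
errs = []
for _ in range(trials):
    theta = rng.gauss(0.0, 1.0)            # unit-variance message point
    errs.append((theta - sk_trial(theta, n, P, sigma2, rng)) ** 2)
mse = sum(errs) / trials
predicted = (sigma2 / (P + sigma2)) ** n   # geometric decay, 2^-10 here

assert 0.5 * predicted < mse < 2.0 * predicted
```

The geometric error decay is what makes the scheme's error probability fall doubly exponentially in the block length for rates below capacity.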
Writing on colored paper
 in Proc. of ISIT
, 2001
Abstract

Cited by 36 (5 self)
A Gaussian channel, when corrupted by an additive Gaussian interfering signal whose complete sample sequence is known noncausally to the transmitter but not to the receiver, has the same capacity as if the interfering signal were not present. This is true even when the noise and interference are not necessarily stationary or ergodic.
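The capacity result above is achieved by random binning (Costa-style coding). A much simpler flavor of "presubtracting known interference without spending power on it" is scalar Tomlinson–Harashima-style modulo precoding: the transmitter reduces (u − s) modulo an interval, so its power stays bounded no matter how large the interference s is, yet the receiver's modulo operation removes s exactly. This sketch is illustrative only and is not capacity-achieving.

```python
# Scalar modulo precoding against transmitter-known interference.
import random

DELTA = 4.0  # modulo interval width (illustrative choice)

def centered_mod(v):
    """Reduce v into [-DELTA/2, DELTA/2)."""
    return (v + DELTA / 2) % DELTA - DELTA / 2

rng = random.Random(2)
trials, ok = 1000, 0
for _ in range(trials):
    u = rng.choice([-1.0, 1.0])        # data symbol
    s = rng.uniform(-100.0, 100.0)     # interference known to transmitter only
    x = centered_mod(u - s)            # bounded transmit signal, |x| < DELTA/2
    z = rng.gauss(0.0, 0.05)           # mild channel noise
    y = x + s + z                      # channel adds interference and noise
    u_hat = centered_mod(y)            # mod folds s away: u_hat = u + z here
    ok += (abs(u_hat - u) < 0.5)

assert ok == trials
```

Because x + s ≡ u (mod DELTA), the receiver's modulo yields u + z regardless of s, at the cost of a small modulo-folding penalty that Costa's random-binning argument avoids entirely.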
A coding theorem for a class of stationary channels with feedback
 IEEE Trans. Inf. Theory
The feedback capacity of the first-order moving average Gaussian channel
 Accepted by IEEE Trans. Inform. Theory
, 2006
Abstract

Cited by 25 (2 self)
Abstract—Despite numerous bounds and partial results, the feedback capacity of the stationary nonwhite additive Gaussian noise channel has been open, even for the simplest cases such as the first-order autoregressive Gaussian channel studied by Butman, Tiernan and Schalkwijk, Wolfowitz, Ozarow, and more recently, Yang, Kavčić, and Tatikonda. Here we consider another simple special case, the stationary first-order moving average additive Gaussian noise channel, and find the feedback capacity in closed form. Specifically, the channel is given by Y_i = X_i + Z_i, i = 1, 2, ..., where the input X_i satisfies a power constraint and the noise Z_i is a first-order moving average Gaussian process defined by Z_i = U_i + αU_{i−1} with white Gaussian innovations U_i, i = 0, 1, .... We wish to communicate a message index reliably over the channel, with the channel output causally fed back to the transmitter. We specify a code whose codewords satisfy the expected power constraint, together with a decoding function; the probability of error is defined accordingly. We show that the feedback capacity of this channel is C_FB = −log x_0, where x_0 is the unique positive root of the equation ...
Fountain Capacity
, 2006
Abstract

Cited by 16 (1 self)
Fountain codes have been successfully employed for reliable and efficient transmission of information via erasure channels with unknown erasure rates. This paper introduces the notion of fountain capacity for arbitrary channels, and shows that it is equal to the conventional Shannon capacity for stationary memoryless channels. In contrast, when the channel is not stationary or has memory, Shannon capacity and fountain capacity need not be equal.
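The fountain principle the abstract describes can be sketched with the simplest rateless construction, a random linear fountain over GF(2): the encoder emits an endless stream of random XOR combinations of the k source bits, and the decoder solves whichever equations survive the erasures by Gaussian elimination. Decoding succeeds once the received equations reach rank k, regardless of the (unknown) erasure rate. This is an illustrative sketch, not an optimized LT/Raptor design.

```python
# Random linear fountain over GF(2) through a memoryless erasure channel.
import random

def encode_symbol(bits, rng):
    """One fountain symbol: a random nonempty subset mask and its XOR."""
    mask = rng.getrandbits(len(bits)) or 1   # avoid the empty subset
    val = 0
    for i, b in enumerate(bits):
        if (mask >> i) & 1:
            val ^= b
    return mask, val

def gf2_solve(eqs, k):
    """Gaussian elimination over GF(2); eqs is a list of (mask, value).
    Returns the k decoded bits, or None if the rank is still below k."""
    pivots = {}
    for m, v in eqs:
        for col in range(k):
            if not (m >> col) & 1:
                continue
            if col in pivots:
                pm, pv = pivots[col]
                m ^= pm
                v ^= pv
            else:
                pivots[col] = (m, v)
                break
    if len(pivots) < k:
        return None
    bits = [0] * k
    for col in sorted(pivots, reverse=True):   # back-substitution
        m, v = pivots[col]
        for j in range(col + 1, k):
            if (m >> j) & 1:
                v ^= bits[j]
        bits[col] = v
    return bits

rng = random.Random(3)
k = 16
source = [rng.getrandbits(1) for _ in range(k)]
received, decoded = [], None
while decoded is None:
    sym = encode_symbol(source, rng)
    if rng.random() < 0.3:       # erasure; rate is unknown to the encoder
        continue                 # the encoder just keeps emitting symbols
    received.append(sym)
    decoded = gf2_solve(received, k)

assert decoded == source
```

On average only a few symbols beyond k are needed before the random equations reach full rank, which is why such codes adapt automatically to any erasure rate.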
An upper bound to the capacity of discrete time Gaussian channel with feedback
 Department of Applied Science, Faculty of Engineering Yamaguchi University, Ube 755, Japan
, 1994
Abstract

Cited by 14 (9 self)
The following model for the discrete time Gaussian channel with feedback is considered: Yn = Sn + Zn, n = 1, 2, ..., where Z = {Zn; n = 1, 2, ...} is a nondegenerate, zero mean Gaussian process ...