Results 1-10 of 44
The Secrecy Capacity Region of the Gaussian MIMO Multi-Receiver Wiretap Channel
, 2009
Cited by 32 (18 self)
In this paper, we consider the Gaussian multiple-input multiple-output (MIMO) multi-receiver wiretap channel in which a transmitter wants to have confidential communication with an arbitrary number of users in the presence of an external eavesdropper. We derive the secrecy capacity region of this channel for the most general case. We first show that even for the single-input single-output (SISO) case, existing converse techniques for the Gaussian scalar broadcast channel cannot be extended to this secrecy context, which emphasizes the need for a new proof technique. Our new proof technique makes use of the relationships between the minimum mean-square error and the mutual information, and equivalently, the relationships between the Fisher information and the differential entropy. Using the intuition gained from the converse proof of the SISO channel, we first prove the secrecy capacity region of the degraded MIMO channel, in which all receivers have the same number of antennas and the noise covariance matrices can be arranged according to a positive semidefinite order. We then generalize this result to the aligned case, in which all receivers have the same number of antennas but there is no order among the noise covariance matrices. We accomplish this task by using the channel enhancement technique. Finally, we find the secrecy capacity region of the general MIMO channel by using limiting arguments on the secrecy capacity region of the aligned MIMO channel. We show that the capacity-achieving coding scheme is a variant of dirty-paper coding with Gaussian signals.
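The two relationships this abstract invokes are, in their standard scalar forms (the I-MMSE identity of Guo, Shamai, and Verdú, and de Bruijn's identity; the notation below is mine, not the paper's):

```latex
% I-MMSE: the derivative of mutual information w.r.t. SNR is half the MMSE
\frac{d}{d\gamma}\, I\bigl(X;\sqrt{\gamma}\,X+N\bigr) \;=\; \frac{1}{2}\,\mathrm{mmse}(X,\gamma),
\qquad N \sim \mathcal{N}(0,1)

% de Bruijn: the derivative of differential entropy along a Gaussian
% perturbation is half the Fisher information
\frac{d}{dt}\, h\bigl(X+\sqrt{t}\,N\bigr) \;=\; \frac{1}{2}\, J\bigl(X+\sqrt{t}\,N\bigr)
```

The two identities are equivalent up to a change of variables, which is why the abstract can phrase the technique either in MMSE or in Fisher-information terms.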
Information theoretic proofs of entropy power inequalities, arXiv:0704.1751v1 [cs.IT]
, 2007
Cited by 14 (2 self)
Abstract—While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, up to now Shannon's entropy power inequality (EPI) is an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai, and Verdú used in earlier proofs. The result has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power. Index Terms—Data processing inequality, de Bruijn's identity, differential entropy, divergence, entropy power inequality (EPI).
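For reference, the EPI discussed here states that for independent random vectors $X$ and $Y$ in $\mathbb{R}^n$ with well-defined differential entropies:

```latex
e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n},
```

with equality if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.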
MIMO radar waveform design based on mutual information and minimum mean-square error estimation
 IEEE Transactions on Aerospace and Electronic Systems
, 2007
Cited by 14 (3 self)
Abstract—This paper addresses the problem of radar waveform design for target identification and classification. Both the ordinary radar with a single transmitter and receiver and the recently proposed multiple-input multiple-output (MIMO) radar are considered. A random target impulse response is used to model the scattering characteristics of the extended (non-point) target, and two radar waveform design problems with constraints on waveform power are investigated. The first is to design waveforms that maximize the conditional mutual information (MI) between the random target impulse response and the reflected waveforms, given knowledge of the transmitted waveforms. The second is to find transmitted waveforms that minimize the mean-square error (MSE) in estimating the target impulse response. Our analysis indicates that, under the same total power constraint, these two criteria lead to the same solution for a matrix which specifies the essential part of the optimum waveform design. The solution employs water-filling to allocate the limited power appropriately. We also present an asymptotic formulation which requires less knowledge of the statistical model of the target. Index Terms—Multiple-input multiple-output (MIMO) radar, radar waveform design, identification, classification, extended radar targets, mutual information (MI), minimum mean-square error (MMSE), waveform diversity.
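Water-filling here is the standard power-allocation scheme; a minimal generic sketch (my own illustration with a bisection search on the water level, not the paper's exact matrix design) looks like this:

```python
import numpy as np

def water_filling(gains, total_power, tol=1e-12):
    """Allocate total_power across parallel channels with power gains g_i,
    so that p_i = max(0, mu - 1/g_i) and sum(p_i) = total_power.
    (Noise variance is folded into each gain.)"""
    inv = 1.0 / np.asarray(gains, dtype=float)  # "floor heights" 1/g_i
    lo, hi = inv.min(), inv.max() + total_power  # bracket the water level mu
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        # Too much water poured in -> lower the level; too little -> raise it.
        if np.maximum(mu - inv, 0.0).sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - inv, 0.0)

# Strong channels sit on low floors and receive more power; very weak
# channels stay above the water level and receive none.
p = water_filling([2.0, 1.0, 0.1], total_power=1.0)
```

With gains [2.0, 1.0, 0.1] and unit total power, the water level settles at 1.25, giving allocations [0.75, 0.25, 0]: the weakest channel is switched off entirely.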
Proof of entropy power inequalities via MMSE
 in Proceedings of the IEEE International Symposium on Information Theory
, 2006
Cited by 11 (1 self)
Abstract—The differential entropy of a random variable (or vector) can be expressed as the integral over signal-to-noise ratio (SNR) of the minimum mean-square error (MMSE) of estimating the variable (or vector) when observed in additive Gaussian noise. This representation sidesteps Fisher's information to provide simple and insightful proofs for Shannon's entropy power inequality (EPI) and two of its variations: Costa's strengthened EPI in the case in which one of the variables is Gaussian, and a generalized EPI for linear transformations of a random vector due to Zamir and Feder.
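The representation rests on the I-MMSE identity, which equates mutual information with half the integral of the MMSE over SNR. For a unit-variance Gaussian input both sides have closed forms (standard facts, not spelled out in this abstract), so the identity can be checked numerically:

```python
import numpy as np

# For a unit-variance Gaussian input X with Y = sqrt(g)*X + N, N ~ N(0,1):
#   mmse(g) = 1/(1+g)   and   I(snr) = 0.5 * log(1 + snr).
# The I-MMSE identity says I(snr) = 0.5 * integral_0^snr mmse(g) dg.
snr = 4.0
g = np.linspace(0.0, snr, 100_001)
mmse = 1.0 / (1.0 + g)

# Trapezoidal rule for the integral of mmse over [0, snr].
integral = 0.5 * np.sum((mmse[1:] + mmse[:-1]) * np.diff(g))
i_mmse = 0.5 * integral
closed_form = 0.5 * np.log(1.0 + snr)
```

The two values agree to numerical precision, which is exactly the relationship the entropy representation integrates over all SNR.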
Degraded Compound Multi-Receiver Wiretap Channels
, 2009
Cited by 11 (9 self)
In this paper, we study the degraded compound multi-receiver wiretap channel, which consists of two groups of users and a group of eavesdroppers such that, if we pick an arbitrary user from each group of users and an arbitrary eavesdropper, they satisfy a certain Markov chain. We study two different communication scenarios for this channel. In the first scenario, the transmitter wants to send a confidential message to users in the first (stronger) group and a different confidential message to users in the second (weaker) group, where both messages need to be kept confidential from the eavesdroppers. For this scenario, we assume that there is only one eavesdropper. We obtain the secrecy capacity region for the general discrete memoryless channel model, the parallel channel model, and the Gaussian parallel channel model. For the Gaussian multiple-input multiple-output (MIMO) channel model, we obtain the secrecy capacity region when there is only one user in the second group. In the second scenario, the transmitter sends a confidential message to users in the first group which needs to be kept confidential from the second group of users and the eavesdroppers. Furthermore, the transmitter sends a different confidential message to users in the second group which needs to be kept confidential only from the eavesdroppers. For this scenario, we do not put any restriction on the number of eavesdroppers. As in the first scenario, we obtain the secrecy capacity region for the general discrete memoryless channel model, the parallel channel model, and the Gaussian parallel channel model. For the Gaussian MIMO channel model, we establish the secrecy capacity region when there is only one user in the second group.
Hessian and concavity of mutual information, differential entropy, and entropy power in linear vector Gaussian channels
 IEEE Trans. Inf. Theory
, 2009
Cited by 11 (5 self)
Abstract—Within the framework of linear vector Gaussian channels with arbitrary signaling, the Jacobians of the minimum mean-square error and Fisher information matrices with respect to arbitrary parameters of the system are calculated in this paper. Capitalizing on prior research where the minimum mean-square error and Fisher information matrices were linked to information-theoretic quantities through differentiation, the Hessians of the mutual information and the entropy are derived. These expressions are then used to assess the concavity properties of mutual information and entropy under different channel conditions, and also to derive a multivariate version of an entropy power inequality due to Costa. Index Terms—Concavity properties, differential entropy, entropy power, Fisher information matrix, Gaussian noise, Hessian matrices, linear vector Gaussian channels, minimum mean-square error.
Estimation in Gaussian Noise: Properties of the minimum mean-square error
 IEEE Trans. Inf. Theory
, 2011
Cited by 10 (4 self)
Abstract—Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable to be estimated). It is shown that the MMSE is concave in the input distribution at any given SNR. For a given input distribution, the MMSE is found to be infinitely differentiable at all positive SNR, and in fact a real analytic function in SNR under mild conditions. The key to these regularity results is that the posterior distribution conditioned on the observation through Gaussian channels always decays at least as quickly as some Gaussian density. Furthermore, simple expressions for the first three derivatives of the MMSE with respect to the SNR are obtained. It is also shown that, as functions of the SNR, the curves for the MMSE of a Gaussian input and that of a non-Gaussian input cross at most once over all SNRs. These properties lead to simple proofs of the facts that Gaussian inputs achieve both the secrecy capacity of scalar Gaussian wiretap channels and the capacity of scalar Gaussian broadcast channels, as well as a simple proof of the entropy power inequality in the special case where one of the variables is Gaussian. Index Terms—Entropy, estimation, Gaussian broadcast channel, Gaussian noise, Gaussian wiretap channel, minimum mean-square error (MMSE), mutual information.
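A concrete illustration of this comparison between inputs (my own sketch, not the paper's): among unit-variance inputs, the Gaussian input has MMSE 1/(1+snr), while an equiprobable binary ±1 input, whose conditional-mean estimator is tanh(√snr·y), falls below it at positive SNR. A Monte Carlo estimate makes the gap visible:

```python
import numpy as np

rng = np.random.default_rng(0)
snr = 2.0
n = 200_000

# Binary +/-1 input observed in unit-variance Gaussian noise.
x = rng.choice([-1.0, 1.0], size=n)
y = np.sqrt(snr) * x + rng.standard_normal(n)

# Conditional-mean estimator for the binary input: E[X|Y] = tanh(sqrt(snr)*Y).
mmse_binary = np.mean((x - np.tanh(np.sqrt(snr) * y)) ** 2)

# Unit-variance Gaussian input at the same SNR.
mmse_gauss = 1.0 / (1.0 + snr)
```

At snr = 2 the Gaussian input yields MMSE 1/3, while the binary input is substantially easier to estimate; the Gaussian input is the hardest unit-variance input at every SNR.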
A vector generalization of Costa's entropy-power inequality with applications
 IEEE Trans. Inf. Theory
, 2010
Cited by 9 (0 self)
This paper considers an entropy-power inequality (EPI) of Costa and presents a natural vector generalization with a real positive semidefinite matrix parameter. This new inequality is proved using a perturbation approach via a fundamental relationship between the derivative of mutual information and the minimum mean-square error (MMSE) estimate in linear vector Gaussian channels. As an application, a new extremal entropy inequality is derived from the generalized Costa EPI and then used to establish the secrecy capacity regions of the degraded vector Gaussian broadcast channel with layered confidential messages. Index Terms—Entropy-power inequality (EPI), extremal entropy inequality, information-theoretic security, mutual information and minimum mean-square error (MMSE) estimate, vector Gaussian broadcast channel.
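Costa's scalar EPI, the inequality being generalized here, asserts that the entropy power $N(X+\sqrt{t}\,Z)$ is concave in $t$ for $Z$ standard Gaussian independent of $X$; concavity is equivalent to lying above the chord:

```latex
N\bigl(X+\sqrt{t}\,Z\bigr) \;\ge\; (1-t)\,N(X) + t\,N(X+Z),
\qquad t\in[0,1],
\quad\text{where } N(X) = \frac{1}{2\pi e}\, e^{2h(X)}.
```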
Optimal transmit covariance for MIMO channels with statistical transmitter side information
 in IEEE Int. Symp. on Inform. Theory, ISIT’05
, 2005
Representation of Mutual Information Via Input Estimates
Cited by 9 (3 self)
Abstract—A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information with the minimum mean-square error. This paper generalizes the link between information theory and estimation theory to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language. Index Terms—Computation of mutual information, extrinsic information, input estimation, low-density parity-check (LDPC) codes, minimum mean-square error (MMSE), mutual information, soft channel decoding.