Results 1–10 of 37
Binary intersymbol interference channels: Gallager codes, density evolution and code performance bounds
IEEE Trans. Inform. Theory, 2003
"... We study the limits of performance of Gallager codes (lowdensity paritycheck (LDPC) codes) over binary linear intersymbol interference (ISI) channels with additive white Gaussian noise (AWGN). Using the graph representations of the channel, the code, and the sum–product messagepassing detector/d ..."
Cited by 49 (4 self)

Abstract
We study the limits of performance of Gallager codes (low-density parity-check (LDPC) codes) over binary linear intersymbol interference (ISI) channels with additive white Gaussian noise (AWGN). Using the graph representations of the channel, the code, and the sum–product message-passing detector/decoder, we prove two error concentration theorems. Our proofs expand on previous work by handling complications introduced by the channel memory. We circumvent these problems by considering not just linear Gallager codes but also their cosets, and by distinguishing between different types of message flow neighborhoods depending on the actual transmitted symbols. We compute the noise tolerance threshold using a suitably developed density evolution algorithm and verify, by simulation, that the thresholds represent accurate predictions of the performance of the iterative sum–product algorithm for finite (but large) block lengths. We also demonstrate that for high rates, the thresholds are very close to the theoretical limit of performance for Gallager codes over ISI channels. If C denotes the capacity of a binary ISI channel and C_iid denotes the maximal achievable mutual information rate when the channel inputs are independent and identically distributed (i.i.d.) binary random variables (C_iid ≤ C), we prove that the maximum information rate achievable by the sum–product decoder of a Gallager (coset) code is upper-bounded by C_iid. The last topic investigated is the performance limit of the decoder if the trellis portion of the sum–product algorithm is executed only once; this demonstrates the potential for trading off the computational requirements and the performance of the decoder.
Computation of Symbol-Wise Mutual Information in Transmission Systems with Log-APP Decoders and Application to EXIT Charts
2004
"... The symbolwise mutual information between the binary inputs of a channel encoder and the softoutputs of a LogAPP decoder, i.e., the aposteriori loglikelihood ratios (LLRs), is analyzed. This mutual information can be expressed as the expectation of a function of solely the absolute values of the ..."
Cited by 16 (2 self)

Abstract
The symbol-wise mutual information between the binary inputs of a channel encoder and the soft outputs of a Log-APP decoder, i.e., the a-posteriori log-likelihood ratios (LLRs), is analyzed. This mutual information can be expressed as the expectation of a function of solely the absolute values of the a-posteriori LLRs. This result provides a simple and elegant method for computing the mutual information by simulation. As opposed to the conventional method, explicit measurements of histograms of the soft outputs are not necessary. In fact, online estimation is possible, and bits having different statistical properties need not be treated separately. As a direct application, the computation of extrinsic information transfer (EXIT) charts is considered.
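The expectation described above has a well-known closed form in the LLR magnitudes: for exact a-posteriori LLRs of equiprobable bits, I = 1 − E[h_b(1/(1 + e^{|L|}))], where h_b is the binary entropy function. A minimal sketch, using exact channel LLRs of BPSK over AWGN as a stand-in for decoder soft outputs (all function names and parameters below are my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def mi_from_abs_llr(abs_llr):
    """Estimate symbol-wise mutual information from |LLR| samples only.

    For exact (consistent) a-posteriori LLRs L of equiprobable bits,
    I = 1 - E[ h_b( 1 / (1 + e^{|L|}) ) ] -- no histograms needed.
    """
    p = 1.0 / (1.0 + np.exp(abs_llr))      # error probability given |L|
    p = np.clip(p, 1e-12, 0.5)             # guard against underflow
    hb = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return 1.0 - np.mean(hb)

def bpsk_awgn_llrs(snr_db, n):
    """Exact channel LLRs for BPSK (+1/-1) over AWGN: L = 2y / sigma^2."""
    sigma2 = 10.0 ** (-snr_db / 10.0)
    x = rng.choice([-1.0, 1.0], size=n)
    y = x + rng.normal(scale=np.sqrt(sigma2), size=n)
    return 2.0 * y / sigma2

I_low  = mi_from_abs_llr(np.abs(bpsk_awgn_llrs(-2.0, 200_000)))
I_high = mi_from_abs_llr(np.abs(bpsk_awgn_llrs(4.0, 200_000)))
print(I_low, I_high)   # mutual information grows with SNR, both in (0, 1)
```

Because the estimator averages a function of |L| alone, it can be updated online, sample by sample, exactly as the abstract notes.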
Matched information rate codes for partial response channels
IEEE Trans. Inform. Theory, 2005
"... In this paper we design capacityapproaching codes for partial response channels. The codes are constructed as concatenations of inner trellis codes and outer lowdensity paritycheck (LDPC) codes. Unlike previous constructions of trellis codes for partial response channels, we disregard any algebra ..."
Cited by 12 (3 self)

Abstract
In this paper we design capacity-approaching codes for partial response channels. The codes are constructed as concatenations of inner trellis codes and outer low-density parity-check (LDPC) codes. Unlike previous constructions of trellis codes for partial response channels, we disregard any algebraic properties (e.g., the minimum distance or the run-length limit) in our design of the trellis code. Our design is purely probabilistic in that we construct the inner trellis code to mimic the transition probabilities of a Markov process that achieves a high (capacity-approaching) information rate. Hence, we name it a matched information rate (MIR) design. We provide a set of five design rules for the construction of capacity-approaching MIR inner trellis codes. We optimize the outer LDPC code using density evolution tools specially modified to fit the super-channel consisting of the inner MIR trellis code concatenated with the partial response channel. Using this strategy, we design degree sequences of irregular LDPC codes whose noise tolerance thresholds are only fractions of a decibel away from capacity. Examples of code constructions are shown for channels both with and without spectral nulls.
On the Capacity Loss due to Separation of Detection and Decoding
2002
"... The performance loss due to separation of detection and decoding on the binaryinput additive white Gaussian noise channel is quantified in terms of mutual information. Results are reported for both the codedivision multipleaccess (CDMA) channel in the large system limit and the intersymbol interf ..."
Cited by 9 (2 self)

Abstract
The performance loss due to separation of detection and decoding on the binary-input additive white Gaussian noise channel is quantified in terms of mutual information. Results are reported for both the code-division multiple-access (CDMA) channel in the large-system limit and the intersymbol interference (ISI) channel. The results for CDMA rely on the replica method developed in statistical mechanics. It is shown that a previous result in [1], found for a Gaussian input alphabet, also holds for binary input alphabets. For the ISI channel, the performance loss is calculated via the BCJR algorithm. Comparisons are made to the capacity of separate detection and decoding using suboptimum detectors such as a decision-feedback equalizer.
Determining and approaching achievable rates of binary intersymbol interference channels using multistage decoding
IEEE Trans. Inform. Theory, 2007
"... Abstract—By examining the achievable rates of a multistage decoding system on stationary ergodic channels, we derive lower bounds on the mutual information rate corresponding to independent and uniformly distributed (i.u.d.) inputs, also referred to as the i.u.d. information rate. For binary intersy ..."
Cited by 9 (4 self)

Abstract
By examining the achievable rates of a multistage decoding system on stationary ergodic channels, we derive lower bounds on the mutual information rate corresponding to independent and uniformly distributed (i.u.d.) inputs, also referred to as the i.u.d. information rate. For binary intersymbol interference (ISI) channels, we show that these bounds become tight as the number of decoding stages increases. Our analysis, which focuses on the marginal conditional output densities at each stage of decoding, provides an information rate corresponding to each stage. These rates underlie the design of multilevel coding schemes, based upon low-density parity-check (LDPC) codes and message passing, that in combination with multistage decoding approach the i.u.d. information rate for binary ISI channels. We give example constructions for channel models that have been commonly used in magnetic recording. These examples demonstrate that the technique is very effective even for a small number of decoding stages. Index Terms: Bahl–Cocke–Jelinek–Raviv (BCJR) algorithm, coset codes, density evolution, finite-state channels, information rates, intersymbol interference (ISI) channels, low-density parity-check (LDPC) codes, magnetic recording, multilevel coding, multistage decoding.
Achievable Information Rates and Coding for MIMO Systems over ISI Channels and Frequency-Selective Fading Channels
IEEE Trans. Commun., 2004
"... We propose a simulationbased method to compute the achievable information rates for general multipleinput multiple output (MIMO) intersymbol interference (ISI) channels with inputs chosen from a finite alphabet. This method is applicable to both deterministic and stochastic channels. As an exampl ..."
Cited by 9 (2 self)

Abstract
We propose a simulation-based method to compute the achievable information rates for general multiple-input multiple-output (MIMO) intersymbol interference (ISI) channels with inputs chosen from a finite alphabet. This method is applicable to both deterministic and stochastic channels. As an example of stochastic MIMO ISI channels, we consider multi-antenna systems over frequency-selective fading channels, and quantify the improvement in the achievable information rates provided by the additional frequency diversity (for both ergodic and non-ergodic cases). In addition, we consider the multi-access multi-antenna system and present some results on the achievable information rate region. As for deterministic MIMO ISI channels, we use the binary-input multi-track magnetic recording system as an example, which employs multiple write and read heads for data storage. Our results show that multi-track recording channels have significant advantages over single-track channels in terms of the achievable information rates when inter-track interference is considered. We further consider practical coding schemes over both stochastic and deterministic MIMO ISI channels, and compare their performance with the information-theoretic limits. Specifically, we demonstrate that the performance of the turbo coding/decoding scheme is only about 1.0 dB away from the information-theoretic limits at a bit-error rate of 10^-5 for large interleaver lengths.
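The simulation-based computation referred to here estimates the information rate as (1/n)[log2 p(y|x) − log2 p(y)] over one long simulated realization, with log2 p(y) obtained from the normalized forward recursion of the BCJR algorithm. A minimal single-track sketch for the binary dicode (1 − D) channel with i.u.d. inputs (all names below are mine; the paper's method covers general MIMO ISI channels):

```python
import numpy as np

rng = np.random.default_rng(1)

def iud_info_rate_dicode(sigma, n=20_000):
    """Monte Carlo estimate of the i.u.d. information rate (bits/channel use)
    of the dicode channel y_k = x_k - x_{k-1} + n_k, with x_k in {-1, +1}
    equiprobable and n_k ~ N(0, sigma^2)."""
    x = rng.choice([-1.0, 1.0], size=n + 1)
    s = x[1:] - x[:-1]                              # noiseless outputs
    y = s + rng.normal(scale=sigma, size=n)

    def log2_gauss(v, mean):
        return (-0.5 * ((v - mean) / sigma) ** 2
                - 0.5 * np.log(2.0 * np.pi * sigma ** 2)) / np.log(2.0)

    # log2 p(y | x): outputs are conditionally independent Gaussians
    log_p_y_given_x = log2_gauss(y, s).sum()

    # log2 p(y): normalized forward (alpha) recursion; the trellis state is
    # the previous input, and each branch carries the i.u.d. prior 1/2
    states = np.array([-1.0, 1.0])
    alpha = np.array([0.5, 0.5])
    log_p_y = 0.0
    for yk in y:
        new_alpha = np.empty(2)
        for j, xk in enumerate(states):             # input xk -> next state j
            new_alpha[j] = 0.5 * np.sum(alpha * 2.0 ** log2_gauss(yk, xk - states))
        z = new_alpha.sum()
        log_p_y += np.log2(z)
        alpha = new_alpha / z
    return (log_p_y_given_x - log_p_y) / n

print(iud_info_rate_dicode(sigma=0.5))              # between 0 and 1 bit/use
```

Lowering the noise drives the estimate toward 1 bit per channel use; raising it drives the estimate toward 0.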
Matched information rate codes for binary ISI channels
in Proc. IEEE Int. Symp. Information Theory, 2002
"... Abstract — We propose a coding/decoding strategy to approach the channel capacities for binary intersymbol interference (ISI) channels. The proposed codes are serially concatenated codes: inner matched information rate codes and outer irregular lowdensity paritycheck (LDPC) codes. The whole system ..."
Cited by 8 (1 self)

Abstract
We propose a coding/decoding strategy to approach the channel capacities of binary intersymbol interference (ISI) channels. The proposed codes are serially concatenated: inner matched information rate codes and outer irregular low-density parity-check (LDPC) codes. The whole system is iteratively decodable. Binary ISI channel models are appropriate models for information storage systems [1]. The behavior of such a channel can be represented by a trellis [2]. Here we describe a more general time-invariant trellis model. At each time, the trellis has a fixed number of states, and the same number of branches emanates from each state. A branch is determined by a four-tuple: the two states it connects, a binary input vector, and a real-valued output vector, where the input and output vectors are determined uniquely by the pair of states. We assume that the trellis represents an indecomposable finite-state machine and that the initial state is given. The trellis can be considered as an encoder which transforms a (binary) input sequence into a (real-valued) output sequence. Assume that the output sequence is transmitted through a known memoryless channel and a noisy observation is received. The (average) mutual information [3] between the trellis input and the channel output is then a function of the collection of branch conditional output probabilities.
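The trellis-as-encoder view in this summary can be made concrete with a toy branch table. The sketch below uses the two-state dicode trellis as a hypothetical stand-in for the paper's general construction: each branch is a four-tuple of state, next state, input bits, and output values, with inputs and outputs determined by the state pair (all names are mine):

```python
def make_dicode_trellis():
    """Branch table for a two-state, time-invariant trellis.

    State = previous bipolar input (0 -> -1, 1 -> +1). Each entry maps
    (state, input bits) to (next state, real-valued output), i.e., one
    branch four-tuple of the kind described in the summary.
    """
    branches = {}
    for state in (0, 1):
        for bit in (0, 1):
            prev, cur = 2 * state - 1, 2 * bit - 1
            branches[(state, (bit,))] = (bit, (cur - prev,))
    return branches

def trellis_encode(branches, bits, initial_state=0):
    """Transform a binary sequence into a real-valued sequence."""
    state, out = initial_state, []
    for b in bits:
        state, y = branches[(state, (b,))]
        out.extend(y)
    return out

print(trellis_encode(make_dicode_trellis(), [1, 1, 0, 1]))   # [2, 0, -2, 2]
```

Here the trellis has one input bit and one real output per branch; the general model of the summary allows vectors of each, giving encoders of other rates.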
Upper bounds on the capacities of non-controllable finite-state channels using dynamic programming methods
in Proc. IEEE Int. Symp. Information Theory, Seoul, South Korea, 2009
"... Noncontrollable finitestate channels (FSCs) are finitestate channels in which the user cannot control the channel state, i.e., the state evolves freely in time. Thus far, good upper bounds on the capacities as well as computable capacities of general noncontrollable FSCs with/without feedback ar ..."
Cited by 8 (0 self)

Abstract
Non-controllable finite-state channels (FSCs) are finite-state channels in which the user cannot control the channel state, i.e., the state evolves freely in time. Thus far, neither good upper bounds on the capacities nor computable capacities of general non-controllable FSCs, with or without feedback, are known. Here we consider the delayed channel state as part of the channel input and then mathematically define the directed information between the new channel input (including the source and the delayed channel state) and the channel output. With this technique, computable upper bounds on the capacities of non-controllable FSCs with and without feedback are developed. The upper bounds are achieved by conditional Markov sources, conditioned on the delayed feedback and the delayed state information. A dynamic programming method is proposed to optimize the conditional Markov sources, and the bounds are numerically computed by Monte Carlo techniques.
On near-capacity coding systems for partial-response channels
in Proc. IEEE Int. Symp. Information Theory, 2004
"... Abstract — We present a nearcapacity coding system for higherorder partialresponse channels, consisting of an outer set of interleaved lowdensity paritycheck codes, an inner rate1 shaping code, and a multistage decoder. The inner shaping code, which may be noninvertible, is designed to genera ..."
Cited by 7 (3 self)

Abstract
We present a near-capacity coding system for higher-order partial-response channels, consisting of an outer set of interleaved low-density parity-check codes, an inner rate-1 shaping code, and a multistage decoder. The inner shaping code, which may be non-invertible, is designed to generate an output process similar to a binary Markov process that maximizes the mutual information for a given order. On the EPR4 channel, our system exhibits an iterative decoding threshold and a simulated BER of 10^-5 within 0.19 and 0.33 dB, respectively, of the information-theoretic limit for a third-order input process.
Joint Iterative Decoding of LDPC Codes and Channels with Memory
2003
"... This paper considers the joint iterative decoding of irregular lowdensity paritycheck (LDPC) codes and channels with memory. It begins by introducing a new class of erasure channels with memory, known as generalizederasure channels. For these channels, a single parameter recursion for the density ..."
Cited by 5 (2 self)

Abstract
This paper considers the joint iterative decoding of irregular low-density parity-check (LDPC) codes and channels with memory. It begins by introducing a new class of erasure channels with memory, known as generalized-erasure channels. For these channels, a single-parameter recursion for the density evolution of the joint iterative decoder is derived. This provides a necessary and sufficient condition for decoder convergence, and allows the algebraic construction of sequences of LDPC degree distributions. Under certain conditions, these sequences can achieve the symmetric information rate (SIR) of the channel using only iterative decoding. Example code sequences are given for two channels, and it is conjectured that they each achieve the respective SIR. Keywords: joint iterative decoding, erasure channel, capacity-achieving, LDPC codes
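For the memoryless binary erasure channel, the single-parameter recursion mentioned above reduces to the classical density-evolution update x_{l+1} = ε·λ(1 − ρ(1 − x_l)); the generalized-erasure recursion of the paper has the same one-dimensional character. A minimal sketch (the ensemble and threshold below are the standard regular (3,6) example, not taken from the paper):

```python
def bec_de_converges(eps, lam, rho, iters=2000, tol=1e-9):
    """Single-parameter density evolution for an LDPC ensemble on the BEC.

    Iterates x_{l+1} = eps * lam(1 - rho(1 - x_l)) from x_0 = eps and
    returns True if the erasure probability is driven to (near) zero,
    i.e., if eps is below the ensemble's decoding threshold.
    """
    x = eps
    for _ in range(iters):
        x = eps * lam(1.0 - rho(1.0 - x))
        if x < tol:
            return True
    return False

# Regular (3,6) ensemble: lam(x) = x^2, rho(x) = x^5; BEC threshold ~ 0.4294
lam = lambda x: x ** 2
rho = lambda x: x ** 5
print(bec_de_converges(0.40, lam, rho), bec_de_converges(0.45, lam, rho))  # True False
```

Scanning eps for the largest value that still converges recovers the threshold numerically, which is how degree-distribution sequences of the kind constructed in the paper are checked against the SIR.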