Results 11–20 of 93
A Low-Complexity Message-Passing Algorithm for Reduced Routing Congestion in LDPC Decoders
Cited by 12 (3 self)
Abstract—A low-complexity message-passing algorithm, called Split-Row Threshold, is used to implement low-density parity-check (LDPC) decoders with reduced layout routing congestion. Five LDPC decoders that are compatible with the 10GBASE-T standard are implemented using Min-Sum Normalized and Min-Sum Split-Row Threshold algorithms. All decoders are built using a standard cell design flow and include all steps through the generation of GDS II layout. An Spn = 16 decoder achieves improvements in area, throughput, and energy efficiency of 4.1 times, 3.3 times, and 4.8 times, respectively, compared to a Min-Sum Normalized implementation. Post-layout results show that a fully parallel Spn = 16 decoder in 65 nm CMOS operates at 195 MHz at 1.3 V with an average throughput of 92.8 Gbits/s with early termination enabled. Low-power operation at 0.7 V gives a worst-case throughput of 6.5 Gbits/s—just above the 10GBASE-T requirement—and an estimated average power of 62 mW, resulting in 9.5 pJ/bit. At 0.7 V with early termination enabled, the throughput is 16.6 Gbits/s, and the energy is 3.7 pJ/bit, which is 5.8 times lower than the previously reported lowest energy per bit. The decoder area is 4.84 mm² with a final post-layout area utilization of 97%. Index Terms—Full parallel, high throughput, low-density parity-check (LDPC), low power, message passing, min-sum, nanometer, 10GBASE-T, 65 nm CMOS, 802.3an.
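The min-sum decoders referenced above share a common check-node operation. As a rough illustration (this is the textbook normalized min-sum update, not the paper's Split-Row Threshold variant; the names `check_node_update` and `alpha` are hypothetical), the per-edge update can be sketched as:

```python
# Hedged sketch of a normalized min-sum check-node update.
# Each outgoing message is the product of the signs of all *other*
# incoming LLRs times their minimum magnitude, scaled by alpha.

def check_node_update(incoming, alpha=0.75):
    """Return one outgoing LLR per edge, excluding that edge's own input."""
    out = []
    for i in range(len(incoming)):
        others = incoming[:i] + incoming[i + 1:]
        sign = 1
        for m in others:
            if m < 0:
                sign = -sign                     # track the sign product
        mag = min(abs(m) for m in others)        # minimum magnitude of the rest
        out.append(alpha * sign * mag)
    return out

msgs = [1.2, -0.4, 2.5]
print(check_node_update(msgs))
```

The normalization factor alpha compensates for min-sum's overestimate of the exact sum-product magnitude; alpha = 0.75 here is a common illustrative choice, not a value taken from the paper.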
Zigangirov, “A comparison between LDPC block and convolutional codes”
 Proc. Inform. Theory and App. Workshop
, 2006
Cited by 12 (6 self)
Abstract — LDPC convolutional codes have been shown to be capable of achieving the same capacity-approaching performance as LDPC block codes with iterative message-passing decoding. However, traditional means of comparing block and convolutional codes tied to the implementation complexity of trellis-based decoding are irrelevant for message-passing decoders. In this paper, we undertake a comparison of LDPC block and convolutional codes based on several factors: encoding complexity, decoding computational complexity, decoding hardware complexity, decoding memory requirements, decoding delay, and VLSI implementation requirements.
Controlling LDPC Absorbing Sets via the Null Space of the Cycle Consistency Matrix
 In Proc. IEEE Int. Conf. on Comm. (ICC)
, 2011
Cited by 11 (8 self)
Abstract — This paper focuses on controlling absorbing sets for a class of regular LDPC codes, known as separable, circulant-based (SCB) codes. For a specified circulant matrix, SCB codes all share a common mother matrix and include array-based LDPC codes and many common quasi-cyclic codes. SCB codes retain standard properties of quasi-cyclic LDPC codes such as girth, code structure, and compatibility with existing high-throughput hardware implementations. This paper uses a cycle consistency matrix (CCM) for each absorbing set of interest in an SCB LDPC code. For an absorbing set to be present in an SCB LDPC code, the associated CCM must not be full column rank. Our approach selects rows and columns from the SCB mother matrix to systematically eliminate dominant absorbing sets by forcing the associated CCMs to be full column rank. Simulation results demonstrate that the new codes have steeper error-floor slopes and provide at least one order of magnitude of improvement in the low FER region. Identifying absorbing-set-spectrum equivalence classes within the family of SCB codes with a specified circulant matrix significantly reduces the search space of possible code matrices.
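The CCM rank condition above reduces to linear algebra over a finite alphabet. As a hedged sketch (the function name is illustrative, and GF(2) stands in for whatever modulus the circulant size actually dictates in the paper's construction), a full-column-rank test via Gaussian elimination looks like:

```python
# Hedged sketch: test whether a 0/1 matrix has full column rank over GF(2)
# via Gaussian elimination. Illustrates the flavor of the CCM rank test;
# the actual field/modulus depends on the code's circulant size.

def has_full_column_rank_gf2(matrix):
    """Return True iff the matrix's GF(2) rank equals its column count."""
    rows = [row[:] for row in matrix]           # work on a copy
    n_cols = len(matrix[0]) if matrix else 0
    rank = 0
    for col in range(n_cols):
        # find a pivot row with a 1 in this column, below the settled rows
        pivot = next((r for r in range(rank, len(rows)) if rows[r][col]), None)
        if pivot is None:
            return False                        # this column adds no rank
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r][col]:
                # eliminate with XOR, i.e. addition over GF(2)
                rows[r] = [a ^ b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return True

print(has_full_column_rank_gf2([[1, 0], [0, 1], [1, 1]]))  # True
print(has_full_column_rank_gf2([[1, 1], [1, 1], [0, 0]]))  # False
```

In the paper's terms, a CCM that fails this kind of test (rank deficient in its columns) signals that the corresponding absorbing set can be present; forcing full column rank eliminates it.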
Deriving good LDPC convolutional codes from LDPC block codes
 IEEE Trans. Inf. Theory
, 2011
Tree-Based Construction of LDPC Codes Having Good Pseudocodeword Weights
, 2005
Cited by 8 (4 self)
We present a tree-based construction of LDPC codes that have minimum pseudocodeword weight equal to or almost equal to the minimum distance, and perform well with iterative decoding. The construction involves enumerating a d-regular tree for a fixed number of layers and employing a connection algorithm based on permutations or mutually orthogonal Latin squares to close the tree. Methods are presented for degrees d = p^s and d = p^s + 1, for p a prime. One class corresponds to the well-known finite-geometry and finite generalized quadrangle LDPC codes; the other codes presented are new. We also present some bounds on pseudocodeword weight for p-ary LDPC codes. Treating these codes as p-ary LDPC codes rather than binary LDPC codes improves their rates, minimum distances, and pseudocodeword weights.
Towards a GBit/s programmable decoder for LDPC convolutional codes
 In Proc. IEEE International Symposium on Circuits and Systems (ISCAS)
, 2007
Cited by 8 (6 self)
Abstract — We analyze the decoding algorithm for regular time-invariant LDPC convolutional codes as a 3D signal processing scheme and derive several parallelization concepts, which were used to design a novel low-complexity programmable decoder architecture with throughput in the range of 1 Gbit/s at moderate system clock frequencies. The synthesis results indicate that the decoder requires relatively small areas, even when high levels of parallelism are used.
Scaling behavior of Convolutional LDPC ensembles over the BEC
 In Proc. IEEE International Symposium on Information Theory (ISIT)
, 2011
Cited by 7 (4 self)
Abstract—We study the scaling behavior of coupled sparse graph codes over the binary erasure channel. In particular, let 2L + 1 be the length of the coupled chain, let M be the number of variables in each of the 2L + 1 local copies, let ℓ be the number of iterations, let Pb denote the bit error probability, and let ε denote the channel parameter. We are interested in how these quantities scale when we let the blocklength (2L + 1)M tend to infinity. Based on empirical evidence we show that the threshold saturation phenomenon is rather stable with respect to the scaling of the various parameters and we formulate some general rules of thumb which can serve as a guide for the design of coding systems based on coupled graphs.
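For background, threshold saturation is measured against the plain (uncoupled) density-evolution recursion for the BEC. A minimal sketch of that baseline, assuming a (3, 6)-regular ensemble and an illustrative function name (this is the standard recursion, not the paper's coupled-chain analysis):

```python
# Hedged sketch: standard density evolution for a (dv, dc)-regular LDPC
# ensemble on the BEC. x is the erasure probability of a variable-to-check
# message; eps is the channel erasure probability.

def bec_density_evolution(eps, dv=3, dc=6, iters=200):
    """Iterate x <- eps * (1 - (1 - x)^(dc-1))^(dv-1), starting from x = eps."""
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
    return x

# Below the (3,6) BP threshold (about 0.4294) the erasure probability dies
# out; above it, the recursion stalls at a nonzero fixed point.
print(bec_density_evolution(0.40))
print(bec_density_evolution(0.45))
```

Spatial coupling, as studied in the abstract above, pushes the effective threshold of this recursion up toward the MAP threshold; the scaling question is how fast that happens in L, M, and ℓ.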
LDPC Absorbing Sets, the Null Space of the Cycle Consistency Matrix, and Tanner's Constructions
 In IEEE Conf. on Info. Theory and its Appl.
, 2011
Cited by 7 (7 self)
Abstract—Dolecek et al. introduced the cycle consistency condition, which is a necessary condition for cycles – and thus the absorbing sets that contain them – to be present in separable circulant-based (SCB) LDPC codes. This paper introduces a cycle consistency matrix (CCM) for each possible absorbing set in an SCB LDPC code. The CCM efficiently enforces the cycle consistency condition for all cycles in a specified absorbing set by spanning its associated binary cycle space. Under certain conditions, a CCM not having full column rank is a necessary and sufficient condition for the LDPC code to contain the absorbing set associated with that CCM. This paper uses the CCM approach to carefully analyze LDPC codes based on the Tanner construction for r = 4 rows of submatrices (i.e., Tanner-construction LDPC codes with column weight 4).
Towards Improved LDPC Code Designs Using Absorbing Set Spectrum Properties
 In Proc. 6th International Symposium on
, 2010
Cited by 6 (5 self)
Abstract—This paper focuses on methods for a systematic modification of the parity-check matrix of regular LDPC codes for improved performance in the low-BER region (i.e., the error floor). A judicious elimination of dominant absorbing sets strictly improves the absorbing set spectrum and thereby improves the code performance. This absorbing set elimination is accomplished without compromising code properties and parameters such as the girth, node degree, and the structure of the parity-check matrix. For a representative class of practical codes we substantiate theoretical analysis with experimental results obtained in the low-BER region. Our results demonstrate at least an order of magnitude improvement of the error floor relative to the original code designs. Given that the conventional code parameters remain intact, the new codes can easily be implemented on existing software or hardware platforms employing high-throughput, compact architectures. As such, the proposed approach provides a step towards improved code design that is compatible with practical implementation constraints.