Results 1 - 10 of 164,195

Turbo decoding of product codes based on the modified adaptive belief propagation algorithm

by Christophe Jego, Warren J. Gross - in Proc. of ISIT, 2007
"... Abstract—This paper introduces the Modified Adaptive Belief Propagation (m-ABP) algorithm, an innovative method for the turbo decoding of product codes based on BCH component codes. The Adaptive Belief Propagation algorithm of Jiang and Narayanan is simplified by moving the matrix adaptation step ou ..."
Cited by 2 (0 self)

Loopy Belief Propagation for Approximate Inference: An Empirical Study

by Kevin P. Murphy, Yair Weiss, Michael I. Jordan - In Proceedings of Uncertainty in AI, 1999
"... Recently, researchers have demonstrated that "loopy belief propagation" --- the use of Pearl's polytree algorithm in a Bayesian network with loops --- can perform well in the context of error-correcting codes. The most dramatic instance of this is the near Shannon-limit performa ..."
Cited by 680 (18 self)
-limit performance of "Turbo Codes" --- codes whose decoding algorithm is equivalent to loopy belief propagation in a chain-structured Bayesian network. In this paper we ask: is there something special about the error-correcting code context, or does loopy propagation work as an approximate

Iterative decoding of binary block and convolutional codes

by Joachim Hagenauer, Elke Offer, Lutz Papke - IEEE Trans. Inform. Theory, 1996
"... Abstract- Iterative decoding of two-dimensional systematic convolutional codes has been termed “turbo ” (de)coding. Using log-likelihood algebra, we show that any decoder can he used which accepts soft inputs-including a priori values-and delivers soft outputs that can he split into three terms: the ..."
Cited by 600 (43 self)
/convolutional component codes, interleaver sizes less than 1000, and for three to six iterations. Index Terms—Concatenated codes, product codes, iterative decoding, “soft-in/soft-out” decoder, “turbo” (de)coding.

Factor Graphs and the Sum-Product Algorithm

by Frank R. Kschischang, Brendan J. Frey, Hans-Andrea Loeliger - IEEE TRANSACTIONS ON INFORMATION THEORY, 1998
"... A factor graph is a bipartite graph that expresses how a "global" function of many variables factors into a product of "local" functions. Factor graphs subsume many other graphical models including Bayesian networks, Markov random fields, and Tanner graphs. Following one simple c ..."
Cited by 1787 (72 self)
be derived as specific instances of the sum-product algorithm, including the forward/backward algorithm, the Viterbi algorithm, the iterative "turbo" decoding algorithm, Pearl's belief propagation algorithm for Bayesian networks, the Kalman filter, and certain fast Fourier transform algorithms.
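For readers unfamiliar with the sum-product algorithm this abstract refers to, below is a minimal Python sketch of exact marginalization by message passing on a tiny chain-structured factor graph with binary variables; the factor tables and variable names are illustrative assumptions, not taken from the paper.

import numpy as np

# Chain-structured factor graph over binary variables x1, x2, x3:
# p(x1, x2, x3) is proportional to f12(x1, x2) * f23(x2, x3).
# The factor tables below are made-up example numbers.
f12 = np.array([[0.9, 0.1],
                [0.2, 0.8]])
f23 = np.array([[0.7, 0.3],
                [0.4, 0.6]])

# Sum-product messages: each factor sums out its other neighbouring variable.
m_f12_to_x2 = f12.sum(axis=0)                           # forward, sums out x1
m_f23_to_x3 = (m_f12_to_x2[:, None] * f23).sum(axis=0)  # forward, sums out x2
m_f23_to_x2 = f23.sum(axis=1)                           # backward, sums out x3
m_f12_to_x1 = (f12 * m_f23_to_x2[None, :]).sum(axis=1)  # backward, sums out x2

def normalize(v):
    return v / v.sum()

print("p(x1):", normalize(m_f12_to_x1))
print("p(x2):", normalize(m_f12_to_x2 * m_f23_to_x2))   # product of incoming messages
print("p(x3):", normalize(m_f23_to_x3))

# Sanity check against brute-force marginalization of the joint table.
joint = f12[:, :, None] * f23[None, :, :]
joint /= joint.sum()
assert np.allclose(normalize(m_f12_to_x2 * m_f23_to_x2), joint.sum(axis=(0, 2)))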

Fusion, Propagation, and Structuring in Belief Networks

by Judea Pearl - ARTIFICIAL INTELLIGENCE, 1986
"... Belief networks are directed acyclic graphs in which the nodes represent propositions (or variables), the arcs signify direct dependencies between the linked propositions, and the strengths of these dependencies are quantified by conditional probabilities. A network of this sort can be used to repre ..."
Cited by 482 (8 self)
with the task of fusing and propagating the impacts of new information through the networks in such a way that, when equilibrium is reached, each proposition will be assigned a measure of belief consistent with the axioms of probability theory. It is shown that if the network is singly connected (e.g. tree

The Capacity of Low-Density Parity-Check Codes Under Message-Passing Decoding

by Thomas J. Richardson, Rüdiger L. Urbanke, 2001
"... In this paper, we present a general method for determining the capacity of low-density parity-check (LDPC) codes under message-passing decoding when used over any binary-input memoryless channel with discrete or continuous output alphabets. Transmitting at rates below this capacity, a randomly chos ..."
Cited by 569 (9 self)
case of belief-propagation decoders, we provide an effective algorithm to determine the corresponding capacity to any desired degree of accuracy. The ideas presented in this paper are broadly applicable and extensions of the general method to low-density parity-check codes over larger alphabets, turbo
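As a concrete illustration of the threshold computation this abstract alludes to, here is a minimal Python sketch of density evolution for the special case of a regular (dv, dc) LDPC ensemble on the binary erasure channel; the recursion is the standard one for this channel, while the bisection helper, parameter values, and function names are illustrative assumptions rather than the paper's general algorithm.

# Density evolution on the binary erasure channel: x is the erasure
# probability of a variable-to-check message, eps the channel erasure
# probability, and one iteration applies
#     x <- eps * (1 - (1 - x)**(dc - 1))**(dv - 1).
# The decoding threshold is (roughly) the largest eps for which x -> 0.
def converges(eps, dv=3, dc=6, iters=2000, tol=1e-12):
    x = eps
    for _ in range(iters):
        x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
        if x < tol:
            return True
    return False

def threshold(dv=3, dc=6, lo=0.0, hi=1.0, steps=30):
    # Bisect on the channel erasure probability eps to approximate the threshold.
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        if converges(mid, dv, dc):
            lo = mid
        else:
            hi = mid
    return lo

print(threshold())   # about 0.429 for the (3,6)-regular ensemble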

Good Error-Correcting Codes based on Very Sparse Matrices

by David J.C. MacKay, 1999
"... We study two families of error-correcting codes defined in terms of very sparse matrices. "MN" (MacKay--Neal) codes are recently invented, and "Gallager codes" were first investigated in 1962, but appear to have been largely forgotten, in spite of their excellent properties. The ..."
Cited by 741 (23 self)
The decoding of both codes can be tackled with a practical sum-product algorithm. We prove that these codes are "very good," in that sequences of codes exist which, when optimally decoded, achieve information rates up to the Shannon limit. This result holds not only for the binary-symmetric channel

Codes and Decoding on General Graphs

by Niclas Wiberg, 1996
"... Iterative decoding techniques have become a viable alternative for constructing high performance coding systems. In particular, the recent success of turbo codes indicates that performance close to the Shannon limit may be achieved. In this thesis, it is showed that many iterative decoding algorithm ..."
Cited by 359 (1 self)

Wavelets and Subband Coding

by Martin Vetterli, Jelena Kovačević, 2007
"... ..."
Abstract - Cited by 608 (32 self) - Add to MetaCart
Abstract not found

A Direct Adaptive Method for Faster Backpropagation Learning: The RPROP Algorithm

by Martin Riedmiller, Heinrich Braun - IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, 1993
"... A new learning algorithm for multilayer feedforward networks, RPROP, is proposed. To overcome the inherent disadvantages of pure gradient-descent, RPROP performs a local adaptation of the weight-updates according to the behaviour of the errorfunction. In substantial difference to other adaptive tech ..."
Cited by 917 (34 self)
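To make the description above concrete, here is a minimal Python sketch of the sign-based, per-weight step-size adaptation the abstract refers to, run on a toy quadratic objective; the constants, function names, and the simplified update rule (no weight backtracking) are illustrative assumptions and may differ from the exact RPROP variant in the paper.

import numpy as np

def rprop(grad_fn, w, steps=100, eta_plus=1.2, eta_minus=0.5,
          step_init=0.1, step_min=1e-6, step_max=50.0):
    # Each weight keeps its own step size: grow it while the partial
    # derivative keeps its sign, shrink it when the sign flips, and move
    # the weight by -sign(gradient) * step.
    step = np.full_like(w, step_init)
    prev_grad = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        same_sign = g * prev_grad
        step = np.where(same_sign > 0, np.minimum(step * eta_plus, step_max), step)
        step = np.where(same_sign < 0, np.maximum(step * eta_minus, step_min), step)
        w = w - np.sign(g) * step
        prev_grad = g
    return w

# Toy objective: f(w) = sum((w - target)**2), with gradient 2 * (w - target).
target = np.array([3.0, -1.0])
print(rprop(lambda w: 2.0 * (w - target), np.zeros(2)))  # ends up near [3, -1]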