Results 1–10 of 4,827
Explicit Capacity-Achieving List-Decodable Codes
 In Proceedings of the 38th Annual ACM Symposium on Theory of Computing (STOC)
, 2006
"... For every 0 < R < 1 and ε > 0, we present an explicit construction of error-correcting codes of rate R that can be list decoded in polynomial time up to a fraction (1 − R − ε) of errors. These codes achieve the “capacity” for decoding from adversarial errors, i.e., achieve the optimal trade ..."
Cited by 26 (9 self)
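A quick numeric illustration of the trade-off stated above (plain arithmetic only, not the paper's construction): list decoding up to a 1 − R − ε fraction of errors nearly doubles the classical unique-decoding radius (1 − R)/2.

```python
# Hypothetical helper names; the formulas are the only content taken from the entry above.

def list_decoding_radius(R, eps):
    """Fraction of adversarial errors correctable by capacity-achieving list decoding."""
    return 1 - R - eps

def unique_decoding_radius(R):
    """Classical unique-decoding radius at rate R (half the Singleton distance)."""
    return (1 - R) / 2

R, eps = 0.5, 0.01
print(list_decoding_radius(R, eps))   # 0.49
print(unique_decoding_radius(R))      # 0.25
```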
The Complexity of Local List Decoding
"... We study the complexity of locally list-decoding binary error-correcting codes with good parameters (that are polynomially related to information-theoretic bounds). We show that computing majority over Θ(1/ε) bits is essentially equivalent to locally list-decoding binary codes from relative distance ..."
Cited by 4 (1 self)
Iterative decoding of binary block and convolutional codes
 IEEE TRANS. INFORM. THEORY
, 1996
"... Iterative decoding of two-dimensional systematic convolutional codes has been termed “turbo” (de)coding. Using log-likelihood algebra, we show that any decoder can be used which accepts soft inputs, including a priori values, and delivers soft outputs that can be split into three terms: the soft channel and a priori inputs, and the extrinsic value. The extrinsic value is used as an a priori value for the next iteration. Decoding algorithms in the log-likelihood domain are given not only for convolutional codes but also for any linear binary systematic block code. The iteration is controlled by a ..."
Cited by 610 (43 self)
Concatenated codes can achieve list-decoding capacity
"... We prove that binary linear concatenated codes with an outer algebraic code (specifically, a folded Reed-Solomon code) and independently and randomly chosen linear inner codes achieve the list-decoding capacity with high probability. In particular, for any 0 < ρ < 1/2 and ε > 0, there exist ..."
Cited by 6 (5 self)
Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers
 JOURNAL OF MACHINE LEARNING RESEARCH
, 2000
"... We present a unifying framework for studying the solution of multiclass categorization problems by reducing them to multiple binary problems that are then solved using a margin-based binary learning algorithm. The proposed framework unifies some of the most popular approaches in which each class is compared against all others, or in which all pairs of classes are compared to each other, or in which output codes with error-correcting properties are used. We propose a general method for combining the classifiers generated on the binary problems, and we prove a general empirical multiclass loss bound ..."
Cited by 561 (20 self)
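The three reductions named above can all be written as coding matrices over {−1, 0, +1}, where column l assigns each class its label in binary problem l (0 meaning the class is unused). A minimal sketch, with hypothetical function names, of the one-vs-all and all-pairs matrices:

```python
from itertools import combinations

def one_vs_all_matrix(k):
    # Column l: class l is the positive class, every other class is negative.
    return [[1 if r == c else -1 for c in range(k)] for r in range(k)]

def all_pairs_matrix(k):
    pairs = list(combinations(range(k), 2))
    # Column (i, j): class i is positive, class j is negative, others unused (0).
    return [[1 if r == i else -1 if r == j else 0 for (i, j) in pairs]
            for r in range(k)]

print(one_vs_all_matrix(3))  # [[1, -1, -1], [-1, 1, -1], [-1, -1, 1]]
print(all_pairs_matrix(3))   # [[1, 1, 0], [-1, 0, 1], [0, -1, -1]]
```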
Near Shannon limit error-correcting coding and decoding
, 1993
"... This paper deals with a new class of convolutional codes called Turbo-codes, whose performances in terms of Bit Error Rate (BER) are close to the Shannon limit. The Turbo-Code encoder is built using a parallel concatenation of two Recursive Systematic Convolutional codes, and the associated decoder, using a feedback decoding rule, is implemented as P pipelined identical elementary decoders. Consider a binary rate R = 1/2 convolutional encoder with constraint length K and memory M = K − 1. The input to the encoder at time k is a bit d_k and the corresponding codeword ..."
Cited by 1776 (6 self)
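A rate-1/2 convolutional encoder like the one described above emits two parity bits per input bit from a shift register of length M = K − 1. A minimal sketch (not the paper's recursive systematic encoder; this is a plain nonsystematic example with the classic octal (7, 5) generators at K = 3):

```python
def conv_encode(bits, g1=0b111, g2=0b101, memory=2):
    """Rate-1/2 convolutional encoding: two output bits per input bit."""
    state = 0  # shift register holding the last `memory` input bits
    out = []
    for d in bits:
        reg = (d << memory) | state               # [d_k, d_{k-1}, ..., d_{k-M}]
        out.append(bin(reg & g1).count("1") % 2)  # parity w.r.t. generator g1
        out.append(bin(reg & g2).count("1") % 2)  # parity w.r.t. generator g2
        state = reg >> 1                          # shift: drop the oldest bit
    return out

# Impulse response interleaves the two generators 111 and 101:
print(conv_encode([1, 0, 0]))  # [1, 1, 1, 0, 1, 1]
```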
Good Error-Correcting Codes based on Very Sparse Matrices
, 1999
"... We study two families of error-correcting codes defined in terms of very sparse matrices. "MN" (MacKay-Neal) codes are recently invented, and "Gallager codes" were first investigated in 1962, but appear to have been largely forgotten, in spite of their excellent properties. The decoding of both codes can be tackled with a practical sum-product algorithm. We prove that these codes are "very good," in that sequences of codes exist which, when optimally decoded, achieve information rates up to the Shannon limit. This result holds not only for the binary-symmetric channel ..."
Cited by 750 (23 self)
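For codes defined by sparse parity-check matrices, even the simplest hard-decision decoder is instructive. A minimal sketch (hypothetical code, not MacKay's sum-product implementation) of Gallager-style bit-flipping, which repeatedly flips the bit appearing in the most unsatisfied checks:

```python
def bitflip_decode(H, word, max_iters=20):
    """H: list of checks, each a list of the bit positions it sums (mod 2)."""
    word = list(word)
    for _ in range(max_iters):
        unsat = [row for row in H if sum(word[j] for j in row) % 2 == 1]
        if not unsat:
            return word  # all parity checks satisfied
        votes = {}       # per bit, how many unsatisfied checks it appears in
        for row in unsat:
            for j in row:
                votes[j] = votes.get(j, 0) + 1
        word[max(votes, key=votes.get)] ^= 1  # flip the worst offender
    return word

# Toy example: 3 checks on 7 bits; a single error at position 2 violates all
# three checks, so that bit gets flipped first and decoding converges.
H = [[0, 1, 2, 4], [1, 2, 3, 5], [0, 2, 3, 6]]
received = [0, 0, 1, 0, 0, 0, 0]  # all-zero codeword with one bit flipped
print(bitflip_decode(H, received))  # [0, 0, 0, 0, 0, 0, 0]
```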
Solving multiclass learning problems via error-correcting output codes
 JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH
, 1995
"... Multiclass learning problems involve finding a definition for an unknown function f(x) whose range is a discrete set containing k > 2 values (i.e., k "classes"). The definition is acquired by studying collections of training examples of the form ⟨x_i, f(x_i)⟩. Existing approaches to multiclass learning problems include direct application of multiclass algorithms such as the decision-tree algorithms C4.5 and CART, application of binary concept learning algorithms to learn individual binary functions for each of the k classes, and application of binary concept learning algorithms with distributed ..."
Cited by 726 (8 self)
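The error-correcting output code idea can be sketched in a few lines (a toy illustration, not the paper's implementation): each class gets a binary codeword, one binary classifier predicts each bit, and the final prediction is the class whose codeword is nearest in Hamming distance, so a few misfiring classifiers can be tolerated.

```python
def hamming(a, b):
    """Number of positions at which two bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def ecoc_decode(predicted_bits, codebook):
    """Return the class whose codeword is closest to the predicted bit string."""
    return min(codebook, key=lambda cls: hamming(predicted_bits, codebook[cls]))

codebook = {  # 4 classes, 7-bit codewords (toy example, not from the paper)
    "a": [0, 0, 0, 0, 0, 0, 0],
    "b": [0, 1, 1, 1, 1, 0, 0],
    "c": [1, 0, 1, 1, 0, 1, 0],
    "d": [1, 1, 0, 1, 0, 0, 1],
}
# Even with one classifier bit flipped, decoding recovers the right class:
print(ecoc_decode([0, 1, 1, 1, 1, 0, 1], codebook))  # b
```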
Better binary list-decodable codes via multilevel concatenation
 In Proceedings of the 11th International Workshop on Randomization and Computation (RANDOM)
, 2007
"... We give a polynomial time construction of binary codes with the best currently known trade-off between rate and error-correction radius. Specifically, we obtain linear codes over fixed alphabets that can be list decoded in polynomial time up to the so-called Blokh-Zyablov bound. Our work ..."
Cited by 7 (6 self)
Eraser: a dynamic data race detector for multithreaded programs
 ACM Transactions on Computer Systems
, 1997
"... Multithreaded programming is difficult and error prone. It is easy to make a mistake in synchronization that produces a data race, yet it can be extremely hard to locate this mistake during debugging. This paper describes a new tool, called Eraser, for dynamically detecting data races in lock-based ..."
Cited by 688 (2 self)
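Eraser's core lockset refinement can be sketched briefly (hypothetical code, not the tool itself): for each shared variable, maintain the intersection of the locks held at every access; if the intersection ever becomes empty, no single lock consistently protects the variable, signalling a potential race.

```python
class LocksetChecker:
    """Toy lockset refinement: tracks candidate locks per shared variable."""

    def __init__(self):
        self.candidates = {}  # variable -> locks that protected every access so far

    def access(self, var, locks_held):
        """Record an access to `var` under `locks_held`; False => potential race."""
        if var not in self.candidates:
            self.candidates[var] = set(locks_held)
        else:
            self.candidates[var] &= set(locks_held)  # lockset refinement
        return len(self.candidates[var]) > 0

checker = LocksetChecker()
assert checker.access("counter", {"mu"})            # C(counter) = {mu}
assert checker.access("counter", {"mu", "other"})   # intersection stays {mu}
assert not checker.access("counter", {"other"})     # intersection empty: race!
```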