Results 1–10 of 277
Effective Erasure Codes for Reliable Computer Communication Protocols
, 1997
Abstract

Cited by 473 (14 self)
Reliable communication protocols require that all the intended recipients of a message receive the message intact. Automatic Repeat reQuest (ARQ) techniques are used in unicast protocols, but they do not scale well to multicast protocols with large groups of receivers, since segment losses tend to become uncorrelated, greatly reducing the effectiveness of retransmissions. In such cases, Forward Error Correction (FEC) techniques can be used, consisting of the transmission of redundant packets (based on error-correcting codes) that allow the receivers to recover from independent packet losses. Despite the widespread use of error-correcting codes in many fields of information processing, and a general consensus on the usefulness of FEC techniques within some of the Internet protocols, very few actual implementations of the latter exist. This probably derives from the different types of applications, and from concerns related to the complexity of implementing such codes in software. To f...
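The packet-recovery idea the abstract describes can be sketched with a toy scheme (not the paper's erasure codes): send k data packets plus one XOR parity packet, so a receiver can repair any single lost packet without requesting a retransmission. Function names here are illustrative.

```python
def make_parity(packets):
    """XOR all equal-length packets together into one redundant parity packet."""
    parity = bytes(len(packets[0]))
    for p in packets:
        parity = bytes(a ^ b for a, b in zip(parity, p))
    return parity

def recover(received, parity):
    """received: the packet list with exactly one entry set to None (lost).

    XORing the parity with every surviving packet cancels them out,
    leaving the missing packet."""
    lost = received.index(None)
    repair = parity
    for i, p in enumerate(received):
        if i != lost:
            repair = bytes(a ^ b for a, b in zip(repair, p))
    return lost, repair
```

For example, with packets `[b"abcd", b"efgh", b"ijkl"]` and the second packet lost, `recover` returns `(1, b"efgh")`. Real FEC schemes (e.g. Reed-Solomon) generalize this to recover multiple losses.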
On the Capacity of Secure Network Coding
Abstract

Cited by 63 (2 self)
We consider the problem of using a multicast network code to transmit information securely in the presence of a "wiretap" adversary who can eavesdrop on a bounded number of network edges. Cai & Yeung (ISIT, 2002) gave a method to alter any given linear network code into a new code that is secure. However, their construction is in general inefficient, and requires a very large field size; in many cases this is much greater than the field size required by standard network code construction algorithms to achieve the min-cut capacity (without a security guarantee). In this paper we generalize and simplify the method of Cai & Yeung, and show that the problem of making a linear network code secure is equivalent to the problem of finding a linear code with certain generalized distance properties. We show that if we give up a small amount of overall capacity, then a random code achieves these properties using a much smaller field size (in some cases a field of constant size suffices) than the construction of Cai & Yeung. We add further support to this approach by showing that if we are not willing to give up any capacity, then a large field size may sometimes be required to achieve security.
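The core wiretap intuition can be illustrated with a minimal toy example (not the paper's construction): to move one secret byte across two edges so that an adversary observing any single edge learns nothing, transmit a uniform random pad on one edge and the padded secret on the other. Either edge alone is uniformly distributed and independent of the secret.

```python
import secrets

def encode(secret_byte):
    """Split one byte into two edge transmissions, each individually
    uniform and therefore useless to a one-edge wiretapper."""
    r = secrets.randbelow(256)          # fresh uniform pad per message
    return r, secret_byte ^ r           # (edge 1 carries r, edge 2 carries s XOR r)

def decode(edge1, edge2):
    """The sink, seeing both edges, recovers the secret by XOR."""
    return edge1 ^ edge2
```

Coset-coding constructions for secure network coding generalize this: randomness is mixed into the transmitted symbols so that any bounded set of observed edges is statistically independent of the message.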
Testing monotone high-dimensional distributions
 In STOC
, 2005
Abstract

Cited by 37 (12 self)
A monotone distribution P over a (partially) ordered domain assigns higher probability to y than to x if y ≥ x in the order. We study several natural problems concerning testing properties of monotone distributions over the n-dimensional Boolean cube, given access to random draws from the distribution being tested. We give a poly(n)-time algorithm for testing whether a monotone distribution is equivalent to or ε-far (in the L1 norm) from the uniform distribution. A key ingredient of the algorithm is a generalization of a known isoperimetric inequality for the Boolean cube. We also introduce a method for proving lower bounds on various problems of testing monotone distributions over the n-dimensional Boolean cube, based on a new decomposition technique for monotone distributions. We use this method to show that our uniformity testing algorithm is optimal up to polylog(n) factors, and also to give exponential lower bounds on the complexity of several other problems, including testing whether a monotone distribution is identical to or ε-far from a fixed known monotone product distribution and approximating the entropy of an unknown monotone distribution.
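The monotonicity property being tested can be made concrete with a brute-force check of the definition (exponential in n, so a sketch of the property itself, not the paper's sublinear tester): a distribution over the Boolean cube is monotone if coordinatewise y ≥ x implies P(y) ≥ P(x).

```python
from itertools import product

def is_monotone(P, n):
    """Brute-force monotonicity check for a distribution P (a dict mapping
    each 0/1 tuple of length n to its probability)."""
    points = list(product([0, 1], repeat=n))
    for x in points:
        for y in points:
            # y dominates x coordinatewise but has lower probability?
            if all(yi >= xi for xi, yi in zip(x, y)) and P[y] < P[x]:
                return False
    return True
```

The uniform distribution is trivially monotone; a distribution putting extra mass on the all-zeros point is not, since (1, ..., 1) dominates (0, ..., 0) in the order.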
Codes for Asymmetric Limited-Magnitude Errors with Application to Multi-Level Flash Memories
Abstract

Cited by 30 (15 self)
Several physical effects that limit the reliability and performance of Multi-Level Flash Memories induce errors that have low magnitudes and are dominantly asymmetric. This paper studies block codes for asymmetric limited-magnitude errors over q-ary channels. We propose code constructions and bounds for such channels when the number of errors is bounded by t and the error magnitudes are bounded by ℓ. The constructions utilize known codes for symmetric errors, over small alphabets, to protect large-alphabet symbols from asymmetric limited-magnitude errors. The encoding and decoding of these codes are performed over the small alphabet, whose size depends only on the maximum error magnitude and is independent of the alphabet size of the outer code. Moreover, the size of the codes is shown to exceed the sizes of known codes (for related error models), and asymptotic rate-optimality results are proved. Extensions of the construction are proposed to accommodate variations on the error model and to include systematic codes as a benefit to practical implementation.
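The small-alphabet idea can be sketched for the hypothetical instance ℓ = 1, t = 1: declare a word x over a large alphabet Z_q a codeword iff x mod 2 lies in the [7,4] Hamming code. A single asymmetric +1 error flips exactly one bit of x mod 2, so the Hamming syndrome locates it and the decoder subtracts 1 at that position; all decoding happens over {0, 1}, independent of q, as the abstract describes.

```python
import numpy as np

# Parity-check matrix of the [7,4] Hamming code: column j-1 is the
# binary expansion of j, so a nonzero syndrome spells out the error position.
H = np.array([[(j >> i) & 1 for j in range(1, 8)] for i in range(3)])

def decode(y):
    """Correct one asymmetric +1 error in a word y over Z_q (sketch, ell=1)."""
    s = H @ (np.array(y) % 2) % 2
    pos = int(s[0] + 2 * s[1] + 4 * s[2])   # syndrome value = error position + 1
    x = list(y)
    if pos:
        x[pos - 1] -= 1                     # asymmetric error is always +1
    return x
```

For instance, corrupting the all-even word (2, 4, 6, 8, 10, 12, 14) by +1 in the fourth symbol is detected via the mod-2 residue and undone exactly.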
MDPC-McEliece: New McEliece Variants from Moderate-Density Parity-Check Codes
Abstract

Cited by 20 (2 self)
Cryptography based on coding theory is believed to resist quantum attacks (all cryptosystems based on factoring/discrete logarithm can be quantum-attacked in polynomial time). The McEliece cryptosystem is the oldest code-based cryptosystem, and its security relies on two problems: the indistinguishability of the code family and the hardness of decoding random linear codes. The former is usually the weaker one. The main drawback of this cryptosystem is its huge public keys. Recently, several attempts to reduce its key size have been proposed. Almost all of them were successfully broken due to the additional algebraic structure used to reduce the keys. In this work, we propose McEliece variants from Moderate-Density Parity-Check (MDPC) codes. These codes are LDPC codes of higher density than what is usually adopted for telecommunication solutions. We show that our proposal strongly strengthens the security against distinguishing attacks and also provides extremely compact keys. Under a reasonable assumption, MDPC codes reduce the distinguishing problem to decoding a linear code, and thus the security of our proposal relies only on a well-studied coding-theory problem. Furthermore, using a quasi-cyclic structure, we provide the smallest public keys for a code-based cryptosystem. For 80 bits of security, the public key has only 4800 bits. In summary, this represents the most competitive code-based cryptosystem ever proposed and is a strong alternative to traditional cryptography.
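Why the quasi-cyclic structure shrinks the key can be sketched with toy parameters (not the paper's 4800-bit proposal): a circulant block is fully determined by its first row, so instead of storing an r × r binary matrix one stores r bits and regenerates the rest by cyclic shifts.

```python
import numpy as np

def circulant(first_row):
    """Expand a stored first row into the full circulant block:
    row i is the first row cyclically shifted right by i positions."""
    r = len(first_row)
    return np.array([np.roll(first_row, i) for i in range(r)])

# Toy moderate-density first row (few ones relative to length).
first_row = [1, 0, 1, 1, 0, 0, 0, 0]
H_block = circulant(first_row)   # an 8x8 matrix described by only 8 stored bits
```

Storage thus drops from r² bits to r bits per block, which is the mechanism behind the compact MDPC public keys.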
Asymptotically Good Ideal Linear Secret Sharing with Strong Multiplication over Any Finite Field
 Proceedings of the 29th Annual IACR CRYPTO
, 2009
Abstract

Cited by 16 (5 self)
This work deals with "MPC-friendly" linear secret sharing schemes (LSSS), a mathematical primitive upon which secure multiparty computation (MPC) can be based and which was introduced by Cramer, Damgaard and Maurer (EUROCRYPT 2000). Chen and Cramer proposed a special class of such schemes that is constructed from algebraic geometry and that enables efficient secure multiparty computation over fixed finite fields (CRYPTO 2006). We extend this in four ways. First, we propose an abstract coding-theoretic framework in which this class of schemes and its (asymptotic) properties can be cast and analyzed. Second, we show that for every finite field Fq, there exists an infinite family of LSSS over Fq that is asymptotically good in the following sense: the schemes are "ideal," i.e., each share consists of a single Fq-element, and the schemes have t-strong multiplication on n players, where the corruption tolerance 3t/(n − 1) tends to a constant ν(q) with 0 < ν(q) < 1 as n tends to infinity.
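The multiplication property of an ideal LSSS can be illustrated with the standard Shamir scheme over a small prime field (a textbook example, not the paper's algebraic-geometry construction): pointwise products of shares of a and b are shares of a·b on a degree-2t polynomial, so any 2t + 1 players can reconstruct the product by Lagrange interpolation. The field size P here is a toy parameter.

```python
import random

P = 101  # toy prime field size

def share(secret, t, n):
    """Shamir sharing: evaluate a random degree-t polynomial with
    constant term `secret` at the points 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]
    return [sum(c * pow(i, k, P) for k, c in enumerate(coeffs)) % P
            for i in range(1, n + 1)]

def reconstruct(points):
    """Lagrange interpolation at x = 0 over F_P from (x, y) pairs."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total
```

With t = 1, multiplying shares doubles the polynomial degree to 2, so three players suffice to recover the product while two suffice for either factor alone.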
Asymmetric quantum LDPC codes
 In Proc. IEEE ISIT
Abstract

Cited by 13 (1 self)
Recently, quantum error-correcting codes were proposed that capitalize on the fact that many physical error models lead to a significant asymmetry between the probabilities for bit-flip and phase-flip errors. An example of a channel which exhibits such asymmetry is the combined amplitude damping and dephasing channel, where the probabilities of bit flips and phase flips can be related to the relaxation and dephasing times, respectively. We give systematic constructions of asymmetric quantum stabilizer codes that exploit this asymmetry. Our approach is based on a CSS construction that combines BCH and finite-geometry LDPC codes.
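A CSS construction requires classical parity-check matrices that commute, i.e. H_X · H_Z^T = 0 (mod 2). This can be checked concretely with a standard example, the Steane [[7,1,3]] code, which uses the self-orthogonal [7,4] Hamming code for both X and Z checks; asymmetric variants like those in the paper instead pick a stronger code for the dominant (phase-flip) error type.

```python
import numpy as np

# [7,4] Hamming parity-check matrix: column j-1 is the binary expansion of j.
H = np.array([[(j >> i) & 1 for j in range(1, 8)] for i in range(3)])

# Symmetric (Steane) choice: the same code checks X and Z errors.
H_X = H_Z = H

def css_compatible(hx, hz):
    """CSS commutation condition: every X check commutes with every Z check."""
    return bool(np.all((hx @ hz.T) % 2 == 0))
```

Here every pair of Hamming check rows overlaps in an even number of positions, so the condition holds and the pair defines a valid stabilizer code.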
An Error-Resilient and Tunable Distributed Indexing Scheme for Wireless Data Broadcast
 IEEE Trans. Knowledge and Data Eng
, 2006
Abstract

Cited by 12 (1 self)
Access efficiency and energy conservation are two critical performance concerns in a wireless data broadcast system. We propose in this paper a novel parameterized index called the exponential index, which has a linear yet distributed structure for wireless data broadcast. Based on two tuning knobs, index base and chunk size, the exponential index can be tuned to optimize the access latency with the tuning time bounded by a given limit, and vice versa. The client access algorithm for the exponential index under unreliable broadcast is described, and a performance analysis of the exponential index is provided. Extensive ns-2-based simulation experiments are conducted to evaluate the performance under various link error probabilities. Simulation results show that the exponential index substantially outperforms the state-of-the-art indexes. In particular, it is more resilient to link errors and achieves more performance advantages from index caching. The results also demonstrate its great flexibility in trading access latency for tuning time.
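The client-side navigation over such an index can be sketched as follows (an illustrative model, not the paper's exact algorithm, and `probes_to_reach` is a hypothetical helper): from the current bucket, index entries point roughly 1, base, base², ... buckets ahead on the broadcast cycle, so the client repeatedly takes the largest jump that does not overshoot and reaches any of N buckets in O(log N) probes.

```python
def probes_to_reach(start, target, n_buckets, base=2):
    """Count index probes needed to navigate from bucket `start` to
    bucket `target` on a cyclic broadcast of n_buckets buckets,
    jumping by the largest available power of `base` each time."""
    pos, probes = start, 0
    dist = (target - start) % n_buckets   # remaining distance along the cycle
    while dist > 0:
        jump = 1
        while jump * base <= dist:        # largest power-of-base jump that fits
            jump *= base
        pos = (pos + jump) % n_buckets
        dist -= jump
        probes += 1
    return probes
```

With base = 2 the probe count equals the number of ones in the binary expansion of the distance, which is what bounds the tuning time logarithmically; raising the base trades fewer index entries per bucket for more probes.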