Results 1–10 of 366
A comparison of mechanisms for improving TCP performance over wireless links
 IEEE/ACM TRANSACTIONS ON NETWORKING
, 1997
"... Reliable transport protocols such as TCP are tuned to perform well in traditional networks where packet losses occur mostly because of congestion. However, networks with wireless and other lossy links also suffer from significant losses due to bit errors and handoffs. TCP responds to all losses by i ..."
Abstract

Cited by 770 (11 self)
Reliable transport protocols such as TCP are tuned to perform well in traditional networks where packet losses occur mostly because of congestion. However, networks with wireless and other lossy links also suffer from significant losses due to bit errors and handoffs. TCP responds to all losses by invoking congestion control and avoidance algorithms, resulting in degraded end-to-end performance in wireless and lossy systems. In this paper, we compare several schemes designed to improve the performance of TCP in such networks. We classify these schemes into three broad categories: end-to-end protocols, where loss recovery is performed by the sender; link-layer protocols, which provide local reliability; and split-connection protocols, which break the end-to-end connection into two parts at the base station. We present the results of several experiments performed in both LAN and WAN environments, using throughput and goodput as the metrics for comparison. Our results show that a reliable link-layer protocol that is TCP-aware provides very good performance. Furthermore, it is possible to achieve good performance without splitting the end-to-end connection at the base station. We also demonstrate that selective acknowledgments and explicit loss notifications result in significant performance improvements.
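The comparison above rests on the throughput/goodput distinction. A minimal sketch of the two metrics (the transfer numbers below are hypothetical, not from the paper's experiments):

```python
def throughput(total_bytes_sent, duration_s):
    """Throughput counts every byte on the wire, retransmissions included."""
    return total_bytes_sent / duration_s

def goodput(unique_bytes_delivered, duration_s):
    """Goodput counts only distinct application bytes delivered to the receiver."""
    return unique_bytes_delivered / duration_s

# Hypothetical transfer: a 1 MB file, 10% of bytes retransmitted, 8 s elapsed.
file_bytes = 1_000_000
sent_bytes = file_bytes + file_bytes // 10   # retransmissions inflate wire traffic

print(throughput(sent_bytes, 8.0))   # bytes/s on the wire
print(goodput(file_bytes, 8.0))      # bytes/s of useful data
```

Lossy links widen the gap between the two numbers, which is why the paper reports both.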
Effective Erasure Codes for Reliable Computer Communication Protocols
, 1997
"... Reliable communication protocols require that all the intended recipients of a message receive the message intact. Automatic Repeat reQuest (ARQ) techniques are used in unicast protocols, but they do not scale well to multicast protocols with large groups of receivers, since segment losses tend to b ..."
Abstract

Cited by 413 (14 self)
Reliable communication protocols require that all the intended recipients of a message receive the message intact. Automatic Repeat reQuest (ARQ) techniques are used in unicast protocols, but they do not scale well to multicast protocols with large groups of receivers, since segment losses tend to become uncorrelated, greatly reducing the effectiveness of retransmissions. In such cases, Forward Error Correction (FEC) techniques can be used, consisting of the transmission of redundant packets (based on error-correcting codes) that allow the receivers to recover from independent packet losses. Despite the widespread use of error-correcting codes in many fields of information processing, and a general consensus on the usefulness of FEC techniques within some of the Internet protocols, very few actual implementations of the latter exist. This probably derives from the different types of applications, and from concerns related to the complexity of implementing such codes in software. To f...
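The FEC idea the abstract describes can be illustrated with the simplest possible erasure code: one parity packet computed as the bytewise XOR of k equal-length data packets, which lets a receiver rebuild any single lost packet. This is a toy sketch, not the codes implemented in the paper:

```python
def xor_parity(packets):
    """Bytewise XOR of a list of equal-length packets."""
    out = bytearray(len(packets[0]))
    for pkt in packets:
        for i, b in enumerate(pkt):
            out[i] ^= b
    return bytes(out)

# Sender: k = 3 data packets plus one parity packet.
data = [b"pkt0", b"pkt1", b"pkt2"]
parity = xor_parity(data)

# Receiver: packet 1 was erased in transit; XORing the parity with the
# survivors cancels them out and leaves exactly the missing packet.
survivors = [data[0], None, data[2]]
rebuilt = xor_parity([parity] + [p for p in survivors if p is not None])
assert rebuilt == data[1]
```

Real FEC schemes use stronger codes (e.g., Reed-Solomon style erasure codes) to tolerate multiple losses, but the recovery principle is the same.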
Communication over fading channels with delay constraints
 IEEE Transactions on Information Theory
, 2002
"... We consider a user communicating over a fading channel with perfect channel state information. Data is assumed to arrive from some higher layer application and is stored in a buffer until it is transmitted. We study adapting the user's transmission rate and power based on the channel state informati ..."
Abstract

Cited by 168 (7 self)
We consider a user communicating over a fading channel with perfect channel state information. Data is assumed to arrive from some higher layer application and is stored in a buffer until it is transmitted. We study adapting the user's transmission rate and power based on the channel state information as well as the buffer occupancy; the objectives are to regulate both the long-term average transmission power and the average buffer delay incurred by the traffic. Two models for this situation are discussed; one corresponding to fixed-length/variable-rate codewords and one corresponding to variable-length codewords. The tradeoff between the average delay and the average transmission power required for reliable communication is analyzed. A dynamic programming formulation is given to find all Pareto optimal power/delay operating points. We then quantify the behavior of this tradeoff in the regime of asymptotically large delay. In this regime we characterize simple buffer control policies which exhibit optimal characteristics. Connections to the delay-limited capacity and the expected capacity of fading channels are also discussed.
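The dynamic-programming formulation mentioned above can be sketched on a toy model: a two-state channel, Bernoulli packet arrivals, and a stage cost that weighs transmit power against buffer occupancy (a delay proxy). Every parameter below is hypothetical, and discounted value iteration stands in for the paper's exact formulation:

```python
import itertools

B = 8                       # buffer capacity (packets)
P = {0: 1.0, 1: 4.0}        # hypothetical power to send one packet: 0 = good channel
p_arrival, p_good = 0.5, 0.5
beta, gamma = 0.5, 0.95     # delay weight (buffer holding cost) and discount factor

def step_cost(b, h, send):
    return P[h] * send + beta * b          # power plus backlog (delay) cost

def next_buffer(b, send, arrival):
    return min(B, b - send + arrival)

# Value iteration over states (buffer level, channel state).
V = {(b, h): 0.0 for b in range(B + 1) for h in (0, 1)}
for _ in range(500):
    V_new = {}
    for (b, h) in V:
        best = float("inf")
        for send in ((0, 1) if b > 0 else (0,)):
            exp = 0.0
            for arrival, h2 in itertools.product((0, 1), repeat=2):
                prob = (p_arrival if arrival else 1 - p_arrival) * \
                       (p_good if h2 == 0 else 1 - p_good)
                exp += prob * V[(next_buffer(b, send, arrival), h2)]
            best = min(best, step_cost(b, h, send) + gamma * exp)
        V_new[(b, h)] = best
    V = V_new
```

Sweeping beta traces out (approximate) Pareto points of the power/delay tradeoff; the induced policy transmits more aggressively when the channel is good and the buffer is long.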
Quantized Frame Expansions with Erasures
 Applied and Computational Harmonic Analysis
, 2001
"... This paper places frames in a new setting, where some of the elements are deleted. Since proper subsets of fi'ames are sometimes them selves frames, a quantized frame expansion can be a useful representation even when some transform coefficients are lost in transmission. This yields robustness ..."
Abstract

Cited by 137 (18 self)
This paper places frames in a new setting, where some of the elements are deleted. Since proper subsets of frames are sometimes themselves frames, a quantized frame expansion can be a useful representation even when some transform coefficients are lost in transmission. This yields robustness to losses in packet networks such as the Internet.
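A minimal instance of a frame expansion surviving an erasure uses the three-vector "Mercedes-Benz" tight frame in R^2, a standard toy example (not taken from the paper): any two of its vectors still form a basis, so losing one coefficient leaves the signal recoverable.

```python
import math

# Three unit vectors at 120-degree spacing: a tight frame for R^2.
frame = [(math.cos(2 * math.pi * k / 3), math.sin(2 * math.pi * k / 3))
         for k in range(3)]

def analyze(x):
    """Frame coefficients c_k = <x, f_k>."""
    return [fx * x[0] + fy * x[1] for fx, fy in frame]

def reconstruct_full(c):
    """Tight-frame reconstruction: x = (2/3) * sum_k c_k f_k."""
    sx = sum(ck * fx for ck, (fx, fy) in zip(c, frame))
    sy = sum(ck * fy for ck, (fx, fy) in zip(c, frame))
    return (2 / 3 * sx, 2 / 3 * sy)

def reconstruct_after_erasure(c, lost):
    """Solve the 2x2 system given by the two surviving frame vectors."""
    keep = [k for k in range(3) if k != lost]
    (a, b), (d, e) = frame[keep[0]], frame[keep[1]]
    c0, c1 = c[keep[0]], c[keep[1]]
    det = a * e - b * d
    return ((c0 * e - c1 * b) / det, (a * c1 - d * c0) / det)

x = (0.7, -0.2)
c = analyze(x)
x_hat = reconstruct_after_erasure(c, lost=1)   # coefficient 1 erased in transit
```

With quantized coefficients (the paper's setting), the erased-coefficient reconstruction is no longer exact, and the redundancy of the frame controls how much the error grows.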
Information Theory and Communication Networks: An Unconsummated Union
 IEEE Trans. Inform. Theory
, 1998
"... Information theory has not yet had a direct impact on networking, although there are similarities in concepts and methodologies that have consistently attracted the attention of researchers from both fields. In this paper, we review several topics that are related to communication networks and that ..."
Abstract

Cited by 133 (5 self)
Information theory has not yet had a direct impact on networking, although there are similarities in concepts and methodologies that have consistently attracted the attention of researchers from both fields. In this paper, we review several topics that are related to communication networks and that have an information-theoretic flavor, including multiaccess protocols, timing channels, effective bandwidth of bursty data sources, deterministic constraints on data streams, queueing theory, and switching networks.

Keywords: Communication networks, multiaccess, effective bandwidth, switching

I. INTRODUCTION
Information theory is the conscience of the theory of communication; it has defined the "playing field" within which communication systems can be studied and understood. It has provided the spawning grounds for the fields of coding, compression, encryption, detection, and modulation, and it has enabled the design and evaluation of systems whose performance is pushing the limits of wha...
Progressive Image Coding for Noisy Channels
 IEEE SIGNAL PROCESSING LETTERS
, 1997
"... We cascade an existing image coder with carefully chosen error control coding, and thus produce a progressive image compression scheme whose performance on a noisy channel is significantly better than that of previously known techniques. The main idea is to trade off the available transmission rate ..."
Abstract

Cited by 125 (9 self)
We cascade an existing image coder with carefully chosen error control coding, and thus produce a progressive image compression scheme whose performance on a noisy channel is significantly better than that of previously known techniques. The main idea is to trade off the available transmission rate between source coding and channel coding in an efficient manner. This coding system is easy to implement and has acceptably low complexity. Furthermore, effectively no degradation due to channel noise can be detected; instead, the penalty paid due to channel noise is a reduction in source coding resolution. Detailed numerical comparisons are given that can serve as benchmarks for comparisons with future encoding schemes. For example, for the 512 × 512 Lena image, at a transmission rate of 1 b/pixel, and for binary symmetric channels with bit error probabilities 10^-3, 10^-2, and 10^-1, the proposed system outperforms previously reported results by at least 2.6, 2.8, and 8.9 dB, respectively.
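The motivation for spending rate on channel coding can be seen with a back-of-the-envelope binary-symmetric-channel calculation; the block length and error-correction capability below are illustrative, not the paper's parameters:

```python
from math import comb

def p_prefix_clean(n_bits, p):
    """Probability an uncoded n-bit prefix crosses a BSC(p) with no bit errors."""
    return (1 - p) ** n_bits

def p_block_decodes(n, t, p):
    """Probability a length-n block protected by a t-error-correcting code decodes."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(t + 1))

p = 1e-2
print(p_prefix_clean(512, p))        # uncoded: a 512-bit prefix almost surely breaks
print(p_block_decodes(512, 12, p))   # coded: correcting up to 12 errors rescues it
```

A progressive bitstream is especially fragile (an early error corrupts everything after it), so diverting some of the bit budget from source resolution to channel protection pays off, which is the tradeoff the abstract describes.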
Low-density parity-check codes based on finite geometries: A rediscovery and new results
 IEEE Trans. Inform. Theory
, 2001
"... This paper presents a geometric approach to the construction of lowdensity paritycheck (LDPC) codes. Four classes of LDPC codes are constructed based on the lines and points of Euclidean and projective geometries over finite fields. Codes of these four classes have good minimum distances and thei ..."
Abstract

Cited by 121 (4 self)
This paper presents a geometric approach to the construction of low-density parity-check (LDPC) codes. Four classes of LDPC codes are constructed based on the lines and points of Euclidean and projective geometries over finite fields. Codes of these four classes have good minimum distances, and their Tanner graphs have girth 6. Finite-geometry LDPC codes can be decoded in various ways, ranging from low to high decoding complexity and from reasonably good to very good performance. They perform very well with iterative decoding. Furthermore, they can be put in either cyclic or quasi-cyclic form. Consequently, their encoding can be achieved in linear time and implemented with simple feedback shift registers. This advantage is not shared by other LDPC codes in general and is important in practice. Finite-geometry LDPC codes can be extended and shortened in various ways to obtain other good LDPC codes. Several techniques of extension and shortening are presented. Long extended finite-geometry LDPC codes have been constructed, and they achieve a performance only a few tenths of a decibel away from the Shannon theoretical limit with iterative decoding.
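A small instance of the line/point construction is the incidence matrix of the projective plane PG(2,2) (the Fano plane); this is a toy sketch of the structural property the paper exploits, since the actual codes use much larger geometries:

```python
# Fano plane: 7 points, 7 lines; the lines are cyclic shifts of the
# perfect difference set {0, 1, 3} mod 7. Rows of H are lines (checks),
# columns are points (variables).
base_line = {0, 1, 3}
H = [[1 if (p - shift) % 7 in base_line else 0 for p in range(7)]
     for shift in range(7)]

# Every line contains 3 points and every point lies on 3 lines,
# and any two distinct lines meet in exactly one point. Hence no two
# rows share two columns, so the Tanner graph has no 4-cycles (girth 6).
assert all(sum(row) == 3 for row in H)                               # row weight
assert all(sum(H[r][c] for r in range(7)) == 3 for c in range(7))    # column weight
for r1 in range(7):
    for r2 in range(r1 + 1, 7):
        assert sum(a & b for a, b in zip(H[r1], H[r2])) == 1
```

The circulant structure (each row a cyclic shift of the previous one) is exactly what makes these codes cyclic and thus encodable with feedback shift registers.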
Universal Space-Time Coding
 IEEE Trans. Inform. Theory
, 2003
"... A universal framework is developed for constructing fullrate and fulldiversity coherent spacetime codes for systems with arbitrary numbers of transmit and receive antennas. The proposed framework combines spacetime layering concepts with algebraic component codes optimized for singleinputsi ..."
Abstract

Cited by 112 (6 self)
A universal framework is developed for constructing full-rate and full-diversity coherent space-time codes for systems with arbitrary numbers of transmit and receive antennas. The proposed framework combines space-time layering concepts with algebraic component codes optimized for single-input single-output (SISO) channels. Each component code is assigned to a "thread" in the space-time matrix, giving it full access to the channel's spatial diversity in the absence of the other threads. Diophantine approximation theory is then used to make the different threads "transparent" to each other. Within this framework, a special class of signals which uses algebraic number-theoretic constellations as component codes is thoroughly investigated. The lattice structure of the proposed number-theoretic codes, along with their minimal delay, allows for polynomial-complexity maximum-likelihood (ML) decoding using algorithms from lattice theory. Combining the design framework with the Cayley transform makes it possible to construct full-diversity differential and noncoherent space-time codes. The proposed framework subsumes many of the existing codes in the literature, extends naturally to time-selective and frequency-selective channels, and allows for more flexibility in the tradeoff between power efficiency, bandwidth efficiency, and receiver complexity. Simulation results that demonstrate the significant gains offered by the proposed codes are presented in certain representative scenarios.
The Z_4-linearity of Kerdock, Preparata, Goethals, and related codes
, 2001
"... Certain notorious nonlinear binary codes contain more codewords than any known linear code. These include the codes constructed by NordstromRobinson, Kerdock, Preparata, Goethals, and DelsarteGoethals. It is shown here that all these codes can be very simply constructed as binary images under the ..."
Abstract

Cited by 108 (15 self)
Certain notorious nonlinear binary codes contain more codewords than any known linear code. These include the codes constructed by Nordstrom-Robinson, Kerdock, Preparata, Goethals, and Delsarte-Goethals. It is shown here that all these codes can be very simply constructed as binary images under the Gray map of linear codes over Z_4, the integers mod 4 (although this requires a slight modification of the Preparata and Goethals codes). The construction implies that all these binary codes are distance invariant. Duality in the Z_4 domain implies that the binary images have dual weight distributions. The Kerdock and 'Preparata' codes are duals over Z_4 (and the Nordstrom-Robinson code is self-dual), which explains why their weight distributions are dual to each other. The Kerdock and 'Preparata' codes are Z_4-analogues of first-order Reed-Muller and extended Hamming codes, respectively. All these codes are extended cyclic codes over Z_4, which greatly simplifies encoding and decoding. An algebraic hard-decision decoding algorithm is given for the 'Preparata' code and a Hadamard-transform soft-decision decoding algorithm for the Kerdock code. Binary first- and second-order Reed-Muller codes are also linear over Z_4, but extended Hamming codes of length n ≥ 32 and the ...
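The Gray map at the heart of the construction is small enough to write down directly. The sketch below checks its key property: it is an isometry from Z_4 with the Lee metric to binary strings with the Hamming metric, which is why the binary images are distance invariant.

```python
GRAY = {0: (0, 0), 1: (0, 1), 2: (1, 1), 3: (1, 0)}   # the Gray map on Z_4
LEE = {0: 0, 1: 1, 2: 2, 3: 1}                        # Lee weights of Z_4 symbols

def gray_image(word):
    """Binary image of a Z_4 word: concatenate the Gray map of each symbol."""
    return [bit for sym in word for bit in GRAY[sym]]

def lee_weight(word):
    return sum(LEE[sym] for sym in word)

# Lee weight over Z_4 equals Hamming weight of the binary image.
w = [0, 1, 2, 3, 2]
assert lee_weight(w) == sum(gray_image(w))
```

A Z_4-linear code of length n thus maps to a binary code of length 2n that is generally nonlinear, yet inherits its distance structure from the linear Z_4 code.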