Results 1–10 of 26,104
Maximal codeword lengths in Huffman codes
Computers & Mathematics with Applications, 2000
FER Prediction with Variable Codeword Length
Abstract: Frame error rate (FER) prediction in wireless communication systems is an important tool with applications to system level simulations and link adaptation, among others. Although in realistic communication scenarios it is expected to have codewords of different lengths, previous work on FER predict…
Mean Codeword Lengths and Their Correspondence with Entropy Measures
Abstract: The objective of the present communication is to develop new genuine exponentiated mean codeword lengths and to study deeply the problem of correspondence between well-known measures of entropy and mean codeword lengths. With the help of some standard measures of entropy, we have illustrate…
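The correspondence between entropy and mean codeword length can be made concrete with a small sketch: for any source, the mean length of a binary Huffman code lies within one bit of the Shannon entropy. The `huffman_lengths` helper below is illustrative and not taken from the paper above.

```python
import heapq
import itertools
from math import log2

def huffman_lengths(probs):
    """Return codeword lengths of a binary Huffman code for the given pmf."""
    counter = itertools.count()  # tie-breaker so equal probabilities never compare lists
    heap = [(p, next(counter), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        for sym in a + b:  # every merge adds one bit to each member's codeword
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, next(counter), a + b))
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]
L = huffman_lengths(probs)
mean_len = sum(p * l for p, l in zip(probs, L))
entropy = -sum(p * log2(p) for p in probs)
# Shannon's bound for an optimal prefix code: H <= mean length < H + 1
print(mean_len, entropy)
```

For this pmf the mean length is 1.9 bits against an entropy of about 1.85 bits, inside the one-bit gap that the correspondence results in the paper refine.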
Universal Coding with Minimum Probability of Codeword Length Overflow
IEEE Trans. Information Theory, 1991
Cited by 23 (3 self)
Abstract: Lossless block-to-variable length source coding is studied for finite-state, finite-alphabet sources. We aim to minimize the probability that the normalized length of the codeword will exceed a given threshold B, subject to the Kraft inequality. It is shown that the Lempel-Ziv (LZ) algorit…
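The Kraft inequality that constrains the codeword lengths in this abstract is easy to check numerically. This minimal sketch (the function name is my own) tests whether a proposed set of lengths admits a prefix code:

```python
# The Kraft inequality: a prefix code with codeword lengths l_1, ..., l_n over
# an alphabet of size D exists if and only if sum(D ** -l_i) <= 1.
def satisfies_kraft(lengths, alphabet_size=2):
    """True if the given codeword lengths admit a prefix code."""
    return sum(alphabet_size ** -l for l in lengths) <= 1

print(satisfies_kraft([1, 2, 3, 3]))  # 0.5 + 0.25 + 0.125 + 0.125 = 1, so True
print(satisfies_kraft([1, 1, 2]))    # 0.5 + 0.5 + 0.25 > 1, so False
```

Any length-minimization over prefix codes, including the overflow-probability criterion studied in the paper, is an optimization subject to this constraint.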
Scheduling and Codeword Length Optimization in Time Varying Wireless Networks
2007
Abstract: In this paper, a downlink scenario in which a single-antenna base station communicates with K single-antenna users, over a time-correlated fading channel, is considered. It is assumed that channel state information is perfectly known at each receiver, while the statistical characteristics of the fading process and the fading gain at the beginning of each frame are known to the transmitter. By evaluating the random coding error exponent of the time-correlated fading channel, it is shown that there is an optimal codeword length which maximizes the throughput. The throughput of the conventional…
Asymptotic Properties on Codeword Lengths of an Optimal FV Code for General Sources
Cited by 2 (0 self)
© 2005 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be…
Near Shannon limit error-correcting coding and decoding
1993
Cited by 1738 (5 self)
Abstract: This paper deals with a new class of convolutional codes called Turbo-codes, whose performances in terms of Bit Error Rate (BER) are close to the SHANNON limit. The Turbo-Code encoder is built using a parallel concatenation of two Recursive Systematic Convolutional codes, and the associated decoder, using a feedback decoding rule, is implemented as P pipelined identical elementary decoders. Consider a binary rate R = 1/2 convolutional encoder with constraint length K and memory M = K − 1. The input to the encoder at time k is a bit dk and the corresponding codeword…
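For readers unfamiliar with the quantities K, M, and R in this abstract, here is a minimal sketch of a plain rate R = 1/2 convolutional encoder with K = 3 (so M = K − 1 = 2 memory bits). The (7, 5) octal generators are a common textbook choice, not necessarily the paper's, and the paper's constituent encoders are additionally recursive and systematic (they feed parity back into the shift register), which this sketch omits.

```python
# Plain (non-recursive, non-systematic) rate 1/2 convolutional encoder sketch.
# Each input bit dk produces two output bits, one per generator polynomial.
def conv_encode(bits, g1=0b111, g2=0b101, K=3):
    """Encode a bit sequence with generators g1, g2 (newest bit at the MSB)."""
    state = 0  # shift register holding the last M = K - 1 input bits
    out = []
    for d in bits:
        reg = (d << (K - 1)) | state               # current bit plus memory
        out.append(bin(reg & g1).count("1") % 2)   # parity from generator 1
        out.append(bin(reg & g2).count("1") % 2)   # parity from generator 2
        state = reg >> 1                           # shift the register
    return out

print(conv_encode([1, 0, 1, 1]))  # 4 input bits -> 8 coded bits (R = 1/2)
```

A turbo encoder, by contrast, runs two such (recursive systematic) encoders in parallel, the second on an interleaved copy of the input, and the iterative decoder exchanges soft information between them.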
Solving multiclass learning problems via error-correcting output codes
Journal of Artificial Intelligence Research, 1995
Cited by 730 (8 self)
Abstract: Multiclass learning problems involve finding a definition for an unknown function f(x) whose range is a discrete set containing k > 2 values (i.e., k "classes"). The definition is acquired by studying collections of training examples of the form ⟨xi, f(xi)⟩. Existing approaches to multiclass learning problems include direct application of multiclass algorithms such as the decision-tree algorithms C4.5 and CART, application of binary concept learning algorithms to learn individual binary functions for each of the k classes, and application of binary concept learning algorithms with distributed output representations. This paper compares these three approaches to a new technique in which error-correcting codes are employed as a distributed output representation. We show that these output representations improve the generalization performance of both C4.5 and backpropagation on a wide range of multiclass learning tasks. We also demonstrate that this approach is robust with respect to changes in the size of the training sample, the assignment of distributed representations to particular classes, and the application of overfitting avoidance techniques such as decision-tree pruning. Finally, we show that, like the other methods, the error-correcting code technique can provide reliable class probability estimates. Taken together, these results demonstrate that error-correcting output codes provide a general-purpose method for improving the performance of inductive learning programs on multiclass problems.
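The decoding side of the technique this abstract describes can be sketched in a few lines: each class is assigned a binary codeword, one binary classifier is trained per bit position, and a new example gets the class whose codeword is nearest in Hamming distance to the predicted bit vector. The 4-class, 5-bit code matrix below is illustrative, not taken from the paper.

```python
# Error-correcting output code (ECOC) decoding sketch. This code matrix has
# minimum pairwise Hamming distance 3, so any single erroneous bit prediction
# is corrected.
CODEWORDS = {
    "A": [0, 0, 0, 0, 0],
    "B": [0, 1, 1, 0, 1],
    "C": [1, 0, 1, 1, 0],
    "D": [1, 1, 0, 1, 1],
}

def ecoc_decode(predicted_bits):
    """Return the class whose codeword minimizes Hamming distance."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    return min(CODEWORDS, key=lambda c: hamming(CODEWORDS[c], predicted_bits))

# One flipped bit (position 0) is still decoded as class "B":
print(ecoc_decode([1, 1, 1, 0, 1]))  # -> B
```

The robustness results in the paper come from exactly this redundancy: a few wrong binary classifiers do not change the nearest codeword.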
Comments on Broadcast Channels
1998
Cited by 566 (4 self)
Abstract: The key ideas in the theory of broadcast channels are illustrated by discussing some of the progress toward finding the capacity region. The capacity region is still unknown. Index Terms: Binning, broadcast channel, capacity, degraded broadcast channel, feedback capacity, Slepian-Wolf, superposition.