Results 1–10 of 18
Periods in strings
Journal of Combinatorial Theory, Series A, 1981
"... A survey is presented of some methods and results on counting words that satisfy various restrictions on subwords (i.e., blocks of consecutive symbols). Various applications to commafree codes, games, pattern matching, and other subjects are indicated. The emphasis is on the unified treatment of th ..."
Abstract

Cited by 78 (0 self)
 Add to MetaCart
A survey is presented of some methods and results on counting words that satisfy various restrictions on subwords (i.e., blocks of consecutive symbols). Various applications to comma-free codes, games, pattern matching, and other subjects are indicated. The emphasis is on the unified treatment of those topics through the use of generating functions.
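The counting problems surveyed here can be illustrated with a small sketch of our own (not taken from the paper): a dynamic program over Knuth–Morris–Pratt prefix-match states that counts binary words avoiding a given subword.

```python
# Hypothetical illustration: count length-n binary words that avoid a given
# subword, using a DP over KMP-style prefix-match states.

def count_avoiding(pattern: str, n: int, alphabet: str = "01") -> int:
    m = len(pattern)

    def step(q: int, c: str) -> int:
        # Length of the longest prefix of `pattern` that is a suffix of
        # pattern[:q] + c (the automaton transition).
        s = pattern[:q] + c
        for k in range(len(s), -1, -1):
            if s[len(s) - k:] == pattern[:k]:
                return k
        return 0

    # counts[q] = number of words so far whose longest suffix matching a
    # prefix of `pattern` has length q; state m (a full match) is forbidden.
    counts = [0] * m
    counts[0] = 1
    for _ in range(n):
        nxt = [0] * m
        for q, c_q in enumerate(counts):
            if c_q:
                for c in alphabet:
                    k = step(q, c)
                    if k < m:          # drop words that contain the pattern
                        nxt[k] += c_q
        counts = nxt
    return sum(counts)
```

For example, the binary words avoiding "11" are counted by the Fibonacci numbers, one of the classical results the survey treats with generating functions.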
Robust Universal Complete Codes for Transmission and Compression
Discrete Applied Mathematics, 1996
"... Several measures are defined and investigated, which allow the comparison of codes as to their robustness against errors. Then new universal and complete sequences of variablelength codewords are proposed, based on representing the integers in a binary Fibonacci numeration system. Each sequence is ..."
Abstract

Cited by 10 (4 self)
 Add to MetaCart
Several measures are defined and investigated, which allow the comparison of codes as to their robustness against errors. Then new universal and complete sequences of variable-length codewords are proposed, based on representing the integers in a binary Fibonacci numeration system. Each sequence is constant and need not be generated for every probability distribution. These codes can be used as alternatives to Huffman codes when the optimal compression of the latter is not required, and simplicity, faster processing and robustness are preferred. The codes are compared on several "real-life" examples. 1. Motivation and Introduction Let A = {A_1, A_2, ..., A_n} be a finite set of elements, called cleartext elements, to be encoded by a static uniquely decipherable (UD) code. For notational ease, we use the term `code' as abbreviation for `set of codewords'; the corresponding encoding and decoding algorithms are always either given or clear from the context. A code i...
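As a sketch of the underlying idea (ours, not necessarily the paper's exact construction): write each integer in its Zeckendorf representation over the Fibonacci numbers 1, 2, 3, 5, 8, ..., emit the digits least-significant first, and append a final '1'. Since a Zeckendorf representation never contains two adjacent 1s, every codeword ends in the unique pair "11", which makes the code uniquely decipherable and limits error propagation.

```python
# Hypothetical sketch of a Fibonacci (Zeckendorf-based) code.

def fib_encode(n: int) -> str:
    assert n >= 1
    fibs = [1, 2]                      # Fibonacci numbers used as digits
    while fibs[-1] < n:
        fibs.append(fibs[-1] + fibs[-2])
    bits = []
    for f in reversed(fibs):           # greedy Zeckendorf decomposition
        if f <= n:
            bits.append("1")
            n -= f
        else:
            bits.append("0")
    # Least-significant Fibonacci digit first, then the terminating '1'.
    return "".join(bits).lstrip("0")[::-1] + "1"

def fib_decode(code: str) -> int:
    bits = code[:-1]                   # strip the terminating '1'
    fibs = [1, 2]
    while len(fibs) < len(bits):
        fibs.append(fibs[-1] + fibs[-2])
    return sum(f for b, f in zip(bits, fibs) if b == "1")
```

The first codewords are 11, 011, 0011, 1011, 00011, ... Because "11" occurs only at codeword boundaries, a single bit error corrupts at most a bounded number of neighbouring codewords, which is the robustness property the paper measures.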
On the Construction of Statistically Synchronizable Codes
1992
"... We consider the problem of constructing statistically synchronizable codes over arbitrary alphabets and for any finite source. We show how to efficiently construct a statistically synchronizable code whose average codeword length is whithin the least likely codeword probability from that of the H ..."
Abstract

Cited by 10 (0 self)
 Add to MetaCart
We consider the problem of constructing statistically synchronizable codes over arbitrary alphabets and for any finite source. We show how to efficiently construct a statistically synchronizable code whose average codeword length is within the least likely codeword probability from that of the Huffman code for the same source. Moreover, we give a method for constructing codes having a synchronizing codeword. The codes we construct present high synchronizing capability and low redundancy. Part of this work was done while visiting IBM T. J. Watson Research Center, P.O. Box 218, Yorktown Heights, New York, 10598. This work was partially supported by the Italian Ministry of the University and Scientific Research, within the framework of the Project: Progetto ed Analisi di Algoritmi. Part of this work has been presented at the 1990 IEEE International Symposium on Information Theory, San Diego, CA, Jan. 1990. 1. Introduction A basic problem in information transmission is to mai...
Bidirectional Huffman Coding
1989
"... Under what conditions can Huffman codes be efficiently decoded in both directions? The usual decoding procedure works also for backward decoding only if the code has the affix property, i.e., both prefix and suffix properties. Some affix Huffman codes are exhibited, and necessary conditions for the ..."
Abstract

Cited by 10 (2 self)
 Add to MetaCart
Under what conditions can Huffman codes be efficiently decoded in both directions? The usual decoding procedure works also for backward decoding only if the code has the affix property, i.e., both the prefix and suffix properties. Some affix Huffman codes are exhibited, and necessary conditions for the existence of such codes are given. An algorithm is presented which, for a given set of codeword lengths, constructs an affix code, if one exists. Since for many distributions there is no affix code giving the same compression as the Huffman code, a new algorithm for backward decoding of non-affix Huffman codes is presented, and its worst case complexity is proved to be linear in the length of the encoded text. 1. Introduction For a given sequence of n weights w_1, ..., w_n, with w_i > 0, Huffman's well-known algorithm [9] constructs an optimum prefix code. We use throughout the term `code' as abbreviation for `set of codewords'. In a prefix code no codeword is the prefix of any o...
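The affix property itself is straightforward to test: a code is affix if it is both prefix-free and suffix-free, and suffix-freeness is just prefix-freeness of the reversed codewords. A minimal sketch (ours, with hypothetical example codes):

```python
# Hypothetical sketch: testing the affix property of a set of codewords.

def is_prefix_free(words) -> bool:
    ws = set(words)
    return not any(a != b and b.startswith(a) for a in ws for b in ws)

def is_affix(words) -> bool:
    # Affix = prefix-free AND suffix-free; the suffix-free check is a
    # prefix-free check on the reversed codewords.
    return is_prefix_free(words) and is_prefix_free([w[::-1] for w in words])
```

Every fixed-length code is trivially affix, while the Huffman code {0, 10, 11} is prefix-free but not suffix-free (0 is a suffix of 10), so the usual decoding procedure cannot be run backwards on it.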
Combinatorial Construction of High Rate Run-Length-Limited Codes
IEEE GLOBAL TELECOMMUN. CONF. GLOBECOM’96, 1996
"... New combinatorial construction techniques are proposed which convert binary user information into a (0, k) constrained sequence having the virtue that at most k 'zeroes' between logical 'ones' will occur. In this way sequences are constructed which have a limited runlength. These codes find applicat ..."
Abstract

Cited by 7 (4 self)
 Add to MetaCart
New combinatorial construction techniques are proposed which convert binary user information into a (0, k) constrained sequence having the virtue that at most k 'zeroes' between logical 'ones' will occur. In this way sequences are constructed which have a limited run-length. These codes find application in optical and magnetic recording systems. The new construction methods provide efficient, high rate codes with low complexity. The low-complexity combinatorial structure of the encoder and the decoder ensures a very fast and efficient parallel conversion of binary information to code words and vice versa. Specifically, we present the combinatorial structures to convert 16 data bits into a 17 bit constrained sequence to obtain an optimum (0,4) code, a (0,6) code with at most one byte error propagation, and a (0,6/6) code, respectively. Serious error propagation is avoided by using constrained codes with several unconstrained positions, which are reserved to store the parity bits of an error control code which protects the constrained code word.
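The (0, k) constraint itself is simple to state and check. For contrast with the high-rate combinatorial constructions of the paper, here is a naive bit-stuffing encoder (a standard textbook technique, not the authors' method) that enforces the constraint at a lower rate:

```python
# Hypothetical sketch: naive bit stuffing for a (0, k) run-length constraint.

def satisfies_0k(seq: str, k: int) -> bool:
    # At most k consecutive '0's anywhere in the sequence.
    return "0" * (k + 1) not in seq

def stuff(bits: str, k: int) -> str:
    # Insert a '1' after every run of k '0's.
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == "0" else 0
        if run == k:
            out.append("1")
            run = 0
    return "".join(out)

def destuff(code: str, k: int) -> str:
    # Inverse of stuff(): drop the '1' inserted after each run of k '0's.
    out, run, i = [], 0, 0
    while i < len(code):
        b = code[i]
        out.append(b)
        run = run + 1 if b == "0" else 0
        i += 1
        if run == k:
            i += 1                     # skip the stuffed '1'
            run = 0
    return "".join(out)
```

Bit stuffing can cost one extra bit per k data bits; the 16-to-17-bit mappings described above instead achieve a fixed rate of 16/17.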
Almost All Complete Binary Prefix Codes Have a Self-Synchronizing String
2002
"... The probability that a complete binary prefix code has a selfsynchronizing string approaches one, as the number of codewords tends to infinity. ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
The probability that a complete binary prefix code has a self-synchronizing string approaches one as the number of codewords tends to infinity.
Combinatorial problems motivated by comma-free codes
J. Combin. Des., 2004
"... Abstract In the paper some combinatorial problems motivated by commafree codes are considered. We describe these problems, give the most significant known results and methods used, present some new results and formulate open problems. ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
In this paper some combinatorial problems motivated by comma-free codes are considered. We describe these problems, give the most significant known results and methods used, present some new results, and formulate open problems.
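For fixed codeword length the defining property can be tested exhaustively. A small sketch of our own: a set C of words of length n is comma-free if no word of C appears in a window straddling the boundary of any concatenation of two codewords.

```python
# Hypothetical sketch: brute-force comma-freeness test for a fixed-length code.

def is_comma_free(code) -> bool:
    cs = set(code)
    n = len(next(iter(cs)))
    assert all(len(w) == n for w in cs), "fixed-length code expected"
    for u in cs:
        for v in cs:
            uv = u + v
            # Check every window that overlaps the boundary between u and v.
            for i in range(1, n):
                if uv[i:i + n] in cs:
                    return False
    return True
```

For example, {0011, 0111} is comma-free, while any code containing the periodic word 0101 is not, since 0101 reappears inside the concatenation 0101·0101 at offset 2.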
Data Synchronization with Timing
IEEE TRANS. INFORM. THEORY, 1999
"... This paper proposes and analyzes data synchronization techniques that not only resynchronize after encoded bits are corrupted by insertion, deletion or substitution errors, but also produce estimates of the time indices of the decoded data symbols, in order to determine their positions in the origin ..."
Abstract

Cited by 3 (1 self)
 Add to MetaCart
This paper proposes and analyzes data synchronization techniques that not only resynchronize after encoded bits are corrupted by insertion, deletion or substitution errors, but also produce estimates of the time indices of the decoded data symbols, in order to determine their positions in the original source sequence. The techniques are based on block codes, and the estimates are of the time indices modulo some integer T, called the timing span, which is desired to be large. Several types of block codes that encode binary data are analyzed on the basis of the maximum attainable timing span for a given coding rate R (or, equivalently, redundancy ρ = 1 − R) and permissible resynchronization delay D. It is found that relatively simple codes can asymptotically attain the maximum timing span among such block codes, which grows exponentially with delay, with exponent D(1 − R) + o(D). Thus large timing span can be attained with little redundancy and only moderate values of delay.
Codes for Data Synchronization and Timing
in Proceedings of the 1999 IEEE Information Theory and Communications Workshop, IEEE, 1999
"... This paper proposes and analyzes data synchronization techniques that not only resynchronize after encoded bits are corrupted by insertions, deletions or substitution errors, but also produce estimates of the time indices of the decoded data. I. INTRODUCTION Synchronization is an important aspect ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
This paper proposes and analyzes data synchronization techniques that not only resynchronize after encoded bits are corrupted by insertions, deletions or substitution errors, but also produce estimates of the time indices of the decoded data. I. INTRODUCTION Synchronization is an important aspect of any data compression system that operates in an environment where decoding may begin at an arbitrary point in the compressed stream of bits, or where the compressed bits may be corrupted by insertions, deletions or substitution errors. Though synchronization methods have been much developed (cf. [18]), there is no theory for the design and analysis of codes that, in addition to permitting the decoder to synchronize with the encoder, enable it to produce estimates of the time indices of the data it decodes. This paper describes an approach to the development of such a theory. As a motivating example, suppose the following infinite sequence of temperatures: 43 64 27 54 36 42 ...
Codes for Data Synchronization with Timing
DCC '99, 1999
"... This paper investigates the design and analysis of data synchronization codes whose decoders have the property that, in addition to reestablishing correct decoding after encoded data is lost or afflicted with errors, they produce the original time index of each decoded data symbol modulo some int ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
This paper investigates the design and analysis of data synchronization codes whose decoders have the property that, in addition to reestablishing correct decoding after encoded data is lost or afflicted with errors, they produce the original time index of each decoded data symbol modulo some integer T. The motivation for such data synchronization with timing is that in many situations where data must be encoded, it is not sufficient for the decoder to present a sequence of correct data symbols. Instead, the user also needs to know the position in the original source sequence of the symbols being presented. With this goal in mind, periodic prefix-synchronized (PPS) codes are introduced and analyzed on the basis of their synchronization delay D, rate R, and timing span T. Introduced are two specific PPS designs called natural marker and cascaded codes. A principal result is that when coding binary data with rate R, the largest possible timing span attainable with PPS codes...