Results 1–10 of 16
Data Compression
ACM Computing Surveys, 1987
Cited by 84 (3 self)
Abstract:
This paper surveys a variety of data compression methods spanning almost forty years of research, from the work of Shannon, Fano and Huffman in the late 1940s to a technique developed in 1986. The aim of data compression is to reduce redundancy in stored or communicated data, thus increasing effective data density. Data compression has important applications in the areas of file storage and distributed systems. Concepts from information theory, as they relate to the goals and evaluation of data compression methods, are discussed briefly. A framework for evaluation and comparison of methods is constructed and applied to the algorithms presented. Comparisons of both theoretical and empirical natures are reported and possibilities for future research are suggested. INTRODUCTION: Data compression is often referred to as coding, where coding is a very general term encompassing any special representation of data which satisfies a given need. Information theory is defined to be the study of eff...
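The Huffman method named in this abstract assigns shorter codewords to more frequent symbols. As a rough illustration only (a minimal sketch, not the survey's own presentation; the function name is hypothetical):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code for `text`: frequent symbols get shorter codes."""
    freq = Counter(text)
    # Heap entries: (frequency, tiebreaker, {symbol: code-so-far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    if len(heap) == 1:  # degenerate single-symbol input
        (_, _, table), = heap
        return {sym: "0" for sym in table}
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        # Prefix '0' onto one subtree's codes and '1' onto the other's.
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
# 'a' is the most frequent symbol, so its codeword is shortest.
assert all(len(codes["a"]) <= len(codes[s]) for s in codes)
```

The integer tiebreaker keeps the heap from ever comparing two dictionaries when frequencies are equal.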
Code density optimization for embedded DSP processors using data compression techniques
Proc. Chapel Hill Conf. Adv. Res. VLSI, 1995
A Relational Approach To Optimization Problems, 1996
Cited by 6 (0 self)
Abstract:
The main contribution of this thesis is a study of the dynamic programming and greedy strategies for solving combinatorial optimization problems. The study is carried out in the context of a calculus of relations, and generalises previous work by using a loop operator in the imperative programming style for generating feasible solutions, rather than the fold and unfold operators of the functional programming style. The relationship between fold operators and loop operators is explored, and it is shown how to convert from the former to the latter. This fresh approach provides additional insights into the relationship between dynamic programming and greedy algorithms, and helps to unify previously distinct approaches to solving combinatorial optimization problems. Some of the solutions discovered are new and solve problems which had previously proved difficult. The material is illustrated with a selection of problems and solutions that is a mixture of old and new. Another contribution is the invention of a new calculus, called the graph calculus, which is a useful tool for reasoning in the relational calculus and other non-relational calculi. The graph ...
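The thesis itself works in a calculus of relations, but the contrast it studies between dynamic programming and greedy strategies can be seen concretely in the classic coin-change problem, where the greedy strategy can fail while the dynamic program is exact (a minimal illustrative sketch, not from the thesis):

```python
def greedy_coins(coins, amount):
    """Greedy: repeatedly take as many of the largest coin as will fit."""
    count = 0
    for c in sorted(coins, reverse=True):
        take, amount = divmod(amount, c)
        count += take
    return count if amount == 0 else None

def dp_coins(coins, amount):
    """Dynamic programming: best[v] = fewest coins summing to v."""
    INF = float("inf")
    best = [0] + [INF] * amount
    for v in range(1, amount + 1):
        best[v] = min((best[v - c] + 1 for c in coins if c <= v), default=INF)
    return best[amount] if best[amount] < INF else None

# With denominations {1, 3, 4}, greedy is suboptimal for amount 6:
assert greedy_coins([1, 3, 4], 6) == 3   # 4 + 1 + 1
assert dp_coins([1, 3, 4], 6) == 2       # 3 + 3
```

Knowing when greedy coincides with the dynamic-programming optimum is exactly the kind of question the relational framework above is built to answer.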
Parsing with Suffix and Prefix Dictionaries
In IEEE Data Compression Conference, 1996
Cited by 2 (0 self)
Abstract:
We show that greedy left-to-right (right-to-left) parsing is optimal with respect to a suffix (prefix) dictionary. To exploit this observation, we show how to construct a static suffix dictionary that supports online, linear-time optimal parsing. From this we derive an adaptive online method that yields compression comparing favorably to LZW.
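The greedy parsing the abstract refers to can be sketched as follows. Note that the paper's optimality claim is specifically for suffix (prefix) dictionaries; the toy dictionary below carries no such guarantee, and the function name is hypothetical:

```python
def greedy_parse(text, dictionary):
    """Greedy left-to-right parse: at each position take the longest
    dictionary phrase that matches (single characters always match)."""
    max_len = max(len(p) for p in dictionary)
    phrases, i = [], 0
    while i < len(text):
        # Try the longest candidate phrase first.
        for L in range(min(max_len, len(text) - i), 0, -1):
            if text[i:i + L] in dictionary or L == 1:
                phrases.append(text[i:i + L])
                i += L
                break
    return phrases

d = {"a", "b", "ab", "ba", "aba"}
assert greedy_parse("ababa", d) == ["aba", "ba"]
```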
A Specialized Branching and Fathoming Technique for The Longest Common Subsequence Problem, 2006
Cited by 1 (0 self)
Abstract:
Given a set S = {S_1, ..., S_k} of finite strings, the k-longest common subsequence problem (k-LCSP) seeks a string L of maximum length such that L is a subsequence of each S_i for i = 1, ..., k. This paper presents a technique, specialized branching, that solves k-LCSP. Specialized branching combines the benefits of both dynamic programming and branch and bound to reduce the search space. For large k, this method is shown to be computationally superior to dynamic programming. Keywords: Longest common subsequence, Branch and bound, Dynamic programming.
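For context, the standard dynamic program that specialized branching is compared against fills a table with one entry per position tuple, so its size grows like n^k for k strings of length n; that is why it degrades for large k. The familiar two-string base case can be sketched as (a minimal sketch, not the paper's method):

```python
def lcs_length(x, y):
    """Classic O(|x|*|y|) dynamic program for the two-string LCS."""
    m, n = len(x), len(y)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1   # extend a common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # drop one symbol
    return dp[m][n]

assert lcs_length("ABCBDAB", "BDCABA") == 4  # e.g. "BCBA"
```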
Dictionary-Symbolwise Flexible Parsing, 2013
Abstract:
Linear-time optimal parsing algorithms are very rare in the dictionary-based branch of data compression theory. The most recent is the Flexible Parsing algorithm of Matias and Sahinalp, which works when the dictionary is prefix-closed and the encoding of dictionary pointers has a constant cost. We present the Dictionary-Symbolwise Flexible Parsing algorithm, which is optimal for prefix-closed dictionaries and any symbolwise compressor under some natural hypotheses. In the case of LZ78-like algorithms with variable costs and any (as usual, linear) symbolwise compressor, it can be implemented in linear time. In the case of LZ77-like dictionaries and any symbolwise compressor, it can be implemented in O(n log n) time. We further present some experimental results that show the effectiveness of the dictionary-symbolwise approach.
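The optimal-parsing problem the paper studies can be phrased as a shortest path over text positions, with one edge per dictionary-phrase occurrence. The sketch below is the straightforward baseline formulation of that idea, not the paper's linear-time algorithm; the names and the cost model are assumptions:

```python
def optimal_parse(text, dictionary, cost):
    """Minimum-cost parse of `text` into dictionary phrases, by a
    shortest-path DP over text positions (edges = phrase occurrences)."""
    n = len(text)
    INF = float("inf")
    best = [0.0] + [INF] * n        # best[i] = cheapest encoding of text[:i]
    choice = [None] * (n + 1)       # phrase that ends at position i
    for i in range(n):
        if best[i] == INF:
            continue
        for p in dictionary:
            j = i + len(p)
            if text.startswith(p, i) and best[i] + cost(p) < best[j]:
                best[j] = best[i] + cost(p)
                choice[j] = p
    # Walk the choices backwards to recover the parse.
    phrases, i = [], n
    while i > 0:
        phrases.append(choice[i])
        i -= len(choice[i])
    return phrases[::-1], best[n]

d = {"a", "b", "ab", "aba", "ba"}
parse, c = optimal_parse("ababa", d, cost=lambda p: 1)  # unit pointer cost
assert c == 2 and "".join(parse) == "ababa"
```

With variable pointer costs, as in the LZ78-like setting above, the same DP still applies; the paper's contribution is doing this work in linear time rather than by brute-force edge enumeration.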
A Large Neighborhood Search Heuristic for the Longest Common Subsequence Problem
Abstract:
Given a set S = {S_1, ..., S_k} of finite strings, the k-Longest Common Subsequence Problem (k-LCSP) seeks a string L* of maximum length such that L* is a subsequence of each S_i for i = 1, ..., k. This paper presents a large neighborhood search technique that provides quality solutions to large k-LCSP instances. The heuristic runs in time linear in both the length of the sequences and the number of sequences. Some computational results are provided.