Results 1-9 of 9
Data Compression
ACM Computing Surveys, 1987
Abstract

Cited by 87 (3 self)
This paper surveys a variety of data compression methods spanning almost forty years of research, from the work of Shannon, Fano and Huffman in the late 1940s to a technique developed in 1986. The aim of data compression is to reduce redundancy in stored or communicated data, thus increasing effective data density. Data compression has important application in the areas of file storage and distributed systems. Concepts from information theory, as they relate to the goals and evaluation of data compression methods, are discussed briefly. A framework for evaluation and comparison of methods is constructed and applied to the algorithms presented. Comparisons of both theoretical and empirical natures are reported and possibilities for future research are suggested.

INTRODUCTION

Data compression is often referred to as coding, where coding is a very general term encompassing any special representation of data which satisfies a given need. Information theory is defined to be the study of eff...
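The Huffman construction the survey covers can be sketched in a few lines of Python. This is a minimal illustration, not code from the survey; the `huffman_codes` helper and the sample string are my own.

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table for the symbols in `text`."""
    freq = Counter(text)
    # Each heap entry is (frequency, tie-breaker, {symbol: code}); merging
    # two nodes prepends one bit to every code in each subtree.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate one-symbol input
        return {s: "0" for s in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
encoded = "".join(codes[s] for s in "abracadabra")
```

More frequent symbols receive shorter codes, and the resulting code is prefix-free, so the bit stream decodes unambiguously.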
Code Density Optimization for Embedded DSP Processors Using Data Compression Techniques
Proceedings of the 15th Conference on Advanced Research in VLSI, 1995
Abstract

Cited by 58 (3 self)
We address the problem of code size minimization in VLSI systems with embedded DSP processors. Reducing code size reduces the production cost of embedded systems. We use data compression methods to develop code size minimization strategies. We present a framework for code size minimization where the compressed data consists of a dictionary and a skeleton. The dictionary can be computed using popular text compression algorithms. We describe two methods to execute the compressed code that have varying performance characteristics and varying degrees of freedom in compressing the code. Experimental results obtained with a TMS320C25 code generator are presented.

1. Introduction

An increasingly common microarchitecture for embedded systems is to integrate a microprocessor or microcontroller, a ROM and an ASIC all on a single integrated circuit (Figure 1). Such a microarchitecture can currently be found in such diverse embedded systems as FAX modems, laser printers and cellular telephones....
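The dictionary-plus-skeleton split can be illustrated with a toy Python sketch. The pair-merging heuristic and the `D0`-style references below are illustrative assumptions, not the paper's actual TMS320C25 scheme.

```python
from collections import Counter

def compress(program, max_entries=4):
    """Greedy dictionary/skeleton split: repeatedly move the most frequent
    adjacent instruction pair into the dictionary and replace its
    occurrences in the skeleton with a reference ('D0', 'D1', ...)."""
    skeleton = list(program)
    dictionary = []
    for _ in range(max_entries):
        pairs = Counter(zip(skeleton, skeleton[1:]))
        if not pairs:
            break
        (a, b), count = pairs.most_common(1)[0]
        if count < 2:          # no pair repeats: nothing left to factor out
            break
        ref = f"D{len(dictionary)}"
        dictionary.append((a, b))
        out, i = [], 0
        while i < len(skeleton):
            if i + 1 < len(skeleton) and (skeleton[i], skeleton[i + 1]) == (a, b):
                out.append(ref)
                i += 2
            else:
                out.append(skeleton[i])
                i += 1
        skeleton = out
    return dictionary, skeleton

def expand(dictionary, skeleton):
    """Reverse the split; dictionary entries may reference earlier entries."""
    out = []
    for tok in skeleton:
        if tok.startswith("D") and tok[1:].isdigit():   # toy reference format
            a, b = dictionary[int(tok[1:])]
            out.extend(expand(dictionary, [a, b]))
        else:
            out.append(tok)
    return out

program = ["LOAD A", "ADD B", "STORE C",
           "LOAD A", "ADD B", "STORE C",
           "LOAD A", "ADD B"]
dictionary, skeleton = compress(program)
assert expand(dictionary, skeleton) == program
```

In the real setting the dictionary lives in ROM and the skeleton's references are decoded at fetch time; the trade-off between the two execution methods the paper describes is outside this sketch.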
A Relational Approach To Optimization Problems
1996
Abstract

Cited by 6 (0 self)
The main contribution of this thesis is a study of the dynamic programming and greedy strategies for solving combinatorial optimization problems. The study is carried out in the context of a calculus of relations, and generalises previous work by using a loop operator in the imperative programming style for generating feasible solutions, rather than the fold and unfold operators of the functional programming style. The relationship between fold operators and loop operators is explored, and it is shown how to convert from the former to the latter. This fresh approach provides additional insights into the relationship between dynamic programming and greedy algorithms, and helps to unify previously distinct approaches to solving combinatorial optimization problems. Some of the solutions discovered are new and solve problems which had previously proved difficult. The material is illustrated with a selection of problems and solutions that is a mixture of old and new. Another contribution is the invention of a new calculus, called the graph calculus, which is a useful tool for reasoning in the relational calculus and other non-relational calculi. The graph ...
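The contrast between the two strategies the thesis studies shows up concretely in the classic coin-change problem, where the greedy strategy is only valid for some coin systems. The example is mine, not from the thesis.

```python
def min_coins_dp(amount, coins):
    """Dynamic programming: optimal for any coin system."""
    INF = float("inf")
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and best[a - c] + 1 < best[a]:
                best[a] = best[a - c] + 1
    return best[amount]

def min_coins_greedy(amount, coins):
    """Greedy: always take the largest coin that fits."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c
        amount %= c
    return count if amount == 0 else None

# With coins {1, 3, 4} and amount 6, greedy picks 4+1+1 (3 coins)
# while dynamic programming finds 3+3 (2 coins).
```

Characterising exactly when the greedy answer coincides with the optimal one is the kind of question a relational calculus of these strategies is meant to settle.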
A Specialized Branching and Fathoming Technique for the Longest Common Subsequence Problem
2006
Abstract

Cited by 1 (0 self)
Given a set S = {S_1, ..., S_k} of finite strings, the k-longest common subsequence problem (k-LCSP) seeks a string L of maximum length such that L is a subsequence of each S_i for i = 1, ..., k. This paper presents a technique, specialized branching, that solves k-LCSP. Specialized branching combines the benefits of both dynamic programming and branch and bound to reduce the search space. For large k, this method is shown to be computationally superior to dynamic programming.

Keywords: Longest common subsequence, Branch and bound, Dynamic programming
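For comparison with the specialized-branching approach, the standard dynamic program for the k = 2 case looks as follows. This is a textbook sketch, not the paper's algorithm; the table grows as the product of the string lengths, which is what makes plain DP impractical for large k.

```python
def lcs(strings):
    """Longest common subsequence of two strings by dynamic programming.
    Cell L[i][j] holds an LCS of the prefixes a[:i] and b[:j]."""
    a, b = strings
    m, n = len(a), len(b)
    L = [[""] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                L[i][j] = L[i - 1][j - 1] + a[i - 1]
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1], key=len)
    return L[m][n]
```

For k strings the same recurrence needs a k-dimensional table, so the search space the branching technique prunes is exponential in k.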
SASE: Implementation of a Compressed Text Search Engine
In Usenix Symposium on Internet Technologies and Systems, 1997
Abstract
Keyword-based search engines are the basic building block of text retrieval systems. Higher-level systems like content-sensitive search engines and knowledge-based systems still rely on keyword search as the underlying text retrieval mechanism. With the explosive growth in content, Internet and intranet information repositories require efficient mechanisms to store as well as index data. In this paper we discuss the implementation of the Shrink and Search Engine (SASE) framework, which unites text compression and indexing to maximize keyword search performance while reducing storage cost. SASE features the novel capability of being able to directly search through compressed text without explicit decompression. The implementation includes a search server architecture, which can be accessed from a Java front-end to perform keyword search on the Internet. The performance results show that the compression efficiency of SASE is within 7-17% of GZIP, one of the best lossless compression schemes...
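The core idea of searching compressed text without decompressing it can be sketched with a word-level dictionary coder: encode the pattern once, then compare codewords instead of characters. The fixed integer codewords here are an illustrative simplification of SASE's actual scheme.

```python
def build_dictionary(words):
    """Assign each distinct word a fixed integer codeword."""
    codes = {}
    for w in words:
        codes.setdefault(w, len(codes))
    return codes

def compress(words, codes):
    """Replace each word by its codeword."""
    return [codes[w] for w in words]

def search_compressed(pattern, compressed, codes):
    """Search directly in the compressed stream: the pattern is encoded
    once, and matching compares small integers rather than strings."""
    if any(w not in codes for w in pattern):
        return []          # a word absent from the dictionary never occurs
    p = [codes[w] for w in pattern]
    return [i for i in range(len(compressed) - len(p) + 1)
            if compressed[i:i + len(p)] == p]
```

Because matching happens on the compressed representation, less data is scanned per query than with decompress-then-search, which is the performance argument the paper develops.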
Code Generation and Optimization for Embedded Digital Signal Processors
1996
Abstract
The advent of deep submicron processing technology has made it possible and desirable to integrate a processor core, a program ROM, and application-specific circuitry all on a single IC. As the complexity of embedded software grows, high-level languages such as C and C++ are increasingly employed in writing embedded software. Consequently, high-level language compilers have become an essential tool in the development of embedded systems. Fixed-point digital signal processors are among the most commonly embedded cores, due to their favorable performance-cost characteristics. However, these architectures are usually designed and optimized for their application domain, and pose challenges for compiler technology. Traditional compiler optimizations, though necessary, are insufficient for generating efficient and compact code. Therefore, new optimizations are required to produce code of the highest quality in a reasonable amount of time. In this thesis the author presents techniques for co...
Dictionary-Symbolwise Flexible Parsing
2013
Abstract
Linear-time optimal parsing algorithms are very rare in the dictionary-based branch of data compression theory. The most recent is the Flexible Parsing algorithm of Matias and Sahinalp, which works when the dictionary is prefix-closed and the encoding of dictionary pointers has a constant cost. We present the Dictionary-Symbolwise Flexible Parsing algorithm, which is optimal for prefix-closed dictionaries and any symbolwise compressor under some natural hypotheses. In the case of LZ78-alike algorithms with variable costs and any, linear as usual, symbolwise compressor it can be implemented in linear time. In the case of LZ77-alike dictionaries and any symbolwise compressor it can be implemented in O(n log(n)) time. We further present some experimental results that show the effectiveness of the dictionary-symbolwise approach.
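The parsing problem the paper addresses can be illustrated in the constant-cost setting: a shortest-path dynamic program finds a minimum-phrase parse over a prefix-closed dictionary, while greedy longest-match can be suboptimal. This is a toy model of optimal parsing, not the paper's algorithm.

```python
def optimal_parse(text, dictionary):
    """Minimum-phrase parse via shortest-path DP; every dictionary
    pointer costs 1, as in the constant-cost flexible-parsing setting."""
    n = len(text)
    INF = float("inf")
    cost = [0] + [INF] * n
    back = [0] * (n + 1)
    for i in range(n):
        if cost[i] == INF:
            continue
        for phrase in dictionary:
            j = i + len(phrase)
            if text.startswith(phrase, i) and cost[i] + 1 < cost[j]:
                cost[j] = cost[i] + 1
                back[j] = i
    if cost[n] == INF:
        return None
    parse, j = [], n          # recover the parse from the back pointers
    while j > 0:
        parse.append(text[back[j]:j])
        j = back[j]
    return parse[::-1]

def greedy_parse(text, dictionary):
    """Longest-match greedy parse, which can use more phrases."""
    parse, i = [], 0
    while i < len(text):
        match = max((p for p in dictionary if text.startswith(p, i)),
                    key=len, default=None)
        if match is None:
            return None
        parse.append(match)
        i += len(match)
    return parse
```

Over the prefix-closed dictionary {a, ab, abc, c, cd, cde, d, e}, the text "abcde" is parsed greedily as abc|d|e (three pointers) but optimally as ab|cde (two), which is the gap that flexible parsing closes in linear time.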