Results 1-10 of 43
Guessing the Composer’s Mind: Applying Universal Prediction to Musical Style
Proceedings ICMC 99, Beijing, China, 1999
Cited by 39 (9 self)
Abstract:
In this paper, we present a dictionary based universal prediction algorithm that provides a very general and flexible approach to machine learning in the domain of musical style. Such operations as improvisation or assistance to composition can be realized on the resulting representations.
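The paper's own algorithm is not reproduced here, but the flavor of dictionary-based prediction can be sketched with an LZ78-style incremental parse; the function names and the toy melody below are illustrative, not taken from the paper:

```python
from collections import defaultdict

def lz78_contexts(seq):
    """Parse seq incrementally (LZ78-style): grow the current phrase while it
    stays in the dictionary; when a new phrase is created, record which symbol
    extended the known prefix."""
    phrases = {()}
    follow = defaultdict(lambda: defaultdict(int))
    current = ()
    for sym in seq:
        if current + (sym,) in phrases:
            current = current + (sym,)
        else:
            phrases.add(current + (sym,))
            follow[current][sym] += 1
            current = ()
    return follow

def predict(follow, context):
    """Predict the next symbol: back off to the longest suffix of the context
    that has recorded continuations, and normalize its counts."""
    for start in range(len(context) + 1):
        suffix = tuple(context[start:])
        if suffix in follow:
            counts = follow[suffix]
            total = sum(counts.values())
            return {sym: c / total for sym, c in counts.items()}
    return {}
```

On a repetitive motif such as "CDECDECDE", `predict(lz78_contexts("CDECDECDE"), ("C",))` returns `{"D": 1.0}`: the parse has learned that C is always followed by D, the kind of stylistic regularity an improvisation system can exploit.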
High Performance Compression of Visual Information - A Tutorial Review - Part I: Still Pictures
1999
Cited by 24 (0 self)
Abstract:
Digital images have become an important source of information in the modern world of communication systems. In their raw form, digital images require a tremendous amount of memory. Many research efforts have been devoted to the problem of image compression in the last two decades. Two different compression categories must be distinguished: lossless and lossy. Lossless compression is achieved if no distortion is introduced in the coded image. Applications requiring this type of compression include medical imaging and satellite photography. For applications such as videotelephony or multimedia applications some loss of information is usually tolerated in exchange for a high compression ratio.
Block-oriented compression techniques for large statistical databases
IEEE Transactions on Knowledge and Data Engineering, 1997
Cited by 21 (3 self)
Abstract:
Disk I/O has long been a performance bottleneck for very large databases. Database compression can be used to reduce disk I/O bandwidth requirements for large data transfers. In this paper, we explore the compression of large statistical databases and propose techniques for organizing the compressed data such that standard database operations such as retrievals, inserts, deletes, and modifications are supported. We examine the applicability and performance of three methods. Two of these are adaptations of existing methods, but the third, called Tuple Differential Coding (TDC) [16], is a new method that allows conventional access mechanisms to be used with the compressed data to provide efficient access. We demonstrate how the performance of queries that involve large data transfers can be improved with these database compression techniques. Index Terms: database compression, data compression, physical organization, statistical database.
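As a rough illustration of the tuple-differencing idea behind TDC (a sketch, not the paper's implementation), tuples can be mapped to integers in mixed-radix order over their attribute domains and stored as small successive differences:

```python
def tuple_number(t, domains):
    """Mixed-radix rank of a tuple over its attribute domain sizes."""
    n = 0
    for value, size in zip(t, domains):
        n = n * size + value
    return n

def tdc_encode(tuples, domains):
    """Sort tuples by rank; store the first rank plus successive differences,
    which are small for dense data and can be coded in few bits."""
    ranks = sorted(tuple_number(t, domains) for t in tuples)
    return [ranks[0]] + [b - a for a, b in zip(ranks, ranks[1:])]

def tdc_decode(codes, domains):
    """Undo the differencing, then invert the mixed-radix mapping."""
    out, rank = [], 0
    for c in codes:
        rank += c
        n, fields = rank, []
        for size in reversed(domains):
            fields.append(n % size)
            n //= size
        out.append(tuple(reversed(fields)))
    return out
```

Because the stored values are differences between sorted ranks rather than full tuples, ordered scans and range lookups can still work directly on the compressed form.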
MIRAGE+: A Kernel Implementation of Distributed Shared Memory on a Network of Personal Computers
Software Practice & Experience, 1994
Cited by 19 (9 self)
Abstract:
This paper addresses the architectural dependencies in the design of the system and evaluates the performance of the implementation. The new version, MIRAGE+, performs well compared to Mirage even though eight times the amount of data is sent on each page fault because of the larger page size used in the implementation. We show that the performance of systems with a large ratio of page size to network packet size can be dramatically improved on conventional hardware by applying three well-known techniques: packet blasting, compression, and running at interrupt level.
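Of the three techniques, compression before transfer is the easiest to sketch. The toy below is purely illustrative (zlib stands in for whichever compressor MIRAGE+ actually used): it compresses a page and splits the result into MTU-sized packets to be "blasted" back to back:

```python
import zlib

def send_page(page, mtu=1500):
    """Compress a DSM page, then split the result into at-most-MTU-sized
    packets to be sent back to back without per-packet acknowledgement."""
    payload = zlib.compress(page)
    return [payload[i:i + mtu] for i in range(0, len(payload), mtu)]
```

For a highly redundant 8 KB page this yields a single small packet instead of six full-size ones, which is exactly why a large page-to-packet ratio benefits from compression.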
On-Line Stochastic Processes in Data Compression
1996
Cited by 15 (6 self)
Abstract:
The ability to predict the future based upon the past in finite-alphabet sequences has many applications, including communications, data security, pattern recognition, and natural language processing. By Shannon's theory and the breakthrough development of arithmetic coding, any sequence $a_1 a_2 \cdots a_n$ can be encoded in a number of bits that is essentially equal to the minimal information-lossless code length, $\sum_i -\log_2 p(a_i \mid a_1 \cdots a_{i-1})$. The goal of universal on-line modeling, and therefore of universal data compression, is to deduce a model of the input sequence $a_1 a_2 \cdots a_n$ that can estimate each $p(a_i \mid a_1 \cdots a_{i-1})$ knowing only $a_1 \cdots a_{i-1}$, so that the ex...
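The code-length expression can be made concrete with a minimal sequential estimator; the add-one (Laplace) model below is chosen only for illustration, and an arithmetic coder driven by the same probabilities would approach this bit count:

```python
from collections import Counter
from math import log2

def adaptive_code_length(seq, alphabet):
    """Ideal code length sum_i -log2 p(a_i | a_1 ... a_{i-1}) under a
    sequentially updated add-one (Laplace) estimator: each symbol is
    predicted from the counts of everything seen before it."""
    counts = Counter()
    seen = 0
    bits = 0.0
    for sym in seq:
        p = (counts[sym] + 1) / (seen + len(alphabet))
        bits += -log2(p)
        counts[sym] += 1
        seen += 1
    return bits
```

On the binary alphabet {a, b}, the sequence "aaaa" costs (1/2)(2/3)(3/4)(4/5) in probability, i.e. log2(5) ≈ 2.32 bits, already well under the 4 bits a uniform model would need: the adaptive model learns the sequence's bias as it goes.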
Robust Universal Complete Codes for Transmission and Compression
Discrete Applied Mathematics, 1996
Cited by 10 (4 self)
Abstract:
Several measures are defined and investigated, which allow the comparison of codes as to their robustness against errors. Then new universal and complete sequences of variable-length codewords are proposed, based on representing the integers in a binary Fibonacci numeration system. Each sequence is constant and need not be generated for every probability distribution. These codes can be used as alternatives to Huffman codes when the optimal compression of the latter is not required, and simplicity, faster processing, and robustness are preferred. The codes are compared on several "real-life" examples.
1. Motivation and Introduction
Let $A = \{A_1, A_2, \ldots, A_n\}$ be a finite set of elements, called cleartext elements, to be encoded by a static uniquely decipherable (UD) code. For notational ease, we use the term `code' as an abbreviation for `set of codewords'; the corresponding encoding and decoding algorithms are always either given or clear from the context. A code i...
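The standard Fibonacci code construction (sketched here from the general literature, not lifted from the paper) gives each positive integer its Zeckendorf representation over F(2)=1, F(3)=2, F(4)=3, ..., followed by a terminating 1, so every codeword ends in the unique pattern "11":

```python
def fib_encode(n):
    """Fibonacci code of n >= 1: Zeckendorf bits (greedy choice of the
    largest Fibonacci number that fits), least-significant position first,
    followed by a terminating 1 so every codeword ends in '11'."""
    fibs = [1, 2]
    while fibs[-1] <= n:
        fibs.append(fibs[-1] + fibs[-2])
    bits, remaining = [], n
    for f in reversed(fibs):
        if f <= remaining:
            bits.append("1")
            remaining -= f
        elif bits:
            bits.append("0")
    return "".join(reversed(bits)) + "1"

def fib_decode(code):
    """Inverse of fib_encode; reads bits until the first '11' terminator."""
    fibs = [1, 2]
    n, prev = 0, "0"
    for i, bit in enumerate(code):
        if bit == "1" and prev == "1":
            return n
        while len(fibs) <= i:
            fibs.append(fibs[-1] + fibs[-2])
        if bit == "1":
            n += fibs[i]
        prev = bit
```

Because the "11" terminator is the only place two 1-bits can be adjacent, a single bit error corrupts at most a couple of neighboring codewords instead of derailing the whole stream, which is the robustness property the abstract highlights.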
A context-based adaptive arithmetic coding technique for lossless image compression
IEEE Signal Processing Letters, 1999
Cited by 9 (2 self)
Abstract:
Significant progress has recently been made in lossless image compression using discrete wavelet transforms. The overall performance of these schemes may be further improved by properly designing efficient entropy coders. In this letter, a new technique is introduced for the implementation of context-based adaptive arithmetic entropy coding. This technique is based on the prediction of the value of the current transform coefficient, using a weighted least squares method, in order to achieve appropriate context selection for arithmetic coding. Experimental results illustrate and evaluate the performance of the proposed technique. Index Terms: image coding, lossless coding.
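The context-selection step can be caricatured in a few lines; here a fixed weighted average of causal neighbors stands in for the letter's weighted-least-squares prediction, and the thresholds are invented for illustration only:

```python
def context_bin(neighbors, weights, thresholds=(2, 8, 32)):
    """Map the current coefficient's causal neighborhood to a context index:
    predict its value as a weighted average of already-coded neighbors, then
    bucket the predicted magnitude.  An adaptive arithmetic coder would keep
    one symbol-frequency table per bucket."""
    pred = sum(w * n for w, n in zip(weights, neighbors)) / sum(weights)
    magnitude = abs(pred)
    for index, threshold in enumerate(thresholds):
        if magnitude < threshold:
            return index
    return len(thresholds)
```

Coefficients predicted to be small land in low-activity contexts whose frequency tables sharpen quickly, which is where the entropy-coding gain comes from.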
Document Image Compression and Analysis
PhD thesis, University of Maryland, 1997
Cited by 8 (1 self)
Abstract:
Image compression usually considers the minimization of storage space as its main objective. It is desirable, however, to code images so that we have the ability to process the resulting representation directly. In this thesis we explore an approach to document image compression that is efficient in both space (storage requirement) and time (processing flexibility). A representation is presented in which component-level redundancy is removed by forming a prototype library and component location table. This representation forms a basis for compression and provides direct access to image components. To generate the prototype library, a new clustering approach is developed which is suitable for document image components. The distance metric is based on a character degradation model so that degraded versions of the same character will be grouped together. To achieve a lossless representation when required, the residuals are encoded efficiently using a structural distance ordering. OCR is...
Document Filtering as an Adaptive and Temporally-dependent Process
2001
Cited by 7 (4 self)
Abstract:
The filtering task has traditionally been defined as a special case of the information retrieval task, and undeniably, it can be performed by applying retrieval techniques. This theoretical study summarizes our experiences in viewing filtering as an adaptive and temporally-dependent process: one that, in contrast to traditional retrieval, takes into account the dynamic nature of relevance and its temporal aspects. We investigate the nature of user interests, formulate useful types of adaptivity, and discuss the effectiveness of those types in relation to user interests. To deal with drifts, we introduce the notion of the half life of documents. Furthermore, we discuss potential dangers for effectiveness such as selectivity traps. We pay special attention to practical efficiency issues by discussing term selection and incrementality.
1 Introduction
The digital and networking revolution over the last decade has made large amounts of digital information available. This tremendous ...
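The half-life notion lends itself to a simple exponential-decay sketch; the function names and the profile-update scheme below are illustrative, not the paper's formulation:

```python
def doc_weight(age, half_life):
    """Weight of a document that is `age` time units old: it loses half
    its influence every `half_life` units."""
    return 0.5 ** (age / half_life)

def profile_update(profile, doc_terms, elapsed, half_life):
    """Decay an adaptive interest profile by the time elapsed since the last
    update, then fold in the terms of a newly seen relevant document; old
    evidence thus fades as the user's interests drift."""
    decay = 0.5 ** (elapsed / half_life)
    for term in profile:
        profile[term] *= decay
    for term in doc_terms:
        profile[term] = profile.get(term, 0.0) + 1.0
    return profile
```

With a half-life of 30 days, a month-old judgment counts half as much as a fresh one and a two-month-old judgment a quarter as much, which is one way to make the profile track drifting interests without discarding history outright.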
Language Acquisition and Data Compression
Australasian Natural Language Processing Summer Workshop, 1997
Cited by 6 (2 self)
Abstract:
Statistical data compression requires a stochastic language model which must rapidly adapt to new data as it is encountered. A grammatical inference engine is introduced which satisfies this requirement; it is able to discover structure in arbitrary data using nothing more than the predictions of a simple trigram model. We show that compression may be used as an alternative to perplexity for language model evaluation, and that the information processing techniques employed by our system may reflect what happens in the human brain.
1 Introduction
Grammatical inference is the process of programming a computer to automatically infer a grammar for a language [8]. We consider a grammar to be nothing more than a model for some data. Applications such as speech recognition and data compression require a stochastic language model, and well-defined performance measures exist for such models. It is easy to get caught in the trap of building complicated models which utilise various ad hoc techni...
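Compression-based evaluation of a trigram model can be sketched as bits per symbol on held-out text; add-one smoothing is chosen here purely for simplicity and is not the paper's inference engine:

```python
from collections import Counter
from math import log2

def trigram_bits(train, test, alphabet):
    """Bits per symbol needed to encode `test` with an add-one-smoothed
    trigram model trained on `train`; lower means the model (and hence a
    compressor driven by it) captures the data's structure better."""
    tri = Counter(zip(train, train[1:], train[2:]))
    bi = Counter(zip(train, train[1:]))
    bits = 0.0
    triples = list(zip(test, test[1:], test[2:]))
    for a, b, c in triples:
        p = (tri[(a, b, c)] + 1) / (bi[(a, b)] + len(alphabet))
        bits += -log2(p)
    return bits / max(1, len(triples))
```

On alternating text such as "abab...", the measured cost falls well below the 1 bit per symbol a context-free binary model would need, the same signal a perplexity evaluation would give but expressed directly as achievable compression.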