## Constructing Finite-Context Sources From Fractal Representations of Symbolic Sequences (1998)

Citations: 6 (4 self)

### BibTeX

@MISC{Tino98constructingfinite-context,
  author = {Peter Tino and Georg Dorffner},
  title = {Constructing Finite-Context Sources From Fractal Representations of Symbolic Sequences},
  year = {1998}
}

### Abstract

We propose a novel approach to constructing predictive models of long, complex symbolic sequences. The models are constructed by first transforming the n-block structure of the training sequence into a spatial structure of points in a unit hypercube. The transformation between the symbolic and Euclidean spaces embodies a natural smoothness assumption (n-blocks with long common suffixes are likely to produce similar continuations): the longer the common suffix shared by any two n-blocks, the closer their point representations lie. Finding a set of prediction contexts is then formulated as a resource allocation problem, solved by vector quantizing the spatial representation of the training sequence's n-block structure. Our predictive models are similar in spirit to variable memory length Markov models (VLMMs). We compare the proposed models with both classical and variable memory length Markov models on two chaotic symbolic sequences with different levels of subsequence distribution ...
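The suffix-to-distance property in the abstract can be illustrated with a chaos-game-style encoding: starting from the hypercube centre, each symbol pulls the point a fixed fraction of the way toward that symbol's vertex, so later symbols (the suffix) dominate the final position. A minimal sketch follows; the contraction ratio of 1/2, the vertex assignment, and the `cgr_point` helper are illustrative assumptions, not the paper's exact construction.

```python
def cgr_point(block, vertices, k=0.5):
    """Map a symbol block to a point in the unit hypercube.

    Each symbol moves the point toward its assigned vertex with
    contraction ratio k (assumed 1/2 here). Because later symbols
    have larger influence, blocks sharing a long common suffix end
    up close together in Euclidean space.
    """
    dim = len(next(iter(vertices.values())))
    x = [0.5] * dim  # start at the hypercube centre
    for s in block:
        v = vertices[s]
        x = [k * xi + (1 - k) * vi for xi, vi in zip(x, v)]
    return tuple(x)

# Binary alphabet mapped to the endpoints of the unit interval
# (a 1-D "hypercube"); larger alphabets would use more vertices.
verts = {'a': (0.0,), 'b': (1.0,)}

p1 = cgr_point('aabab', verts)  # p1 = (0.640625,)
p2 = cgr_point('babab', verts)  # shares suffix 'abab' with p1
p3 = cgr_point('aabbb', verts)  # shares only suffix 'b' with p1

d_close = abs(p1[0] - p2[0])  # long common suffix -> small distance
d_far = abs(p1[0] - p3[0])    # short common suffix -> large distance
```

On this sketch, vector quantizing such points (e.g. with k-means) would then group n-blocks with similar suffixes into shared prediction contexts, which is the resource-allocation step the abstract describes.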