Results 1 - 6 of 6
Spatial Representation of Symbolic Sequences through Iterative Function Systems
IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, 1998
Abstract

Cited by 23 (10 self)
Jeffrey proposed a graphic representation of DNA sequences using Barnsley's iterative function systems. In spite of further developments in this direction, the proposed graphic representation of DNA sequences has been lacking a rigorous connection between its spatial scaling characteristics and the statistical characteristics of the DNA sequences themselves. We 1) generalize Jeffrey's graphic representation to accommodate (possibly infinite) sequences over an arbitrary finite number of symbols, 2) establish a direct correspondence between the statistical characterization of symbolic sequences via Rényi entropy spectra and the multifractal characteristics (Rényi generalized dimensions) of the sequences' spatial representations, and 3) show that for general symbolic dynamical systems, the multifractal f_H spectra in the sequence space coincide with the f_H spectra on spatial sequence representations.

Keywords: Multifractal theory, Iterative function systems, Chaos game representation...
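The chaos game representation this abstract generalizes admits a compact sketch: each of the four DNA symbols is assigned a corner of the unit square, and every symbol moves the current point halfway toward its corner. The corner assignment below is one common convention, assumed for illustration rather than taken from Jeffrey's original paper.

```python
# Chaos game representation (CGR) sketch for DNA sequences.
# Corner assignment is a common convention, assumed here for illustration.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(sequence, start=(0.5, 0.5)):
    """Return the CGR trajectory: each symbol contracts the current
    point halfway toward that symbol's corner of the unit square."""
    x, y = start
    points = []
    for s in sequence:
        cx, cy = CORNERS[s]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        points.append((x, y))
    return points
```

The point cloud produced by a long sequence approximates the fractal whose multifractal spectrum the paper relates to the Rényi entropy spectrum of the sequence itself.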
Fractal Encoding of Context Free Grammars in Connectionist Networks
Expert Systems: The International Journal of Knowledge Engineering and Neural Networks, 2000
Abstract

Cited by 5 (0 self)
Connectionist network learning of context free languages has so far been applied only to very simple cases and has often made use of an external stack. Learning complex context free languages with a homogeneous neural mechanism looks like a much harder problem. The current paper takes a step toward solving this problem by analyzing context free grammar computation (without addressing learning) in a class of analog computers called Dynamical Automata, which are naturally implemented in connectionist networks. The result is a widely applicable method of using fractal sets to organize infinite state computations in a bounded state space. An appealing consequence is the development of parameter-space maps, which locate various complex computers in spatial relationships to one another. An example suggests that such a global perspective on the organization of the parameter space may be helpful for solving the hard problem of getting connectionist networks to learn complex grammars from exam...
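The idea of organizing an infinite state computation inside a bounded space can be illustrated with a toy fractal stack encoding: a stack over k symbols is represented as a point in [0, 1), pushing contracts the point into a subinterval (an iterated-function-system map), and popping inverts the contraction. This is a generic construction sketched for illustration, not the paper's exact Dynamical Automaton maps.

```python
# Toy fractal stack: the whole (unbounded) stack lives in [0, 1).
# This is an illustrative construction, not the paper's exact maps.
def push(z, a, k):
    """Push symbol a (0..k-1): contract z into the a-th subinterval."""
    return (z + a) / k

def pop(z, k):
    """Pop: recover the top symbol from the subinterval z lies in,
    and expand back to the previous stack state."""
    a = int(z * k)
    return a, z * k - a
```

Pushing "1" then "0" onto an empty stack (z = 0.0, k = 2) gives z = 0.25; popping twice recovers the symbols in reverse order and returns z to 0.0, so unboundedly deep stacks fit in the bounded interval.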
A Symbolic Dynamics Approach to Volatility Prediction
in Computational Finance (Proceedings of the Sixth International Conference on Computational Finance), Leonard N. Stern School of Business, 1999
Abstract

Cited by 4 (3 self)
We consider the problem of predicting the direction of daily volatility changes in the Dow Jones Industrial Average (DJIA). This is accomplished by quantizing a series of historic volatility changes into a symbolic stream over 2 or 4 symbols. We compare the predictive performance of classical fixed-order Markov models with that of a novel approach to variable memory length prediction (called the prediction fractal machine, or PFM), which is able to select very specific deep prediction contexts (whenever there is sufficient support for such contexts in the training data). We learn that daily volatility changes of the DJIA only exhibit rather shallow finite memory structure. On the other hand, a careful selection of quantization cut values can strongly enhance the predictive power of symbolic schemes. Results on 12 non-overlapping epochs of the DJIA strongly suggest that PFMs can outperform both traditional Markov models and (continuous-valued) GARCH models in the task of predicting volatility o...
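The quantization step described above can be sketched in a few lines: each real-valued volatility change is mapped to a symbol according to which interval between the cut values it falls into. The cut values below are illustrative placeholders; the paper's point is that selecting them carefully matters.

```python
# Quantize real-valued changes into a symbolic stream over len(cuts)+1
# symbols. The cut values are illustrative, not the paper's choices.
import bisect

def quantize(changes, cuts):
    """Map each change to the index of the interval it falls into,
    yielding symbols 0 .. len(cuts)."""
    return [bisect.bisect_right(cuts, c) for c in changes]
```

With one cut value this yields a 2-symbol stream (down/up); with three cut values, a 4-symbol stream, matching the two alphabet sizes the paper compares.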
Extracting Finite State Representations from Recurrent Neural Networks trained on Chaotic Symbolic Sequences
IEEE Transactions on Neural Networks, 1999
Abstract

Cited by 4 (4 self)
While much work has been done in neural-based modeling of real-valued chaotic time series, little effort has been devoted to addressing similar problems in the symbolic domain. We investigate the knowledge induction process associated with training recurrent neural networks (RNNs) on single long chaotic symbolic sequences. Even though training RNNs to predict the next symbol leaves the standard performance measures such as the mean square error on the network output virtually unchanged, the networks nevertheless do extract a lot of knowledge. We monitor the knowledge extraction process by considering the networks' stochastic sources and letting them generate sequences which are then confronted with the training sequence via information theoretic entropy and cross-entropy measures. We also study the possibility of reformulating the knowledge gained by RNNs in a compact and easy-to-analyze form of finite state stochastic machines. The experiments are performed on two sequences with different...
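The entropy and cross-entropy comparison between generated and training sequences can be sketched with n-gram (block) statistics; the block length and the smoothing constant below are illustrative choices, not the paper's exact estimators.

```python
# Compare a generated sequence against a training sequence via block
# entropy and cross-entropy (in bits). Smoothing is an assumption.
from collections import Counter
from math import log2

def block_probs(seq, n):
    """Empirical distribution over length-n blocks of seq."""
    blocks = [seq[i:i + n] for i in range(len(seq) - n + 1)]
    counts = Counter(blocks)
    total = sum(counts.values())
    return {b: c / total for b, c in counts.items()}

def entropy(p):
    return -sum(q * log2(q) for q in p.values())

def cross_entropy(p, q, eps=1e-12):
    """H(p, q): how well distribution q predicts blocks drawn from p."""
    return -sum(pv * log2(q.get(b, eps)) for b, pv in p.items())
```

A generated sequence whose block distribution q matches the training distribution p drives the cross-entropy H(p, q) down toward the entropy H(p), which is the sense in which the abstract's "confrontation" measures extracted knowledge.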
Dynamical Models of Sentence Processing
1999
Abstract

Cited by 3 (0 self)
We suggest that the theory of dynamical systems provides a revealing general framework for modeling the representations and mechanisms underlying syntactic processing. We show how a particular dynamical model, the Visitation Set Gravitation (VSG) model of Tabor, Juliano, and Tanenhaus (Language and Cognitive Processes, 1997), develops syntactic representations and models a set of contingent frequency effects in parsing that are problematic for other models. We also present new simulations showing how the model accounts for semantic effects in parsing and propose a new account of the distinction between syntactic and semantic incongruity. The results show how symbolic structures useful in parsing arise as emergent properties of connectionist dynamical systems.

1. Introduction

1.1 The Dynamics of Sentence Processing

Linguistic input is typically consistent with multiple syntactic possibilities as it unfolds over time. Because syntax strongly constrains interpretation, the processing ...
Anticipatory Behaviour based on Prediction of Image Sequences
Abstract
Abstract. In our scenario an object moves in an environment with obstacles. We would like to provide a robot with anticipatory behaviour. The target is often not visible and the robot should learn to anticipate the location where the object will most likely reappear. The observations are coded as a sequence of views and therefore no physical motion model is needed for the moving object. We apply Prediction Fractal Machines (PFM) as well as Variable Length Markov Models (VLMM) to predict the continuation of the sequence of views. We present results of simulations and real world experiments.

1 Introduction and general approach

We are concerned with a solution for the following problem. A robot exists in a world with some obstacles (e.g. walls) and its task is to catch a moving object (e.g. a ball). In this paper, we start with a simpler scenario. The robot repeatedly observes a ball rolling behind a wall and reappearing at the other side. From
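The variable-memory prediction shared by PFMs and VLMMs can be sketched as a longest-matching-context predictor over the view sequence: predict the next view from the deepest recent context that occurred often enough in training. The support threshold and training scheme below are illustrative assumptions, not either model's exact algorithm.

```python
# Longest-matching-context prediction over a symbolic view sequence.
# The min_count support threshold is an illustrative assumption.
from collections import Counter, defaultdict

def train_contexts(seq, max_depth, min_count=2):
    """Count next-symbol frequencies for every context up to max_depth,
    keeping only contexts with at least min_count observations."""
    nexts = defaultdict(Counter)
    for i in range(len(seq) - 1):
        for d in range(1, max_depth + 1):
            if i - d + 1 < 0:
                break
            nexts[seq[i - d + 1:i + 1]][seq[i + 1]] += 1
    return {c: cnt for c, cnt in nexts.items() if sum(cnt.values()) >= min_count}

def predict(model, history, max_depth):
    """Predict from the longest suffix of history kept in the model."""
    for d in range(min(max_depth, len(history)), 0, -1):
        ctx = history[-d:]
        if ctx in model:
            return model[ctx].most_common(1)[0][0]
    return None
```

Falling back from deep to shallow contexts is what lets such predictors use "very specific deep prediction contexts" only where the training sequence supports them.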