Results 1 - 3 of 3
Predictability, Complexity, and Learning
, 2001
"... We define predictive information Ipred(T) as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation times T: Ipred(T) can remain finite, grow logarithmically, or grow as a fractional power law. If t ..."
Abstract

Cited by 30 (2 self)
We define predictive information Ipred(T) as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation times T: Ipred(T) can remain finite, grow logarithmically, or grow as a fractional power law. If the time series allows us to learn a model with a finite number of parameters, then Ipred(T) grows logarithmically with a coefficient that counts the dimensionality of the model space. In contrast, power-law growth is associated, for example, with the learning of infinite-parameter (or nonparametric) models such as continuous functions with smoothness constraints. There are connections between the predictive information and measures of complexity that have been defined both in learning theory and the analysis of physical systems through statistical mechanics and dynamical systems theory. Furthermore, in the same way that entropy provides the unique measure of available information consistent with some simple and plausible conditions, we argue that the divergent part of Ipred(T) provides the unique measure for the complexity of dynamics underlying a time series. Finally, we discuss how these ideas may be useful in problems in physics, statistics, and biology.
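To make the abstract's first regime concrete, here is a minimal plug-in estimate of the mutual information between past and future windows of a binary Markov chain. The window size, flip probability, and chain length are illustrative choices, not taken from the paper; for any finite-state Markov chain, Ipred(T) saturates to a finite value, the first of the three behaviors described above.

```python
import numpy as np

def predictive_information(seq, w=1):
    """Plug-in estimate of I(past; future) between adjacent
    length-w windows, in bits."""
    joint = {}
    for i in range(w, len(seq) - w + 1):
        key = (tuple(seq[i - w:i]), tuple(seq[i:i + w]))
        joint[key] = joint.get(key, 0) + 1
    n = sum(joint.values())
    px, pf = {}, {}
    for (p, f), c in joint.items():
        px[p] = px.get(p, 0) + c
        pf[f] = pf.get(f, 0) + c
    mi = 0.0
    for (p, f), c in joint.items():
        mi += (c / n) * np.log2(c * n / (px[p] * pf[f]))
    return mi

# Binary Markov chain that flips state with probability 0.1;
# analytically I(x_t; x_{t+1}) = 1 - H2(0.1) ~ 0.53 bits.
rng = np.random.default_rng(0)
x = [0]
for _ in range(200000):
    x.append(int(x[-1] != (rng.random() < 0.1)))
print(predictive_information(x, w=1))
```

With 2 x 10^5 samples the estimate lands close to the analytic value of about 0.53 bits, and it does not grow as the windows lengthen, which is the signature of a finitely parameterized process.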
Fluctuation-dissipation theorem and models of learning
 Neural Comput
, 2005
"... Advances in statistical learning theory have resulted in a multitude of different designs of learning machines. But which ones are implemented by brains and other biological information processors? We analyze how various abstract Bayesian learners perform on different data and argue that it is diffi ..."
Abstract

Cited by 1 (0 self)
Advances in statistical learning theory have resulted in a multitude of different designs of learning machines. But which ones are implemented by brains and other biological information processors? We analyze how various abstract Bayesian learners perform on different data and argue that it is difficult to determine which learning-theoretic computation is performed by a particular organism using just its performance in learning a stationary target (learning curve). Based on the fluctuation-dissipation relation in statistical physics, we then discuss a different experimental setup that might be able to solve the problem.
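The identifiability problem the abstract raises can be illustrated with a toy experiment (the coin bias, priors, and sample size below are hypothetical choices, not the paper's setup): two Bayesian learners with different Beta priors estimate a stationary Bernoulli parameter, and their learning curves become practically indistinguishable long before either has converged in any other sense.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.7  # stationary target: a biased coin
flips = (rng.random(5000) < theta).astype(int)

def squared_error_curve(a, b):
    """Squared error of the Beta(a, b) posterior-mean estimate
    of theta, as a function of the number of observations."""
    heads = np.cumsum(flips)
    n = np.arange(1, len(flips) + 1)
    est = (heads + a) / (n + a + b)
    return (est - theta) ** 2

err_flat = squared_error_curve(1.0, 1.0)      # uniform prior
err_jeffreys = squared_error_curve(0.5, 0.5)  # Jeffreys prior
print(err_flat[-1], err_jeffreys[-1])
```

After a few thousand samples the two curves differ by far less than their own statistical fluctuations, which is the sense in which a learning curve alone underdetermines the learner; the paper's proposal is to perturb the target and watch the response instead.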
THE ISING ANTIFERROMAGNET WITH QUENCHED DILUTION ON A TRIANGULAR LATTICE
, 2003
"... This investigation applies information theoretic techniques to study the ordering and structure of the Ising antiferromagnet with quenched disorder on a triangular lattice. The pure system shows no phase transition due to the high degree of frustration present. However, when quenched vacancies are i ..."
Abstract
This investigation applies information theoretic techniques to study the ordering and structure of the Ising antiferromagnet with quenched disorder on a triangular lattice. The pure system shows no phase transition due to the high degree of frustration present. However, when quenched vacancies are introduced randomly into this geometrically frustrated system, a phase transition at finite temperature, to an ordered phase, can arise. If the vacancies are introduced on all three sublattices of the triangular lattice, then a two-dimensional spin-glass transition occurs. If dilution takes place on only one sublattice, then the other two sublattices develop magnetizations below the critical temperature. These magnetizations are equal in magnitude but opposite in sign, producing a system that still exhibits no net magnetization. The diluted sublattice exhibits spin-glass ordering, but no net magnetization. Thus, this model exhibits two generic features that merit detailed study: a two-dimensional spin-glass transition and "order arising from disorder", in this case the introduction of randomly placed vacancies. To investigate these transitions and the ordering that occurs, we have used both traditional quantities from statistical mechanics, such as the sublattice magnetizations, the Edwards-Anderson order parameter,
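The single-sublattice dilution scenario can be sketched with a short Metropolis simulation. This is a minimal toy, not the paper's method: the lattice size, temperature, coupling, and dilution fraction below are arbitrary illustrative values, and a run this small cannot exhibit the transitions the paper studies; it only shows the geometry (three sublattices, six neighbors) and how sublattice magnetizations are measured.

```python
import numpy as np

rng = np.random.default_rng(2)
L, J, T, dilution = 12, 1.0, 0.5, 0.3  # illustrative values only
spins = rng.choice([-1, 1], size=(L, L))

# Three-sublattice labeling of the triangular lattice on a square
# grid: every site's six neighbors lie on the other two sublattices.
sub = (np.arange(L)[:, None] - np.arange(L)[None, :]) % 3
vac = (sub == 0) & (rng.random((L, L)) < dilution)  # dilute sublattice 0
spins[vac] = 0  # a vacancy is a missing spin

# Six triangular-lattice neighbors, periodic boundaries.
nbrs = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def local_field(i, j):
    return sum(spins[(i + di) % L, (j + dj) % L] for di, dj in nbrs)

# Metropolis sweeps for H = J * sum_<ij> s_i s_j with J > 0
# (antiferromagnetic): flipping s_i changes the energy by
# dE = -2 * J * s_i * h_i.
for sweep in range(100):
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        if vac[i, j]:
            continue  # quenched vacancies never move or flip
        dE = -2.0 * J * spins[i, j] * local_field(i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

for k in range(3):  # sublattice magnetizations, vacancies excluded
    mask = (sub == k) & ~vac
    print(k, spins[mask].mean())
```

In the paper's picture, at low temperature the two undiluted sublattices would develop equal and opposite magnetizations while the diluted one orders as a spin glass; verifying that requires much larger lattices, equilibration, and disorder averaging than this sketch performs.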