Results 1–9 of 9
Fixed point theorems on partial randomness
, 2009
"... In our former work [K. Tadaki, Local Proceedings of CiE 2008, pp. 425–434, 2008], we developed a statistical mechanical interpretation of algorithmic information theory by introducing the notion of thermodynamic quantities at temperature T, such as free energy F (T), energy E(T), and statistical m ..."
Abstract

Cited by 4 (4 self)
In our former work [K. Tadaki, Local Proceedings of CiE 2008, pp. 425–434, 2008], we developed a statistical mechanical interpretation of algorithmic information theory by introducing into the theory the notion of thermodynamic quantities at temperature T, such as free energy F(T), energy E(T), and statistical mechanical entropy S(T). These quantities are real functions of a real argument T > 0. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate measured by program-size complexity. Furthermore, we showed that this situation holds for the temperature itself as a thermodynamic quantity. Namely, the computability of the value of the partition function Z(T) gives a sufficient condition for T ∈ (0, 1) to be a fixed point on partial randomness. In this paper, we show that the computability of each of the thermodynamic quantities above also gives such a sufficient condition. Moreover, we show that the computability of F(T) gives completely different fixed points from the computability of Z(T).
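The partition function mentioned in this abstract can be illustrated numerically. The following is a minimal sketch, not the paper's construction: in Tadaki's framework Z(T) is the sum of 2^(−|p|/T) over the halting programs p of a universal prefix-free machine, and here the domain of that machine is replaced by a hypothetical finite list of halting-program lengths (assumed toy data).

```python
# Toy sketch of Tadaki's generalized halting probability
# Z(T) = sum over halting programs p of 2^(-|p|/T), for T > 0.
# The list below is assumed toy data standing in for the (infinite,
# uncomputable) set of halting-program lengths of a universal machine.
halting_program_lengths = [2, 3, 3, 5, 7, 8, 8, 11]

def partition_function(T, lengths=halting_program_lengths):
    """Partial sum of Z(T) = sum_p 2^(-|p|/T) at temperature T > 0."""
    return sum(2.0 ** (-length / T) for length in lengths)

# Each term 2^(-l/T) grows with T, so Z(T) is increasing in T:
# higher temperature gives long programs more weight.
assert partition_function(0.5) < partition_function(1.0)
```

At T = 1 this partial sum is just the usual Kraft sum of the program lengths; varying T interpolates the weight given to longer programs, which is the lever behind the fixed-point results described above.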
The Tsallis entropy and the Shannon entropy of a universal probability
 Proceedings of ISIT 2008. IEEE International Symposium on Information Theory, 2008
"... Abstract — We study the properties of Tsallis entropy and Shannon entropy from the point of view of algorithmic randomness. In algorithmic information theory, there are two equivalent ways to define the programsize complexity K(s) of a given finite binary string s. In the standard way, K(s) is defi ..."
Abstract

Cited by 2 (1 self)
We study the properties of the Tsallis entropy and the Shannon entropy from the point of view of algorithmic randomness. In algorithmic information theory, there are two equivalent ways to define the program-size complexity K(s) of a given finite binary string s. In the standard way, K(s) is defined as the length of the shortest input string for the universal self-delimiting Turing machine to output s. In the other way, the so-called universal probability m is introduced first, and then K(s) is defined as −log₂ m(s) without reference to the concept of program size. In this paper, we investigate the properties of the Shannon entropy, the power sum, and the Tsallis entropy of a universal probability by means of the notion of program-size complexity. We determine the convergence or divergence of each of these three quantities, and evaluate its degree of randomness if it converges.
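The two routes to K(s) mentioned in this abstract can be sketched on a toy machine. This is a hedged illustration only: the table below is a hypothetical finite prefix-free program-to-output map, not a universal machine, so it shows the shape of the two definitions rather than their equivalence theorem.

```python
import math

# Assumed toy data: a finite prefix-free "machine" given as a table
# from binary programs to outputs (no key is a prefix of another).
MACHINE = {
    "0": "a",
    "10": "a",
    "110": "b",
    "1110": "b",
}

def K_by_shortest_program(s):
    """First definition: length of the shortest program outputting s."""
    return min(len(p) for p, out in MACHINE.items() if out == s)

def m(s):
    """Universal-probability analogue: m(s) = sum of 2^(-|p|) over
    programs p that output s."""
    return sum(2.0 ** -len(p) for p, out in MACHINE.items() if out == s)

def K_by_universal_probability(s):
    """Second definition: K(s) = -log2 m(s)."""
    return -math.log2(m(s))

# Since m(s) already includes the 2^(-K(s)) term of the shortest
# program, -log2 m(s) <= K(s) holds for every output s.
for s in ("a", "b"):
    assert K_by_universal_probability(s) <= K_by_shortest_program(s)
```

For a genuine universal machine the two quantities agree up to an additive constant; the toy table only exhibits the one-sided inequality that holds machine by machine.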
Fixed point theorems on partial randomness (Centre for Discrete Mathematics and Theoretical Computer Science)
, 2009
"... Abstract. In our former work [K. Tadaki, Local Proceedings of CiE 2008, pp. 425–434, 2008], we developed a statistical mechanical interpretation of algorithmic information theory by introducing the notion of thermodynamic quantities at temperature T, such as free energy F (T), energy E(T), and stati ..."
Abstract
In our former work [K. Tadaki, Local Proceedings of CiE 2008, pp. 425–434, 2008], we developed a statistical mechanical interpretation of algorithmic information theory by introducing into the theory the notion of thermodynamic quantities at temperature T, such as free energy F(T), energy E(T), and statistical mechanical entropy S(T). These quantities are real functions of a real argument T > 0. We then discovered that, in this interpretation, the temperature T equals the partial randomness of the values of all these thermodynamic quantities, where the notion of partial randomness is a stronger representation of the compression rate measured by program-size complexity. Furthermore, we showed that this situation holds for the temperature itself as a thermodynamic quantity. Namely, the computability of the value of the partition function Z(T) gives a sufficient condition for T ∈ (0, 1) to be a fixed point on partial randomness. In this paper, we show that the computability of each of the thermodynamic quantities above also gives such a sufficient condition. Moreover, we show that the computability of F(T) gives completely different fixed points from the computability of Z(T). Key words: algorithmic randomness, fixed point theorem, partial randomness, Chaitin Ω number, algorithmic information theory, thermodynamic quantities. AMS subject classifications (2000): 68Q30, 26E40, 03D80, 82B30, 82B03.
Algorithmic Thermodynamics DRAFT VERSION
, 2010
"... Algorithmic entropy can be seen as a special case of entropy as studied in statistical mechanics. This viewpoint allows us to apply many techniques developed for use in thermodynamics to the subject of algorithmic information theory. In particular, suppose we fix a universal prefixfree Turing machi ..."
Abstract
Algorithmic entropy can be seen as a special case of entropy as studied in statistical mechanics. This viewpoint allows us to apply many techniques developed for use in thermodynamics to the subject of algorithmic information theory. In particular, suppose we fix a universal prefix-free Turing machine and let X be the set of programs that halt for this machine. Then we can regard X as a set of ‘microstates’, and treat any function on X as an ‘observable’. For any collection of observables, we can study the Gibbs ensemble that maximizes entropy subject to constraints on the expected values of these observables. We illustrate this by taking the log runtime, length, and output of a program as observables analogous to the energy E, volume V, and number of molecules N in a container of gas. The conjugate variables of these observables allow us to define quantities which we call the ‘algorithmic temperature’ T, ‘algorithmic pressure’ P, and ‘algorithmic potential’ µ, since they are analogous to the temperature, pressure, and chemical potential. We derive an analogue of the fundamental thermodynamic relation dE = T dS − P dV + µ dN, and use it to study thermodynamic cycles analogous to those for heat engines. We also investigate the values of T, P, and µ for which the partition function converges. At some points on the boundary of this domain of convergence, the partition function becomes uncomputable. Indeed, at these points the partition function itself has nontrivial algorithmic entropy.
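The Gibbs ensemble described in this abstract can be sketched numerically. This is a hedged toy, not the paper's construction: the list of (log runtime, length, output) triples below is assumed data standing in for actual halting programs, and the weight exp(−(E + P·V − µ·N)/T) is the standard grand-canonical form matching the roles the abstract assigns to E, V, and N.

```python
import math

# Assumed toy "microstates": hypothetical halting programs described by
# their observables (log runtime E, length V, output N), analogous to
# energy, volume, and number of molecules.
programs = [
    (1.0, 2, 0),
    (2.0, 3, 1),
    (3.5, 5, 1),
    (4.0, 8, 2),
]

def gibbs_distribution(T, P, mu):
    """Gibbs weights exp(-(E + P*V - mu*N)/T), normalized by the
    partition function Z (the sum of the weights)."""
    weights = [math.exp(-(E + P * V - mu * N) / T) for (E, V, N) in programs]
    Z = sum(weights)
    return [w / Z for w in weights]

probs = gibbs_distribution(T=1.0, P=0.5, mu=0.2)
assert abs(sum(probs) - 1.0) < 1e-12  # a proper probability distribution
```

Over the genuine (infinite) set of halting programs the normalizing sum Z may diverge, which is exactly the convergence question for T, P, and µ raised at the end of the abstract; a finite toy always converges.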
Partial Randomness and Dimension of Recursively Enumerable Reals
, 906
"... Abstract. A real α is called recursively enumerable (“r.e. ” for short) if there exists a computable, increasing sequence of rationals which converges to α. It is known that the randomness of an r.e. real α can be characterized in various ways using each of the notions; programsize complexity, Mart ..."
Abstract
A real α is called recursively enumerable (“r.e.” for short) if there exists a computable, increasing sequence of rationals which converges to α. It is known that the randomness of an r.e. real α can be characterized in various ways using each of the following notions: program-size complexity, Martin-Löf tests, the Chaitin Ω number, the domination and Ω-likeness of α, the universality of a computable, increasing sequence of rationals which converges to α, and universal probability. In this paper, we generalize these characterizations of randomness to the notion of partial randomness by parameterizing each of the notions above by a real T ∈ (0, 1], where the notion of partial randomness is a stronger representation of the compression rate by means of program-size complexity. As a result, we present ten equivalent characterizations of the partial randomness of an r.e. real. The resultant characterizations of partial randomness are powerful and have many important applications. One of them is to present equivalent characterizations of the dimension of an individual r.e. real. The equivalence between the notion of Hausdorff dimension and the compression rate by program-size complexity (or partial randomness) has by now been established through a series of works by many researchers over the last two decades. We present ten equivalent characterizations of the dimension of an individual r.e. real. Key words: algorithmic randomness, recursively enumerable real, partial randomness
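The defining property above can be made concrete with a small sketch. Note the hedge: a genuine Ω number is r.e. but not computable, whereas the toy stand-in below, the sum of 2^(−n²), is computable; it only illustrates the form of the definition, a computable, strictly increasing sequence of rationals converging to a real.

```python
from fractions import Fraction

# Toy stand-in for an r.e. real: alpha = sum_{n>=1} 2^(-n^2).
# The partial sums form a computable, strictly increasing sequence of
# rationals converging to alpha, exactly the witness the definition asks for.
def increasing_rational_approximations(k):
    """Return the first k partial sums of sum_{n>=1} 2^(-n^2)."""
    total, out = Fraction(0), []
    for n in range(1, k + 1):
        total += Fraction(1, 2 ** (n * n))
        out.append(total)
    return out

approx = increasing_rational_approximations(5)
assert all(a < b for a, b in zip(approx, approx[1:]))  # strictly increasing
```

For a true Ω number such a sequence still exists (enumerate halting programs and sum 2^(−|p|)), but no computable bound on its convergence rate exists, which is where randomness, and with the parameter T, partial randomness, enters.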
Algorithmic Thermodynamics
, 2010
"... Algorithmic entropy can be seen as a special case of entropy as studied in statistical mechanics. This viewpoint allows us to apply many techniques developed for use in thermodynamics to the subject of algorithmic information theory. In particular, suppose we fix a universal prefixfree Turing machi ..."
Abstract
Algorithmic entropy can be seen as a special case of entropy as studied in statistical mechanics. This viewpoint allows us to apply many techniques developed for use in thermodynamics to the subject of algorithmic information theory. In particular, suppose we fix a universal prefix-free Turing machine and let X be the set of programs that halt for this machine. Then we can regard X as a set of ‘microstates’, and treat any function on X as an ‘observable’. For any collection of observables, we can study the Gibbs ensemble that maximizes entropy subject to constraints on the expected values of these observables. We illustrate this by taking the log runtime, length, and output of a program as observables analogous to the energy E, volume V, and number of molecules N in a container of gas. The conjugate variables of these observables allow us to define quantities which we call the ‘algorithmic temperature’ T, ‘algorithmic pressure’ P, and ‘algorithmic potential’ µ, since they are analogous to the temperature, pressure, and chemical potential. We derive an analogue of the fundamental thermodynamic relation dE = T dS − P dV + µ dN, and use it to study thermodynamic cycles analogous to those for heat engines. We also investigate the values of T, P, and µ for which the partition function converges. At some points on the boundary of this domain of convergence, the partition function becomes uncomputable. Indeed, at these points the partition function itself has nontrivial algorithmic entropy.
Equivalent characterizations of partial randomness for a recursively enumerable real
, 805
"... Abstract. A real number α is called recursively enumerable if there exists a computable, increasing sequence of rational numbers which converges to α. The randomness of a recursively enumerable real α can be characterized in various ways using each of the notions; programsize complexity, MartinLöf ..."
Abstract
A real number α is called recursively enumerable if there exists a computable, increasing sequence of rational numbers which converges to α. The randomness of a recursively enumerable real α can be characterized in various ways using each of the following notions: program-size complexity, Martin-Löf tests, Chaitin’s Ω number, the domination and Ω-likeness of α, the universality of a computable, increasing sequence of rational numbers which converges to α, and universal probability. In this paper, we generalize these characterizations of randomness to the notion of partial randomness by parameterizing each of the notions above by a real number T ∈ (0, 1]. We thus present several equivalent characterizations of partial randomness for a recursively enumerable real number. Key words: algorithmic randomness, recursively enumerable real number, partial randomness, Chaitin’s Ω number, program-size complexity, universal probability
unknown title
, 904
"... A statistical mechanical interpretation of algorithmic information theory III: Composite systems and fixed points ..."
Abstract
A statistical mechanical interpretation of algorithmic information theory III: Composite systems and fixed points