Results 1 – 5 of 5
An Algorithmic Complexity Interpretation of Lin’s Third Law of Information Theory, 2008
Cited by 3 (2 self)
Abstract: Instead of static entropy, we assert that the Kolmogorov complexity of a static structure such as a solid is the proper measure of disorder (or chaoticity). A static structure in a surrounding perfectly random universe acts as an interfering entity which introduces local disruption in randomness. This is modeled by a selection rule R which selects a subsequence of the random input sequence that hits the structure. Through the inequality that relates stochasticity and chaoticity of random binary sequences, we maintain that Lin’s notion of stability corresponds to the stability of the frequency of 1s in the selected subsequence. This explains why more complex static structures are less stable. Lin’s third law is represented as the inevitable change that static structures undergo toward conforming to the universe’s perfect randomness.
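As a toy illustration of the frequency notion in this abstract, the following Python sketch draws a random binary sequence, applies a simple place-selection rule, and checks that the frequency of 1s in the selected subsequence stays near 1/2. The function name `select_subsequence` and the particular rule are invented for illustration; the paper defines the selection rule R abstractly.

```python
import random

def select_subsequence(bits, rule):
    """Keep the bits whose positions the selection rule admits.

    rule(i, prefix) may inspect only the index and the bits seen so far,
    in the spirit of an admissible (von Mises / Church) place selection.
    """
    return [b for i, b in enumerate(bits) if rule(i, bits[:i])]

random.seed(0)
bits = [random.randint(0, 1) for _ in range(10_000)]

# A very simple "structure": admit every third position.
selected = select_subsequence(bits, lambda i, prefix: i % 3 == 0)

freq = sum(selected) / len(selected)  # close to 0.5 for a random source
```

On the abstract’s account, a more complex structure corresponds to a more complex rule R, and the claim is that the frequency of 1s under such a rule is less stable; this toy only exhibits the stable baseline case.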
Information and Entropy in Cybernetic Systems
Cited by 1 (0 self)
Abstract. It has been shown that cybernetic approaches can be used efficiently for the analysis and design of complex networked systems. Still, earlier discussions were bound to the particular application domain at hand. This paper gives more intuition into what truly takes place in a cybernetic system from another point of view. Information theory, and especially the concept of entropy, offers a yet more general perspective on such analyses.
Similarity and Their Relationship, 2008
We are publishing volume 10 of Entropy. When I was a chemistry student I was fascinated by thermodynamic problems, particularly the Gibbs paradox. It has now been more than 10 years since I actively published on this topic [14]. During this decade, the globalized Information Society has been developing very quickly based on the Internet, and the term “information” is widely used, but what is information? What is its relationship with entropy and other concepts like symmetry, distinguishability and stability? What is the situation of entropy research in general? As the Editor-in-Chief of Entropy, I feel it is time to offer some comments, present my own opinions on this matter and point out a major flaw in related studies.

Definition of Information

We are interested in the definition of information in the context of information theory. It is a surprise that a clear definition of the concept of “information” cannot be found in information theory textbooks. “Entropy as a measure of information” is confusing. I would like to propose a simple definition of information: information (I) is the amount of data remaining after data compression. If the total amount of data is L, entropy (S) in information theory is defined as information loss: L = S + I. Consider a 100 GB hard disk as an example: L = 100 GB. A freshly formatted hard disk will have S = 100 GB and I = 0. Similar examples defining information as the amount of data after compression are given in [5]. Based on this definition of information and the definition that (information-theory) entropy is expressed as information loss, S = L − I, or, when the absolute values are unknown, ΔS = ΔL − ΔI, I was able to propose three laws of information theory [5]:
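The decomposition L = S + I can be sketched directly with a general-purpose compressor standing in for ideal compression. This is only a rough proxy: `information_split` is a name invented here, and zlib merely upper-bounds the true (Kolmogorov-style) information content.

```python
import zlib

def information_split(data: bytes):
    """Decompose total data L into information I (compressed size)
    and entropy S = L - I (information loss), per the definition above."""
    L = len(data)
    I = len(zlib.compress(data, level=9))
    return L, I, L - I

# A freshly formatted disk, modeled as all-zero bytes: nearly all of L
# is information loss, so S is close to L and I is close to 0.
L, I, S = information_split(bytes(100_000))
```

With incompressible (random) data the split goes the other way: I approaches L and S approaches 0.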
Gibbs Paradox and Similarity Principle
arXiv:0807.4314v1 [physics.gen-ph]. DOI: 10.1063/1.303902249
Abstract. As no heat effect or mechanical work is observed, we have a simple experimental resolution of the Gibbs paradox: both the thermodynamic entropy of mixing and the Gibbs free energy change are zero during the formation of any ideal mixture. Information loss is the driving force of these spontaneous processes. Information is defined as the amount of compressed data. Information losses due to dynamic motion and to static symmetric structure formation are defined as two kinds of entropy – dynamic entropy and static entropy, respectively. There are three laws of information theory, of which the first and second are analogs of the two thermodynamic laws. The third law of information theory, however, is different: for a solid structure of perfect symmetry (e.g., a perfect crystal), the entropy (static entropy for the solid state) S is at its maximum. More generally, a similarity principle is set up: if all other conditions remain constant, then the higher the similarity among the components, the higher the entropy of the mixture (for fluid phases), the assemblage (for a static structure or a system of condensed phases) or any other structure (such as quantum states in quantum mechanics); the more stable the mixture or the assemblage; and the more spontaneous the process leading to such a mixture, assemblage or chemical bond.
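For context (standard thermodynamics background, not from the abstract itself), the quantity at stake in the Gibbs paradox is the textbook entropy of mixing for ideal components with mole fractions x_i:

```latex
\Delta S_{\mathrm{mix}} = -nR \sum_i x_i \ln x_i
```

This expression is positive whenever the components are distinct but drops discontinuously to zero when they become identical; the abstract’s experimental claim is that the mixing entropy is in fact zero for any ideal mixture, which dissolves that discontinuity.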