## Correlation of entropy with similarity and symmetry (1996)

Venue: Journal of Chemical Information and Computer Sciences

Citations: 8 (4 self)

### BibTeX

```
@ARTICLE{Lin96correlationof,
  author  = {Shu-kun Lin},
  title   = {Correlation of entropy with similarity and symmetry},
  journal = {Journal of Chemical Information and Computer Sciences},
  year    = {1996},
  volume  = {36},
  pages   = {367--376}
}
```

### Abstract

Informational entropy is quantitatively related to similarity and symmetry. Some tacit assumptions regarding their correlation are shown to be wrong, and the Gibbs paradox statement (that indistinguishability corresponds to minimum entropy, i.e., zero) is rejected. All these correlations rest on the relation that less information content corresponds to more entropy: a higher value of entropy correlates with higher molecular similarity. The maximum entropy of any system (e.g., a mixture or an assemblage) corresponds to indistinguishability (total loss of information), to perfect or highest symmetry, and to the highest simplicity. This conforms without exception to all the experimental facts of both dynamic systems and static structures and to the related information-loss processes.
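The abstract's central relation — that entropy is maximal when the components of a system become indistinguishable — can be sketched with Shannon's informational entropy. This is a minimal illustration, not the paper's own derivation: it assumes a mixture described by a discrete probability distribution over species, where the uniform (indistinguishable) distribution carries the least information and hence the most entropy.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits; 0*log(0) taken as 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Distinguishable case: one species dominates, so the mixture carries
# more information and less entropy.
skewed = [0.97, 0.01, 0.01, 0.01]

# Indistinguishable limit: a uniform distribution over the 4 species,
# i.e., total loss of information, gives the maximum entropy log2(4).
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(skewed))   # well below 2 bits
print(shannon_entropy(uniform))  # exactly 2 bits, the maximum for 4 species
```

Under this reading, the uniform case plays the role of the "highest similarity" mixture in the abstract: no measurement on a randomly drawn component reduces uncertainty more than in any other distribution over the same species.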