The Nature of the Chemical Process. 1. Symmetry Evolution – Revised Information Theory, Similarity Principle and Ugly Symmetry

Shu-Kun Lin

Molecular Diversity Preservation International (MDPI), Sangergasse 25, CH-4054 Basel, Switzerland. Tel. +41 79 322 3379, Fax: +41 61 302 8918, E-mail: [email protected], http://www.mdpi.org/lin

Abstract: Symmetry is a measure of indistinguishability. Similarity is a continuous measure of imperfect symmetry. Lewis' remark that “gain of entropy means loss of information” defines the relationship between entropy and information. Three laws of information theory have been proposed. Labeling by introducing nonsymmetry and formatting by introducing symmetry are defined. The function L of the universe (L = ln w, where w is the number of microstates; equivalently, the sum of entropy and information, L = S + I) is a constant (the first law of information theory). The entropy S of the universe tends toward a maximum (the second law of information theory). For a perfectly symmetric static structure, the information is zero and the static entropy is at its maximum (the third law of information theory). Based on the Gibbs inequality and the second law of the revised information theory, we have proved the similarity principle (a continuous higher similarity-higher entropy relation, following the rejection of the Gibbs paradox) and proved the Curie-Rosen symmetry principle (a higher symmetry-higher stability relation) as a special case of the similarity principle. The principles of information minimization and potential energy minimization are compared. Entropy is the degree of symmetry and information is the degree of nonsymmetry. There are two kinds of symmetry: dynamic and static.
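As a compact restatement, the three laws summarized in the abstract may be written in the notation introduced there (L, S, I, and w as defined above); this is only a notational sketch of the statements given in the text, not an additional result:

% Three laws of information theory, restated from the abstract.
% L = ln w is the total; S is entropy; I is information; w is the number of microstates.
\begin{align*}
  L &= \ln w = S + I = \text{constant}
      && \text{(first law: } L \text{ of the universe is conserved)} \\
  \Delta S_{\mathrm{universe}} &\ge 0
      && \text{(second law: entropy tends toward a maximum)} \\
  I &= 0, \quad S = S_{\max} = L
      && \text{(third law: perfectly symmetric static structure)}
\end{align*}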