On Entropy, Information, and Conservation of Information
Yunus A. Çengel
Department of Mechanical Engineering, University of Nevada, Reno, NV 89557, USA; [email protected]

Citation: Çengel, Y.A. On Entropy, Information, and Conservation of Information. Entropy 2021, 23, 779. https://doi.org/10.3390/e23060779

Abstract: The term entropy is used with different meanings in different contexts, sometimes in contradictory ways, resulting in misunderstandings and confusion. The root cause of the problem is the close resemblance of the defining mathematical expressions of entropy in statistical thermodynamics and of information in the communications field, also called entropy, differing only by a constant factor, with the unit J/K in thermodynamics and bits in information theory. The thermodynamic property entropy is closely associated with the physical quantities of thermal energy and temperature, while the entropy used in the communications field is a mathematical abstraction based on probabilities of messages. The terms information and entropy are often used interchangeably in several branches of science. This practice gives rise to the phrase conservation of entropy in the sense of conservation of information, which contradicts the fundamental increase-of-entropy principle in thermodynamics as an expression of the second law. The aim of this paper is to clarify matters and eliminate confusion by putting things into their rightful places within their domains. The notion of conservation of information is also put into a proper perspective.

Keywords: entropy; information; conservation of information; creation of information; destruction of information; Boltzmann relation

1. Introduction

One needs to be cautious when dealing with information since it is defined differently in different fields such as the physical sciences, the biological sciences, communications, philosophy, and daily life.
[Academic Editors: Milivoje M. Kostic and Jean-Noël Jaubert. Received: 27 May 2021; Accepted: 17 June 2021; Published: 19 June 2021. Copyright © 2021 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).]

Generalization of a particular meaning into other fields results in incompatibilities and causes confusion. Therefore, a clear distinction, with precise language and mathematics, is needed to avoid misunderstandings.

The basic building block of modern electronics is the transistor circuit, which acts as a switch capable of staying in one of two states of on-or-off, or equivalently, closed-or-open, or 1-or-0 in binary code, in response to an electrical signal input. A transistor functions in the same way as the light switch shown in Figure 1: the lamp turns on when the switch is moved to the on position and electric current is allowed to pass, and the lamp turns off when the switch is moved to the off position.

In electronics, a single switch or transistor constitutes the most elementary hardware. This basic case of one elementary component, n = 1, corresponds to two states of on and off, M = 2. In the area of communication via signal transmission, the information I associated with a single basic switch being on or off corresponds to a binary digit, and thus to a choice between two messages, and is taken as the unit of information called a bit. Therefore, 1 bit represents the amount of information that can be stored by a single switch with two stable states or positions of on/off, closed/open, or 1/0. The bit also represents the basic unit of entropy for a system that has only two states.

If one switch corresponds to one bit of information, we intuitively feel that information should be additive, and three switches should correspond to three bits of information. However, in the case of three switches, there are 2³ = 8 possible choices or combinations of 111, 110, 101, 100, 001, 010, 011, and 000 for messages, the last choice corresponding to the case of all three switches being off.

Figure 1. An electric lamp switch has two possible states only: on and off (or, 1 and 0). If the lamp is lit, we know that the switch is in the on position (Photos by Yunus Çengel).

Therefore, as the number of switches increases linearly, the number of possible messages, which is equivalent to the amount of information that can be conveyed, increases exponentially. To keep things simple and manageable, it is desirable to define information in such a way that the amount of information is linearly related, and thus directly proportional, to the amount of hardware: the number of switches or transistors in this case.
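The count of eight messages from three switches noted above can be verified by direct enumeration. The following is a minimal Python sketch (illustrative only, not part of the paper):

```python
from itertools import product

# Each switch is on (1) or off (0); n switches allow 2**n distinct messages.
n = 3
messages = ["".join(str(bit) for bit in combo) for combo in product((1, 0), repeat=n)]

print(len(messages))  # 8, i.e., 2**3
print(messages)       # ['111', '110', '101', '100', '011', '010', '001', '000']
```

Eight switches would already give 2⁸ = 256 possible messages, illustrating the exponential growth described above.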
After all, the amount of hardware is a definitive indicator of the amount of information. This way, the amount of hardware becomes a measure of information, which is very practical if we are to build communication systems with various numbers of transistors or switches.

A convenient and intuitive way of quantifying information is to relate it linearly to the number of switches so that information doubles when the number of switches is doubled. The simplest way of doing this is to equate one switch to one unit of information. Following the suggestion of Hartley [1], Shannon [2] and Weaver [3] formulated this idea by defining the information I associated with n switches as the logarithm to the base 2 of the number of possible choices or messages N:

I = log₂ N = log₂ 2ⁿ = n log₂ 2 = n (bits)    (1)

since log₂ 2 = 1, as shown in Figure 2. In the case of a single switch, n = 1 and thus the information is I = 1 bit.
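Equation (1) can be checked numerically with Python's standard math module; the helper name below is ours, not the paper's:

```python
import math

def information_bits(n_switches: int) -> float:
    """I = log2(N) for N = 2**n equiprobable messages, per Eq. (1)."""
    n_messages = 2 ** n_switches
    return math.log2(n_messages)

# Information grows linearly with hardware: one switch, one bit.
print(information_bits(1))  # 1.0
print(information_bits(3))  # 3.0

# Any base works with a constant factor: log2(x) = log10(x) / log10(2)
x = 8
print(math.log10(x) / math.log10(2))  # approximately 3.0
```

The last line illustrates the change-of-base relation used in the next paragraph, where the constant 1/log₁₀ 2 ≈ 3.322 appears.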
Then, as desired, the information associated with n = 3 switches becomes

I = log₂ N = log₂ 8 = log₂ 2³ = 3 log₂ 2 = 3 bits    (2)

The unit bit may appear obscure and arbitrary at first. However, this is also the case for many familiar units such as the meter, the gram, and the second. Besides, it is common to use constant multiples of a unit in place of the unit itself, such as using the kilogram instead of the gram as a unit of mass and using the byte (equivalent to eight bits) or even the gigabyte (1 billion bytes) instead of the bit as a unit of information and storage. Additionally, note from Figure 2 that log₂ x = 3.322 log₁₀ x, and thus the logarithm of any base can be used in information calculations in practice, including the natural logarithm, by incorporating a suitable constant factor of multiplication.

Figure 2. Some mathematical relations regarding logarithms (no specified base indicates any base).

A more general and practical approach is to define information in terms of the probabilities p of messages instead of the number of choices for messages, as presented next. Noting that the probability of a choice or message in this equiprobable case with three switches is p = 1/N = 1/8,
I = log₂ N = log₂ (1/p) = −log₂ p = −log₂ (1/8) = log₂ 2³ = 3 bits    (3)

since again, log₂ 2 = 1.
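Equation (3) can likewise be verified numerically; this sketch (our naming, not the paper's) computes information from a message probability:

```python
import math

def information_from_probability(p: float) -> float:
    """I = log2(1/p) = -log2(p) bits for an equiprobable message, per Eq. (3)."""
    return -math.log2(p)

p = 1 / 8  # three switches: eight equally likely messages
print(information_from_probability(p))    # 3.0, matching I = log2 N
print(information_from_probability(0.5))  # 1.0, a single switch
```

For equiprobable messages the probability form reproduces the count form exactly, since p = 1/N implies −log₂ p = log₂ N.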