Multiplicative Functions of Numbers Set and Logarithmic Identities. Shannon and factorial logarithmic Identities, Entropy and Coentropy

Trends Journal of Sciences Research (2015) 2(1): 13-16
http://www.tjsr.org
Research Paper, Open Access

Yu. A. Kokotov
Agrophysical Institute, Sankt-Petersburg, 195220, Russia
Correspondence: Yu. A. Kokotov ([email protected])

Abstract

Multiplicative functions characterizing a finite set of positive numbers are introduced in this work. With their help we find logarithmic identities which connect the logarithm of the sum of the set's numbers with the logarithms of the numbers themselves. One of them (contained in the work of Shannon) interconnects three information functions: the Hartley information, the entropy and the coentropy. Shannon's identity allows a better understanding of the meaning and interrelation of these collective characteristics of information (as characteristics of finite sets and as probabilistic characteristics). A factorial multiplicative function and a logarithmic factorial identity are also formed from the initial set numbers. That identity connects the logarithms of the factorials of integer numbers with the logarithm of the factorial of their sum.

Keywords: Number Sets, Multiplicative Functions, Logarithmic Identity, The Identity of Shannon, Information Entropy, Coentropy, Factorials

Citation: Yu. A. Kokotov. Multiplicative Functions of Numbers Set and Logarithmic Identities. Shannon and factorial logarithmic Identities, Entropy and Coentropy. Trends Journal of Sciences Research, Vol. 2, No. 1, 2015, pp. 13-16.

1. Introduction

The logarithmic scale is widely used in many areas of science, including information theory. In this theory the logarithms of positive numbers are treated as information about a set divided into subsets. A widely used logarithmic characteristic is the information entropy. Statistical entropy is also widely used in statistical physics (where it is sometimes regarded from the point of view of information theory). In this regard the following problem is of interest.

It is known that the logarithm of a number considered as a sum does not decompose into the sum of the logarithms of its terms. A direct connection between these logarithms does not exist. However, there are formulas that include the logarithm of the sum together with the logarithms of all terms of the sum. This suggests the existence of some causal relationship between them. The present work is devoted to the analysis of this issue and of its relationship with information theory.

In information theory the logarithm of the sum, M, and the logarithms of its terms are considered as measures of information. They are, respectively, the measure of information about the set as a whole (the Hartley information, $I_0$) and of information about each of the subsets (the partial information, $I_i$). Therefore the relation between these values is important for information theory.

2. Results and Discussion

Suppose we have a set of N positive numbers $m_1, \ldots, m_N$. This set may be characterized in different ways: its numbers can simultaneously be viewed as terms whose sum equals M and as factors forming the product $\Pi$. The product and the sum can therefore be considered as common integral characteristics of the set, and it is evident that their ratio

    $\Pi / M$    (1)

can also be seen as another such characteristic. From the above relations it follows immediately that

    $\log M = \sum_{i=1}^{N} \log m_i - \log \frac{\Pi}{M}$    (2)

One can introduce further characteristics of the set which are convenient for taking logarithms; let us call them "multiplicative functions of the set".
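As a numerical illustration (ours, not part of the original paper; the variable names are our own), the following Python sketch checks identity (2) for an arbitrary set of positive numbers:

```python
# Check of identity (2): log M = sum_i log m_i - log(Pi / M),
# where M is the sum and Pi the product of the set.
# Illustrative sketch only; not from the original paper.
import math

m = [2.0, 3.0, 5.0, 10.0]      # an arbitrary finite set of positive numbers
M = sum(m)                     # M: the sum of the set
Pi = math.prod(m)              # Pi: the product of the set

lhs = math.log(M)
rhs = sum(math.log(mi) for mi in m) - math.log(Pi / M)
assert math.isclose(lhs, rhs)  # identity (2) holds numerically
print(f"log M = {lhs:.6f}; right-hand side of (2) = {rhs:.6f}")
```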
A multiplicative function may be formed from two frequently used quantities, the geometric mean and the arithmetic mean:

    $\Phi = \left(\prod_{i=1}^{N} m_i\right)^{1/N} \bigg/ \left(\frac{1}{N}\sum_{i=1}^{N} m_i\right)$    (3)

This ratio is usually discussed in statistics. Taking the logarithm, we obtain the identity

    $\log M = \frac{1}{N}\sum_{i=1}^{N} \log m_i - \log \Phi + \log N$    (4)

The values of all N numbers $m_i$ completely determine all quantities in these identities. Any set of terms of the sum M can be transformed and represented in normalized form with weights of the subsets, $p_i$:

    $p_i = \frac{m_i}{M}; \qquad \sum_{i=1}^{N} p_i = 1$    (5)

The simplest characteristic of this normalized set is another multiplicative function, the product of the weights:

    $P = \prod_{i=1}^{N} p_i = \frac{\prod_{i=1}^{N} m_i}{M^N}$    (6)

From it follows another logarithmic identity, in fact equivalent to eq. (4):

    $\log M = \frac{1}{N}\left(\sum_{i=1}^{N} \log m_i - \log P\right)$    (7)

We can introduce more complex multiplicative weight functions, such as

    $\Psi = \prod_{i=1}^{N} p_i^{\,p_i} = \prod_{i=1}^{N} \left(\frac{m_i}{M}\right)^{m_i/M}$    (8)

Taking its logarithm gives a more complex logarithmic identity:

    $\log M = \sum_{i=1}^{N} p_i \log m_i - \log \Psi$    (9)

This identity is equivalent to the equation

    $\log M = \sum_{i=1}^{N} p_i \log m_i - \sum_{i=1}^{N} p_i \log p_i$    (10)

Eqs. (9) and (10) can be written "in information form":

    $I_0(M) = J(M, N, p_1, \ldots, p_N) + S(N, p_1, \ldots, p_N)$    (11)

where

    $I_0 = \log M$    (Hartley information)    (12)

    $S_N = -\sum_{i=1}^{N} p_i \log p_i$    (information entropy)    (13)

    $J_N = \sum_{i=1}^{N} p_i \log m_i$    (coentropy*)    (14)

(* This term is sometimes used in the literature in relation to different values, but it has not spread. The term "negentropy" proposed by Vyatkin [3] is not quite apt, because the value it denotes differs from the actual "negative entropy" by a constant.)

This identity expresses the information about the set as a whole either as the difference between two average values (the mean of the partial information about the subsets and the mean of the logarithms of the weights of the terms) or as the sum of the coentropy and the entropy.

Entropy is usually considered in information theory as a common characteristic of a set divided into subsets. For a fixed value of M the values of S and J are interrelated and change in opposite directions.

Equation (10) was obtained on the basis of probability theory in the work of Shannon ([1], App. 2) as the mathematical foundation of the intuitively introduced entropy. Information entropy is generally considered as closely related to probability theory applied to a finite set of events. Until recently eq. (10) was almost forgotten; it was obtained anew and used in the works of Bianucci et al. [2] and Vyatkin [3]. It is therefore correct to name equation (10) the Shannon identity.

The identity can also be seen through the mathematical expectations of the corresponding quantities:

    $\log M = \left|\,\mathrm{E}(\log p_i)\,\right| + J$    (15)

In this case the entropy takes on the mathematical sense of a function of the probabilities of random quantities ([4], Chapter 10). However, eq. (8) and the functions S and J can also be used for describing the properties of numerical sets out of touch with the theory of probability.

The function J, named "coentropy" in [2], has also proved important for the theory of information. The coentropy averages the partial information about the subsets and is therefore the proper measure of information about a particular set. The entropy averages the information about the weights $p_i$, i.e. really only about the relation of the subsets within the given set. Both quantities are considered as general characteristics of the set and as collective characteristics of the information obtained from an experiment.
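To make the Shannon identity concrete (an illustrative sketch of ours; the function and variable names are our own), the entropy (13) and coentropy (14) can be computed directly and checked against eqs. (10)-(11):

```python
# Entropy S (13), coentropy J (14) and the Shannon identity (10)/(11):
# log M = J + S. Illustrative sketch only; names are our own.
import math

def entropy(p):
    """Information entropy S_N = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p)

def coentropy(m, M):
    """Coentropy J_N = sum_i p_i log m_i with weights p_i = m_i / M."""
    return sum((mi / M) * math.log(mi) for mi in m)

m = [1.0, 2.0, 3.0, 6.0]        # an arbitrary set of positive numbers
M = sum(m)
p = [mi / M for mi in m]        # normalized weights, eq. (5)

S = entropy(p)                  # eq. (13)
J = coentropy(m, M)             # eq. (14)
I0 = math.log(M)                # Hartley information, eq. (12)

assert math.isclose(I0, J + S)  # Shannon identity: I0 = J + S
print(f"I0 = {I0:.6f}, J = {J:.6f}, S = {S:.6f}, J + S = {J + S:.6f}")
```

Since $I_0 = \log M$ is fixed once M is fixed, any change of the split that raises S lowers J by exactly the same amount, which is the opposite-direction behaviour of S and J noted above.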
In the probabilistic interpretation the weighting factors ("weights") are frequencies of events and are transformed into probabilities only in the limiting case. The values of log M and log $m_i$, and hence the coentropy, become meaningless at infinite M and $m_i$. Then the entropy is the only characteristic of a hypothetical infinite set of test results divided into N infinite subsets. All sets, finite or infinite, with the same relationship of terms, i.e. with the same structure, have the same entropy.

The Shannon identity determines the difference between the information about the whole set and the information about the same set divided into subsets. Evidently, the entropy is that difference of information between the set as a whole and the same set divided into subsets [3]. In this way it determines the change of information resulting from the division of the set into subsets. The usual interpretation of entropy as "the measure of uncertainty of the results of a test having different probabilities" is much vaguer.

Note that the Shannon identity can be written in a purely entropic form:

    $S_M = J_N + S_N = \sum_{i=1}^{N} p_i S_{m_i} + S_N = \mathrm{E}(S_{m_i}) + S_N$    (16)

where

    $S_M = \log M = -M\left(\frac{1}{M}\log\frac{1}{M}\right); \qquad S_{m_i} = I_i = \log m_i = -m_i\left(\frac{1}{m_i}\log\frac{1}{m_i}\right)$    (17)

are the entropy of the set M as a whole (the Hartley information) and the entropies of the subsets (the partial Hartley information).

On the other hand, as shown in [3], the minimal entropy of an integer set is attained at the most non-uniform distribution of elements across the subsets (all subsets except one contain only one element):

    $S_{\min} = \log M - \frac{M - N + 1}{M}\log(M - N + 1)$    (20)

In both cases the extreme value of the coentropy is determined from the Shannon identity.

Note that the entropy cannot be considered a characteristic of an individual set with a given structure. The entropy is ambiguous because of its constituent, the function $y = -p \ln p$, which has its maximum at $p = 1/e \approx 0.37$. Contributions to the entropy from terms with values of p greater and less than 0.37, and moreover from terms with very large and very small values of p, can be equal, and in the latter cases they are small. The terms with p values in the neighborhood of 0.37 give the largest contribution to the entropy. Therefore very different distributions can have the same value of entropy. For the same reasons this ambiguity is also characteristic of the coentropy. (Both eq. (20) and this ambiguity are illustrated in the sketch at the end of the section.)

Other variants of multiplicative functions may also be introduced, such as the functions

    $\prod_{i=1}^{N} q_i^{\,q_i}, \qquad \prod_{i=1}^{N} p_i^{\,q_i}, \qquad \prod_{i=1}^{N} q_i^{\,p_i}$

associated with the values $q_i = 1 - p_i$ sometimes used in probability theory. They lead to logarithmic identities with the components $q_i \log q_i$, $q_i \log p_i$ and $p_i \log q_i$. The function $-q \log q$ is symmetric to the function $-p \log p$ and has its maximum at $q = 1/e$, i.e. at $p = 1 - 1/e \approx 0.63$.
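Finally, as an illustration (a sketch of ours, not part of the paper; names are our own), the code below verifies the minimal-entropy formula (20) on an integer set and demonstrates the two-root property of $y(p) = -p \ln p$ that underlies the ambiguity of the entropy:

```python
# 1) Check of eq. (20): the most non-uniform split of M elements into N
#    subsets, {M-N+1, 1, ..., 1}, yields the minimal entropy.
# 2) Two-root property of y(p) = -p ln p, the source of the ambiguity of S.
# Illustrative sketch only; names are our own.
import math

def entropy(p):
    """S = -sum_i p_i log p_i, eq. (13), with natural logarithms."""
    return -sum(pi * math.log(pi) for pi in p)

M, N = 20, 5                             # 20 elements divided into 5 subsets
m = [M - N + 1] + [1] * (N - 1)          # most non-uniform integer split
S_direct = entropy([mi / M for mi in m])
S_min = math.log(M) - (M - N + 1) / M * math.log(M - N + 1)  # eq. (20)
assert math.isclose(S_direct, S_min)
print(f"S_min = {S_min:.6f} (direct computation: {S_direct:.6f})")

def y(p):
    """Entropy summand y(p) = -p ln p; maximal at p = 1/e ~ 0.37."""
    return -p * math.log(p)

# Find the second weight p' > 1/e with y(p') = y(0.15) by bisection,
# showing that two different weights contribute equally to S:
target, lo, hi = y(0.15), 1 / math.e, 1.0 - 1e-12
for _ in range(100):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if y(mid) > target else (lo, mid)
print(f"y(0.15) = y({lo:.4f}) = {target:.6f}")
```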
