Infinite Divisibility of Information

Cheuk Ting Li
Department of Information Engineering
The Chinese University of Hong Kong
Email:
[email protected]

Abstract

We study an information analogue of infinitely divisible probability distributions, where the i.i.d. sum is replaced by the joint distribution of an i.i.d. sequence. A random variable X is called informationally infinitely divisible if, for any n ≥ 1, there exists an i.i.d. sequence of random variables Z1,...,Zn that contains the same information as X, i.e., there exists an injective function f such that X = f(Z1,...,Zn). While no informationally infinitely divisible discrete random variable exists, we show that any discrete random variable X has a bounded multiplicative gap to infinite divisibility: if we remove the injectivity requirement on f, then there exist i.i.d. Z1,...,Zn and a function f satisfying X = f(Z1,...,Zn), with entropy satisfying H(X)/n ≤ H(Z1) ≤ 1.59H(X)/n + 2.43. We also study a new class of discrete probability distributions, called spectral infinitely divisible distributions, for which the multiplicative gap 1.59 can be removed. Furthermore, we study the case where X = (Y1,...,Ym) is itself an i.i.d. sequence, m ≥ 2, for which the multiplicative gap 1.59 can be replaced by 1 + 5√((log m)/m). This means that as m increases, (Y1,...,Ym) becomes closer to being spectral infinitely divisible in a uniform manner, which can be regarded as an information analogue of Kolmogorov's uniform theorem. Applications of our results include independent component analysis, distributed storage with a secrecy constraint, and distributed random number generation.
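To make the definition concrete, the simplest case where an exact (injective) split exists is a dyadic uniform X: if X is uniform on {0,...,2^k − 1} and n divides k, then X splits into n i.i.d. parts Z1,...,Zn, each uniform on k/n bits, with f the bit concatenation, and H(Z1) = H(X)/n exactly (the lower end of the entropy bound above). A minimal sketch, with hypothetical function names not taken from the paper:

```python
import itertools
import math


def split_uniform(k, n):
    """Split X ~ Uniform{0,...,2^k - 1}, with n | k, into n i.i.d. parts.

    Each Z_i is uniform on {0,...,2^(k/n) - 1}; f concatenates their
    bit patterns, so f is injective (in fact bijective).
    """
    assert k % n == 0
    b = k // n  # bits carried by each part Z_i

    def f(zs):
        x = 0
        for z in zs:
            x = (x << b) | z
        return x

    return b, f


# Example: k = 6 bits split into n = 3 i.i.d. parts of b = 2 bits each.
b, f = split_uniform(6, 3)
images = {f(zs) for zs in itertools.product(range(2 ** b), repeat=3)}
assert images == set(range(2 ** 6))  # f is injective onto the range of X
# H(Z1) = b bits = H(X)/n, matching the lower bound H(X)/n <= H(Z1).
assert b == math.log2(2 ** 6) / 3
```

The paper's point is that for a general discrete X no such exact split exists for every n, but dropping injectivity of f recovers it up to the stated multiplicative and additive gaps.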