
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. IT-20, NO. 4, JULY 1974

On the $\varepsilon$-Entropy and the Rate-Distortion Function of Certain Non-Gaussian Processes

JACOB BINIA, MOSHE ZAKAI, FELLOW, IEEE, AND JACOB ZIV, FELLOW, IEEE

Manuscript received June 5, 1973; revised January 2, 1974. J. Binia is with the Armament Development Authority, Ministry of Defense, Israel. M. Zakai and J. Ziv are with the Faculty of Engineering, Technion-Israel Institute of Technology, Israel.

Abstract--Let $\xi = \{\xi(t),\ 0 \le t \le T\}$ be a process with covariance function $K(s,t)$ and $E\int_0^T \xi^2(t)\,dt < \infty$. It is proved that for every $\varepsilon > 0$ the $\varepsilon$-entropy $H_\varepsilon(\xi)$ satisfies

$$H_\varepsilon(\xi_g) - \mathscr{H}_{\xi_g}(\xi) \le H_\varepsilon(\xi) \le H_\varepsilon(\xi_g)$$

where $\xi_g$ is a Gaussian process with the covariance $K(s,t)$ and $\mathscr{H}_{\xi_g}(\xi)$ is the entropy of the measure induced by $\xi$ (in function space) with respect to that induced by $\xi_g$. It is also shown that if $\mathscr{H}_{\xi_g}(\xi) < \infty$, then, as $\varepsilon \to 0$,

$$H_\varepsilon(\xi) = H_\varepsilon(\xi_g) - \mathscr{H}_{\xi_g}(\xi) + o(1).$$

Furthermore, if there exists a Gaussian process $g = \{g(t);\ 0 \le t \le T\}$ such that $\mathscr{H}_g(\xi) < \infty$, then the ratio between $H_\varepsilon(\xi)$ and $H_\varepsilon(g)$ goes to one as $\varepsilon$ goes to zero. Similar results are given for the rate-distortion function, and some particular examples are worked out in detail. Some cases for which $\mathscr{H}_g(\xi) = \infty$ are discussed, and asymptotic bounds on $H_\varepsilon(\xi)$, expressed in terms of $H_\varepsilon(\xi_g)$, are derived.

I. INTRODUCTION

THE $\varepsilon$-ENTROPY (and its related normalized form, the rate-distortion function) provides an important mathematical tool for the analysis of communication sources and systems. Given a communication system, the $\varepsilon$-entropy of the source and the channel capacity yield a lower bound on the minimum attainable distortion [1].

Let $\xi$ be a real-valued random variable and denote by $H_\varepsilon(\xi)$ the $\varepsilon$-entropy (rate-distortion function) of $\xi$, relative to a mean-square-error criterion. A well-known result of Shannon [2], [3] states that, if the probability distribution of $\xi$ possesses a density, then

$$h(\xi) + \tfrac{1}{2}\ln\frac{1}{2\pi e\varepsilon^2} \le H_\varepsilon(\xi) \le \tfrac{1}{2}\ln\frac{\sigma^2}{\varepsilon^2} \qquad (1)$$

where $\sigma^2$ denotes the variance of $\xi$ and $h(\xi) = -\int p_\xi(x)\ln p_\xi(x)\,dx$. Furthermore, it was shown by Gerrish and Schultheiss [4] that as $\varepsilon \to 0$

$$H_\varepsilon(\xi) = h(\xi) + \tfrac{1}{2}\ln\frac{1}{2\pi e\varepsilon^2} + o(1). \qquad (2)$$

Results of the type (1) and (2) have been extended to $N$-dimensional random variables, provided that the random variables possess a probability density [1], [4], [5]. The purpose of this paper is to derive results of the type (1) and (2) for random processes. A direct extension of (1) and (2) to random processes or to infinite-dimensional random variables is impossible, since those results are based on the existence of a probability density; namely, the probability measure of $\xi$ is required to be absolutely continuous with respect to the Lebesgue measure. In this paper we replace this requirement by a requirement of absolute continuity with respect to a Gaussian measure. This enables us to prove the following bounds on the $\varepsilon$-entropy of a random process $\xi = \{\xi(t),\ 0 \le t \le T\}$:

$$H_\varepsilon(\xi_g) - \mathscr{H}_{\xi_g}(\xi) \le H_\varepsilon(\xi) \le H_\varepsilon(\xi_g) \qquad (3)$$

where $\xi_g$ is the Gaussian process with the same covariance as that of $\xi$ and $\mathscr{H}_{\xi_g}(\xi)$ is the relative entropy of the measure induced by $\xi$ with respect to that induced by $\xi_g$ (cf. Section II). Furthermore, if $\mathscr{H}_{\xi_g}(\xi) < \infty$, then we show that for $\varepsilon \to 0$

$$H_\varepsilon(\xi) = H_\varepsilon(\xi_g) - \mathscr{H}_{\xi_g}(\xi) + o(1). \qquad (4)$$

In fact, for a finite-dimensional random variable $\xi$, the preceding lower bound on $H_\varepsilon(\xi)$, as well as the asymptotic behavior (4), are stronger than previously known results (the upper bound on $H_\varepsilon(\xi)$, for $N$-dimensional random variables, is already known [1, sec. 4.6.2]).

The results (3) and (4) on $H_\varepsilon(\xi)$ are expressed in terms of the $\varepsilon$-entropy of a related Gaussian process, $H_\varepsilon(\xi_g)$, and the entropy of the measure of $\xi$ with respect to the measure of the related Gaussian process $\xi_g$. Results on the $\varepsilon$-entropy of Gaussian processes are available in the literature [1], [6]-[10]. The entropy $\mathscr{H}_g(\xi)$ of $\xi$ with respect to some Gaussian process $g$ can be derived from known results on the Radon-Nikodym derivative of certain processes with respect to certain Gaussian processes [11]-[13]. It is shown that if there exists a Gaussian process $g$ for which $\mathscr{H}_g(\xi) < \infty$, then $\mathscr{H}_{\xi_g}(\xi) < \infty$. Therefore the bounds (3) also yield asymptotic results, for $\varepsilon \to 0$.
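As an aside for the reader, the scalar bounds (1) and the asymptotic (2) are easy to exercise numerically. The following minimal sketch (our illustration, not part of the original paper) evaluates both sides of (1), in nats, for a Laplacian source of variance $\sigma^2$, using the standard closed form $h(\xi) = 1 + \ln(\sqrt{2}\,\sigma)$ for its differential entropy.

```python
import numpy as np

# Evaluate Shannon's bounds (1) for a Laplacian source (variance sigma^2).
# Differential entropy of a Laplacian with variance sigma^2, in nats:
#   h = 1 + ln(sqrt(2) * sigma)      (standard closed form)
sigma = 1.0
h = 1.0 + np.log(np.sqrt(2.0) * sigma)

for eps in [0.5, 0.1, 0.01]:
    lower = h + 0.5 * np.log(1.0 / (2.0 * np.pi * np.e * eps**2))  # LHS of (1)
    upper = 0.5 * np.log(sigma**2 / eps**2)                        # RHS of (1)
    print(f"eps={eps:5.2f}  lower={lower:7.3f}  upper={upper:7.3f}"
          f"  gap={upper - lower:6.4f}")
```

The gap between the two bounds is $\tfrac{1}{2}\ln(2\pi e\sigma^2) - h(\xi)$ (about 0.072 nats here), a constant independent of $\varepsilon$; by (2), $H_\varepsilon(\xi)$ adheres to the lower bound as $\varepsilon \to 0$.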
Several examples are given, and in one of these examples we show that if $d\xi(t) = \zeta(t)\,dt + \beta\,dw(t)$ (where $w(t)$ is a standard Brownian motion), then as $\varepsilon \to 0$ the ratio of $H_\varepsilon(\xi)$ to the $\varepsilon$-entropy of the Wiener process $\beta w$ tends to one, which is a generalization of previously known results [1], [10].

Section II is devoted to notation and certain preliminary results. The main results and examples are given in Section III, and their proofs are given in Section IV.

II. PRELIMINARIES AND NOTATION

Let $P_1$ and $P_2$ be two probability measures defined on a measurable space $(\Omega,B)$ and let $\{E_i\}$ be a finite $B$-measurable partition of $\Omega$. The entropy $\mathscr{H}_{P_2}(P_1)$ of $P_1$ with respect to $P_2$ is defined by [14, ch. 2]

$$\mathscr{H}_{P_2}(P_1) = \sup \sum_i P_1(E_i)\ln\frac{P_1(E_i)}{P_2(E_i)}$$

where the supremum is taken over all partitions of $\Omega$. Note that the entropy is always nonnegative [14]. Let $P_\xi$ and $P_\eta$ be the distributions of the random variables $\xi$ and $\eta$ (taking values in the same space $(X,B_X)$), respectively. In this case the notation for the entropy will be

$$\mathscr{H}_\eta(\xi) = \mathscr{H}_\eta(P_\xi) = \mathscr{H}_{P_\eta}(P_\xi).$$

Let $\xi$ and $\eta$ be random variables with values in the measurable spaces $(X,B_X)$ and $(Y,B_Y)$, respectively. The mutual information $I(\xi,\eta)$ between these variables is defined as [14, ch. 2]

$$I(\xi,\eta) = \sup \sum_{i,j} P_{\xi\eta}(E_i \times F_j)\ln\frac{P_{\xi\eta}(E_i \times F_j)}{P_\xi(E_i)\,P_\eta(F_j)}$$

where the supremum is taken over all partitions $\{E_i\}$ of $X$ and $\{F_j\}$ of $Y$.

The $\varepsilon$-entropy $H_\varepsilon(\xi)$ of a random variable $\xi$ taking values in a metric space $(X,B_X)$ with a metric $\rho(x,y)$, $x,y \in X$, is then defined as

$$H_\varepsilon(\xi) = \inf I(\xi,\eta)$$

where the infimum is taken over all joint distributions $P_{\xi\eta}$ (defined on $(X \times X,\ B_X \times B_X)$) with the fixed marginal distribution $P_\xi$, such that $E\rho^2(\xi,\eta) \le \varepsilon^2$.
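For Gaussian sources the infimum in this definition is attained by the classical "reverse water-filling" construction over the Karhunen-Loeve eigenvalues of the covariance (see [1], [6]-[10]). The sketch below is our illustration, not the paper's algorithm; as a concrete input it uses the standard Wiener-process eigenvalues $\lambda_n = \left(2T/((2n-1)\pi)\right)^2$ on $[0,T]$, and the last printed column serves only as a consistency check of the $O(1/\varepsilon^2)$ growth mentioned in Section I.

```python
import numpy as np

# Reverse water-filling for the epsilon-entropy of a Gaussian process,
# evaluated on its Karhunen-Loeve eigenvalues.  Input: the Wiener-process
# eigenvalues lambda_n = (2T / ((2n-1) pi))^2, n = 1, 2, ...
T = 1.0
n = np.arange(1, 200_001)
lam = (2.0 * T / ((2 * n - 1) * np.pi)) ** 2

def eps_entropy(lam, eps2):
    """H_eps in nats: pick the water level theta so that
    sum(min(lambda_n, theta)) = eps^2, then add the positive half-logs."""
    lo, hi = 0.0, lam[0]
    for _ in range(200):                       # bisection on theta
        theta = 0.5 * (lo + hi)
        if np.minimum(lam, theta).sum() > eps2:
            hi = theta                         # water level too high
        else:
            lo = theta
    return 0.5 * np.log(lam[lam > theta] / theta).sum()

for eps in [0.2, 0.1, 0.05]:
    H = eps_entropy(lam, eps**2)
    print(f"eps={eps:4.2f}  H_eps={H:8.2f} nats   eps^2 * H_eps = {eps**2 * H:5.3f}")
```

The product $\varepsilon^2 H_\varepsilon$ flattens out as $\varepsilon$ decreases, consistent with $H_\varepsilon$ growing at a rate proportional to $1/\varepsilon^2$ for the Wiener process.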
The following lemma shows that the $\varepsilon$-entropy is independent of the representation of a process $\xi$ in different isometric spaces.

Lemma 2.1: Let $A$ be a linear, one-to-one, and bi-measurable transformation from the metric space $(X,B_X)$ onto the metric space $(\tilde X,B_{\tilde X})$, and let $P_{\tilde\xi}$ be the distribution of $\tilde\xi$ in $(\tilde X,B_{\tilde X})$. We define the distribution $P_\xi$ of $\xi$ by $P_\xi(E) = P_{\tilde\xi}(\tilde E)$, where $E = A^{-1}(\tilde E)$, $\tilde E \in B_{\tilde X}$. If the metrics $\rho$ and $\tilde\rho$, defined in $X$ and $\tilde X$, respectively, satisfy

$$\rho(x,y) = \tilde\rho(Ax,Ay), \qquad x,y \in X$$

then

$$H_\varepsilon(\xi) = H_\varepsilon(\tilde\xi).$$

The proof of Lemma 2.1 follows directly from the transformation properties, since such a transformation preserves the amount of information [15].

In particular, let $\xi = \{\xi(t),\ 0 \le t \le T\}$ be a stochastic process with $E\int_0^T \xi^2(t)\,dt < \infty$. Let $\{\varphi_i\}$ be a complete orthonormal system in $L^2[0,T]$, and consider the following transformation $A$ from $L^2[0,T]$ to $l^2$:

$$A\colon x(t) \to (x_1,x_2,\cdots), \qquad x(t) \in L^2[0,T]$$

where

$$x_i = \int_0^T x(t)\varphi_i(t)\,dt, \qquad i = 1,2,\cdots.$$

Then we have

$$H_\varepsilon(\xi) = H_\varepsilon\big((\xi_1,\xi_2,\cdots)\big).$$

Remarks: a) From now on, the discussion is limited to the metrics of $L^2$ and $l^2$. Although the results will be stated for processes in $L^2[0,T]$, they hold for all processes defined in spaces isometric to $L^2[0,T]$. b) The random function $\xi(t)$, $0 \le t \le T$, will be assumed to be measurable, separable, and satisfying $E\int_0^T \xi^2(t)\,dt < \infty$. Also, since the $\varepsilon$-entropy is independent of the process mean, we always assume $E\xi(t) = 0$.

Lemma 2.2: Let $\xi$, $\xi_n$, $n = 1,2,\cdots$, be a sequence of random processes such that

$$\lim_{n\to\infty} E\int_0^T [\xi(t) - \xi_n(t)]^2\,dt = 0. \qquad (5)$$

Then for every $\varepsilon > 0$

$$\lim_{n\to\infty} H_\varepsilon(\xi_n) = H_\varepsilon(\xi). \qquad (6)$$

Proof: We first show that if $\xi_1$ and $\xi_2$ are random processes such that

$$E\int_0^T [\xi_1(t) - \xi_2(t)]^2\,dt \le \theta^2 \qquad (7)$$

then

$$H_{\varepsilon+\theta}(\xi_2) \le H_\varepsilon(\xi_1) \le H_{\varepsilon-\theta}(\xi_2), \qquad \varepsilon \ge \theta > 0. \qquad (8)$$

By the definition of the $\varepsilon$-entropy, there exists a process $\eta$ such that

$$E\int_0^T [\xi_1(t) - \eta(t)]^2\,dt \le \varepsilon^2 \qquad (9)$$

and

$$H_\varepsilon(\xi_1) \ge I(\xi_1,\eta) - \delta \qquad (10)$$

where $\delta > 0$ is arbitrary. Note that we can choose a version of the preceding process $\eta$ such that $\eta$ and $\xi_2$ are conditionally independent, conditioned on $\xi_1$; therefore [14, sect. 3.4]

$$I(\xi_2,\eta) \le I\big((\xi_1,\xi_2),\eta\big) = I(\xi_1,\eta) + I(\xi_2,\eta \mid \xi_1) = I(\xi_1,\eta). \qquad (11)$$

Furthermore, by (7), (9), and the triangle inequality in $L^2$,

$$E\int_0^T [\xi_2(t) - \eta(t)]^2\,dt \le (\varepsilon + \theta)^2. \qquad (12)$$

Therefore, by (10)-(12),

$$H_{\varepsilon+\theta}(\xi_2) \le I(\xi_2,\eta) \le I(\xi_1,\eta) \le H_\varepsilon(\xi_1) + \delta.$$

This completes the proof of the left side of (8); the right side of (8) follows by a similar argument. Equation (6) follows from (5), (8), and the continuity of $H_\varepsilon(\xi)$ as a function of $\varepsilon$.

Remark: Let $\xi = \{\xi(t),\ 0 \le t \le T\}$ be a process with $E\int_0^T \xi^2(t)\,dt < \infty$. Suppose that $\xi$ is passed through a linear filter with impulse response

$$h(\tau) = \begin{cases} 1/\delta, & 0 < \tau \le \delta \\ 0, & \text{otherwise.} \end{cases}$$

Denote by $\xi_\delta$ the filter output in $[0,T]$. Then $\xi_\delta$ is obviously a quadratic-mean-continuous process and

$$\lim_{\delta\to 0} E\int_0^T [\xi(t) - \xi_\delta(t)]^2\,dt = 0.$$
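This remark can be checked by simulation. The rough discrete sketch below (our illustration, with an arbitrarily chosen grid and path count) passes Brownian-motion paths through the moving-average filter $h$ above and estimates the mean-square error, which should shrink roughly in proportion to $\delta$ (for Brownian motion it works out to about $T\delta/3$).

```python
import numpy as np

# Discrete check of the smoothing remark: filter a process with
#   h(tau) = 1/delta on (0, delta], 0 otherwise,
# and verify that E int_0^T [xi(t) - xi_delta(t)]^2 dt -> 0 as delta -> 0.
# Standard Brownian motion on [0,T] serves as the test process.
rng = np.random.default_rng(0)
T, N, paths = 1.0, 4000, 200
dt = T / N

for delta in [0.2, 0.05, 0.0125]:
    k = max(1, int(delta / dt))           # filter length in samples
    kernel = np.full(k, 1.0 / k)          # discrete analogue of 1/delta window
    err = 0.0
    for _ in range(paths):
        xi = np.cumsum(rng.normal(0.0, np.sqrt(dt), N))  # Brownian path
        # causal moving average over the last delta seconds; the first k
        # samples see a truncated window, a negligible edge effect here
        xi_d = np.convolve(xi, kernel)[:N]
        err += ((xi - xi_d) ** 2).sum() * dt
    print(f"delta={delta:7.4f}   E int (xi - xi_d)^2 dt ~ {err / paths:.5f}")
```

By Lemma 2.2, this mean-square convergence is exactly what guarantees $H_\varepsilon(\xi_\delta) \to H_\varepsilon(\xi)$, so the quadratic-mean-continuous smoothed versions can stand in for $\xi$ in $\varepsilon$-entropy computations.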