Power Domains and Iterated Function Systems

Abbas Edalat

Department of Computing
Imperial College of Science, Technology and Medicine
Queen's Gate
London SW7 2BZ, UK

(Submitted to Information and Computation)

Abstract

We introduce the notion of a weakly hyperbolic IFS on a compact metric space, which generalises that of a hyperbolic IFS. Based on a domain-theoretic model, which uses the Plotkin power domain and the probabilistic power domain respectively, we prove the existence and uniqueness of the attractor of a weakly hyperbolic IFS and of the invariant measure of a weakly hyperbolic IFS with probabilities, extending the classic results of Hutchinson for hyperbolic IFSs to this more general setting. We also present finite algorithms to obtain discrete and digitised approximations to the attractor and the invariant measure, extending the corresponding algorithms for hyperbolic IFSs. We then prove the existence and uniqueness of the invariant distribution of a weakly hyperbolic recurrent IFS and obtain an algorithm to generate the invariant distribution on the digitised screen. The generalised Riemann integral is used to provide a formula for the expected value of almost everywhere continuous functions with respect to this distribution. For hyperbolic recurrent IFSs and Lipschitz maps, one can estimate the integral up to any threshold of accuracy.

Introduction

The theory of iterated function systems (IFSs) has been an active area of research since the seminal work of Mandelbrot on fractals and self-similarity in nature in the late seventies and early eighties. The theory has found applications in diverse areas such as computer graphics, image compression, learning automata, neural nets and statistical physics. In this paper we will be mainly concerned with the basic theoretical work of Hutchinson and a number of algorithms based on this work. We start by briefly reviewing the classical work; see the references for a comprehensive introduction to iterated function systems and fractals.

Iterated Function Systems

An iterated function system (IFS) $\{X; f_1, f_2, \ldots, f_N\}$ on a topological space $X$ is given by a finite set of continuous maps $f_i : X \to X$, $1 \le i \le N$. If $X$ is a complete metric space and the maps $f_i$ are all contracting, then the IFS is said to be hyperbolic. For a complete metric space $X$, let $\mathbf{H}X$ be the complete metric space of all non-empty compact subsets of $X$ with the Hausdorff metric $d_H$ defined by
\[ d_H(A, B) = \inf\{\epsilon \ge 0 \mid B \subseteq A_\epsilon \text{ and } A \subseteq B_\epsilon\}, \]
where, for a non-empty compact subset $C \subseteq X$ and $\epsilon \ge 0$, the set
\[ C_\epsilon = \{x \in X \mid \exists y \in C.\ d(x, y) \le \epsilon\} \]
is the $\epsilon$-parallel body of $C$.

A hyperbolic IFS induces a map
\[ F : \mathbf{H}X \to \mathbf{H}X \]
defined by $F(A) = f_1(A) \cup f_2(A) \cup \cdots \cup f_N(A)$. In fact, $F$ is also contracting, with contractivity factor $s = \max_i s_i$, where $s_i$ is the contractivity factor of $f_i$, $1 \le i \le N$. The number $s$ is called the contractivity of the IFS. By the contracting mapping theorem, $F$ has a unique fixed point $A^*$ in $\mathbf{H}X$, which is called the attractor of the IFS, and we have
\[ A^* = \lim_{n \to \infty} F^n(A) \]
for any non-empty compact subset $A \subseteq X$. The attractor is also called a self-similar set.

For applications in computer graphics and image compression, it is assumed that $X$ is the plane $\mathbb{R}^2$ and that the maps are contracting affine transformations. Then the attractor is usually a fractal, i.e. it has a fine, complicated and non-smooth local structure, some form of self-similarity and, usually, a non-integral Hausdorff dimension. A finite algorithm to generate a discrete approximation to the attractor was first obtained by Hepting et al.; it is described below in the section on the Plotkin power domain algorithm.
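To illustrate the convergence $A^* = \lim_{n\to\infty} F^n(A)$ concretely, the following Python sketch iterates the map $F$ on a finite set of points for a hyperbolic affine IFS on $\mathbb{R}^2$; the particular maps are made up for the illustration and are not part of the algorithms developed later.

```python
import numpy as np

def apply_F(points, maps):
    """One step of the Hutchinson map F(A) = f_1(A) ∪ ... ∪ f_N(A)
    on a finite point set approximating A."""
    return np.concatenate([W @ points.T + t[:, None] for W, t in maps], axis=1).T

# Illustrative hyperbolic IFS: three affine contractions of R^2 (made up).
maps = [(np.eye(2) * 0.5, np.array(t)) for t in ([0.0, 0.0], [0.5, 0.0], [0.25, 0.5])]

A = np.array([[0.3, 0.3]])          # any non-empty starting set
for _ in range(10):                  # F^n(A) converges to the attractor A*
    A = apply_F(A, maps)
print(len(A), "points approximating the attractor")
```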

IFS with Probabilities

There is also a probabilistic version of the theory, which produces invariant probability distributions and, as a result, coloured images in computer graphics. A hyperbolic IFS with probabilities $\{X; f_1, \ldots, f_N; p_1, \ldots, p_N\}$ is a hyperbolic IFS $\{X; f_1, f_2, \ldots, f_N\}$, with $X$ a compact metric space, such that each $f_i$, $1 \le i \le N$, is assigned a probability $p_i$ with
\[ p_i > 0 \quad \text{and} \quad \sum_{i=1}^{N} p_i = 1. \]
Then the Markov operator
\[ T : M^1X \to M^1X \]
is defined on the set $M^1X$ of normalised Borel measures on $X$. It takes a Borel measure $\mu \in M^1X$ to a Borel measure $T\mu \in M^1X$ given by
\[ (T\mu)(B) = \sum_{i=1}^{N} p_i\, \mu\big(f_i^{-1}B\big) \]
for any Borel subset $B \subseteq X$. When $X$ is compact, the Hutchinson metric $r_H$ can be defined on $M^1X$ as follows:
\[ r_H(\mu, \nu) = \sup\Big\{ \int_X f \, d\mu - \int_X f \, d\nu \;\Big|\; f : X \to \mathbb{R},\ |f(x) - f(y)| \le d(x, y),\ x, y \in X \Big\}. \]
Then, using some Banach space theory including Alaoglu's theorem, it is shown that the weak topology and the Hutchinson metric topology on $M^1X$ coincide, thereby making $(M^1X, r_H)$ a compact metric space. If the IFS is hyperbolic, $T$ will be a contracting map. The unique fixed point $\mu^*$ of $T$ then defines a probability distribution on $X$ whose support is the attractor of $\{X; f_1, \ldots, f_N\}$. The measure $\mu^*$ is also called a self-similar measure or a multifractal. When $X \subseteq \mathbb{R}^n$, this invariant distribution gives different point densities in different regions of the attractor and, using a colouring scheme, one can colour the attractor accordingly. A finite algorithm to generate a discrete approximation to this invariant measure and a formula for the value of the integral of a continuous function with respect to this measure were also obtained; they are described in later sections.
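On finitely supported measures the Markov operator $T$ can be applied directly: each atom of mass $w$ at $x$ contributes an atom of mass $p_i w$ at $f_i(x)$. The following Python sketch illustrates this; the maps and probabilities are made up for the example.

```python
from collections import defaultdict

def markov_step(measure, maps, probs):
    """One application of the Markov operator T to a finitely supported
    measure {point: weight}: each atom of weight w at x is split into
    atoms of weight p_i * w at f_i(x)."""
    out = defaultdict(float)
    for x, w in measure.items():
        for f, p in zip(maps, probs):
            out[f(x)] += p * w
    return dict(out)

# Illustrative IFS with probabilities on [0, 1] (maps/probabilities made up).
maps = [lambda x: x / 2, lambda x: x / 2 + 0.5]
probs = [0.3, 0.7]

nu = {0.0: 1.0}                      # start with a point mass
for _ in range(8):
    nu = markov_step(nu, maps, probs)
print(sum(nu.values()))              # total mass stays 1
```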

The random iteration algorithm for an IFS with probabilities is based on the following ergodic theorem of Elton. Let $\{X; f_1, \ldots, f_N; p_1, \ldots, p_N\}$ be an IFS with probabilities on the compact metric space $X$ and let $x_0 \in X$ be any initial point. Put $\Sigma = \{1, 2, \ldots, N\}$ with the discrete topology. Choose $i_1 \in \Sigma$ at random, such that $i$ is chosen with probability $p_i$, and let $x_1 = f_{i_1}(x_0)$. Repeat to obtain $i_2$ and $x_2 = f_{i_2}(x_1) = f_{i_2} f_{i_1}(x_0)$. In this way, construct the sequence $\langle x_n \rangle_{n \ge 0}$. Suppose $B$ is a Borel subset of $X$ such that $\mu^*(\partial B) = 0$, where $\mu^*$ is the invariant measure of the IFS and $\partial B$ is the boundary of $B$. Let $L(n, B)$ be the number of points in the set $\{x_0, x_1, \ldots, x_n\} \cap B$. Then Elton's theorem says that, with probability one (i.e. for almost all sequences $\langle x_n \rangle_{n \ge 0}$), we have
\[ \mu^*(B) = \lim_{n \to \infty} \frac{L(n, B)}{n + 1}. \]
Moreover, for all continuous functions $g : X \to \mathbb{R}$, we have the following convergence with probability one:
\[ \int g \, d\mu^* = \lim_{n \to \infty} \frac{\sum_{i=0}^{n} g(x_i)}{n + 1}, \]
which gives the expected value of $g$.
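The random iteration algorithm itself is straightforward to implement. The following Python sketch generates an orbit and estimates $\mu^*(B)$ by the frequency $L(n, B)/(n+1)$, in the spirit of Elton's theorem; the IFS and the set $B$ are made up for the illustration.

```python
import random

def random_iteration(maps, probs, n_steps, x0=0.0, seed=0):
    """Random iteration ('chaos game'): choose f_i with probability p_i and
    iterate; by Elton's theorem the orbit distributes like the invariant measure."""
    rng = random.Random(seed)
    xs, x = [], x0
    for _ in range(n_steps):
        (f,) = rng.choices(maps, weights=probs)
        x = f(x)
        xs.append(x)
    return xs

# Illustrative IFS with probabilities on [0, 1] (maps/probabilities made up).
maps = [lambda x: x / 2, lambda x: x / 2 + 0.5]
probs = [0.3, 0.7]
orbit = random_iteration(maps, probs, 100_000)
# Empirical frequency of landing in B = [0.5, 1]; approximates mu*(B).
print(sum(1 for x in orbit if x >= 0.5) / len(orbit))
```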

Recurrent IFS

Recurrent iterated function systems generalise IFSs with probabilities as follows. Let $X$ be a compact metric space and $\{X; f_1, f_2, \ldots, f_N\}$ a hyperbolic IFS. Let $(p_{ij})$ be an indecomposable $N \times N$ row-stochastic matrix, i.e.
\[ \sum_{j=1}^{N} p_{ij} = 1 \ \text{for all } i; \qquad p_{ij} \ge 0 \ \text{for all } i, j; \]
and, for all $i, j$, there exist $i_1, i_2, \ldots, i_n$ with $i_1 = i$ and $i_n = j$ such that $p_{i_1 i_2} p_{i_2 i_3} \cdots p_{i_{n-1} i_n} > 0$.

Then $\{X; f_j, p_{ij} \mid i, j = 1, \ldots, N\}$ is called a hyperbolic recurrent IFS. For a hyperbolic recurrent IFS, consider a random walk on $X$ as follows. Specify a starting point $x_0 \in X$ and a starting code $i_0$. Pick a number $i_1$ such that $p_{i_0 j}$ is the conditional probability that $j$ is chosen, and define $x_1 = f_{i_1}(x_0)$. Then pick $i_2$ such that $p_{i_1 j}$ is the conditional probability that $j$ is chosen, and put $x_2 = f_{i_2}(x_1) = f_{i_2} f_{i_1}(x_0)$. Continue to obtain the sequence $\langle x_n \rangle_{n \ge 0}$. The distribution of this sequence converges with probability one to a measure on $X$, called the stationary distribution of the hyperbolic recurrent IFS. This generalises the theory of hyperbolic IFSs with probabilities: in fact, if $p_{ij} = p_j$ is independent of $i$, then we obtain a hyperbolic IFS with probabilities, the stationary distribution is then just the invariant measure, and the random walk above reduces to the random iteration algorithm. The first practical software system for fractal image compression, Barnsley's VRIFS (Vector Recurrent Iterated Function System), which is an interactive image modelling system, is based on hyperbolic recurrent IFSs.
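The random walk of a recurrent IFS is equally easy to simulate. The following Python sketch is a minimal implementation, with a made-up two-map recurrent IFS and row-stochastic matrix.

```python
import random

def recurrent_random_walk(maps, P, n_steps, x0=0.0, i0=0, seed=0):
    """Random walk of a recurrent IFS: from state (x, i), pick the next index j
    with probability P[i][j] and move to (f_j(x), j)."""
    rng = random.Random(seed)
    x, i, xs = x0, i0, []
    for _ in range(n_steps):
        j = rng.choices(range(len(maps)), weights=P[i])[0]
        x, i = maps[j](x), j
        xs.append(x)
    return xs

# Illustrative recurrent IFS on [0, 1]: two maps and a row-stochastic matrix
# (all values are made up for the sketch).
maps = [lambda x: x / 2, lambda x: x / 2 + 0.5]
P = [[0.9, 0.1],
     [0.4, 0.6]]
walk = recurrent_random_walk(maps, P, 100_000)
```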

Weakly hyperbolic IFS

In earlier work, power domains were used to construct domain-theoretic models for IFSs and IFSs with probabilities. It was shown that the attractor of a hyperbolic IFS on a compact metric space is obtained as the unique fixed point of a continuous function on the Plotkin power domain of the upper space. Similarly, the invariant measure of a hyperbolic IFS with probabilities on a compact metric space is the fixed point of a continuous function on the probabilistic power domain of the upper space.

We will here introduce the notion of a weakly hyperbolic IFS. Our definition is motivated by a number of applications, for example in neural nets, where one encounters IFSs which are not hyperbolic. This situation can arise, for example, in a compact interval $X \subseteq \mathbb{R}$ if the IFS contains a smooth map $f : X \to X$ satisfying $|f'(x)| \le 1$ but not $|f'(x)| < 1$.

Let $(X, d)$ be a compact metric space; we denote the diameter of any set $a \subseteq X$ by $|a| = \sup\{d(x, y) \mid x, y \in a\}$. As before, let $\Sigma = \{1, 2, \ldots, N\}$ with the discrete topology, and let $\Sigma^\omega$ be the set of all infinite sequences $i_1 i_2 i_3 \cdots$ ($i_n \in \Sigma$ for $n \ge 1$) with the product topology.

Definition. An IFS $\{X; f_1, f_2, \ldots, f_N\}$ is weakly hyperbolic if for all infinite sequences $i_1 i_2 \cdots \in \Sigma^\omega$ we have $\lim_{n \to \infty} |f_{i_1} f_{i_2} \cdots f_{i_n}(X)| = 0$.

Weakly hyperbolic IFSs generalise hyperbolic IFSs, since clearly a hyperbolic IFS is weakly hyperbolic. One similarly defines a weakly hyperbolic IFS with probabilities and a weakly hyperbolic recurrent IFS.

Example. The IFS $\{[0, 1]; f_1, f_2\}$ with $f_1, f_2$ defined by
\[ f_1(x) = \frac{x}{1 + x} \]
and $f_2(x) = f_1(x) + 1/2$ is not hyperbolic, since $f_1'(0) = f_2'(0) = 1$, but can be shown to be weakly hyperbolic.

Since for a weakly hyperbolic IFS the map $F : \mathbf{H}X \to \mathbf{H}X$ is not necessarily contracting, one needs a different approach to prove the existence and uniqueness of the attractor in this more general setting. In this paper, we will use the domain-theoretic model to extend the results of Hutchinson, those of Hepting et al. and those mentioned above to weakly hyperbolic IFSs and weakly hyperbolic IFSs with probabilities. We will then prove the existence and uniqueness of the invariant distribution of a weakly hyperbolic recurrent IFS and obtain a finite algorithm to generate this invariant distribution on a digitised screen. We also deduce a formula for the expected value of an almost everywhere continuous function with respect to this distribution, and a simple expression for the expected value of any Lipschitz map, up to any given threshold of accuracy, with respect to the invariant distribution of a hyperbolic recurrent IFS.

The domain-theoretic framework of IFSs, we will show, has the unifying feature that several aspects of the theory of IFSs, namely (a) the proof of existence and uniqueness of the attractor of a weakly hyperbolic IFS and that of the invariant measure of a weakly hyperbolic IFS with probabilities or recurrent IFS, (b) the finite algorithms to approximate the attractor and the invariant measures, (c) the complexity analyses of these algorithms, and (d) the computation of the expected value of almost everywhere continuous functions or Lipschitz functions with respect to these invariant measures, are all integrated uniformly within the domain-theoretic model.

Notation and Terminology

We recall the basic definitions in the theory of continuous posets (poset = partially ordered set).

A non-empty subset $A \subseteq P$ of a poset $(P, \sqsubseteq)$ is directed if for any pair of elements $x, y \in A$ there is $z \in A$ with $x, y \sqsubseteq z$. A directed complete partial order (dcpo) is a partial order in which every directed subset $A$ has a least upper bound (lub), denoted by $\bigsqcup A$.

An open set $O \subseteq D$ of the Scott topology of a dcpo $D$ is a set which is upward closed (i.e. $x \in O$ and $x \sqsubseteq y$ imply $y \in O$) and is inaccessible by lubs of directed sets (i.e. $\bigsqcup A \in O$ implies $\exists x \in A.\ x \in O$). It can be shown that a function $f : D \to E$ from a dcpo $D$ to another one $E$ is continuous with respect to the Scott topology iff it is monotone (i.e. $x \sqsubseteq y$ implies $f(x) \sqsubseteq f(y)$) and preserves lubs of directed sets, i.e. $f(\bigsqcup_{i \in I} x_i) = \bigsqcup_{i \in I} f(x_i)$, where $\{x_i \mid i \in I\}$ is any directed subset of $D$. From this it follows that a continuous function $f : D \to D$ on a dcpo $D$ with least element (bottom) $\bot$ has a least fixed point given by $\bigsqcup_n f^n(\bot)$.

Given two elements $x, y$ in a dcpo $D$, we say $x$ is way-below $y$, denoted by $x \ll y$, if whenever $y \sqsubseteq \bigsqcup A$ for a directed set $A$, then there is $a \in A$ with $x \sqsubseteq a$. We say that a subset $B \subseteq D$ is a basis for $D$ if for each $d \in D$ the set $A$ of elements of $B$ way-below $d$ is directed and $d = \bigsqcup A$. We say $D$ is continuous if it has a basis; it is $\omega$-continuous if it has a countable basis.

The product of ($\omega$-)continuous dcpos is an ($\omega$-)continuous dcpo, in which the Scott topology and the product topology coincide. An $\omega$-algebraic dcpo is an $\omega$-continuous dcpo with a countable basis $B$ satisfying $b \ll b$ for all $b \in B$.

For any map $f : D \to E$, any point $x \in D$, any subset $A \subseteq D$ and any subset $B \subseteq E$, we denote, whenever more convenient, the image of $x$ by $fx$ instead of $f(x)$, the forward image of $A$ by $fA$ instead of $f(A)$, and the preimage of $B$ by $f^{-1}B$ instead of $f^{-1}(B)$. The lattice of open sets of a topological space $X$ is denoted by $\Omega(X)$. For a compact metric space $X$ and $c > 0$, we denote by $M^cX$ the set of all Borel measures $\mu$ on $X$ with $\mu(X) = c$.

A Domain-theoretic Model

We start by presenting the domain-theoretic framework for studying IFSs.

The Upper Space

Let $X$ be a compact space. The upper space $\mathbf{U}X$ of $X$ consists of all non-empty compact subsets of $X$, ordered by reverse inclusion. We recall the following properties of the upper space. The partial order $\mathbf{U}X$ is a continuous dcpo with a bottom element, namely $X$, in which the least upper bound (lub) of a directed set of compact subsets is their intersection. The way-below relation $B \ll C$ holds if and only if $B$ contains a neighbourhood of $C$. The Scott topology on $\mathbf{U}X$ has a basis given by the collections $\Box a = \{C \in \mathbf{U}X \mid C \subseteq a\}$, $a \in \Omega(X)$. The singleton map
\[ s : X \to \mathbf{U}X, \qquad x \mapsto \{x\}, \]
embeds $X$ onto the set $sX$ of maximal elements of $\mathbf{U}X$. Any continuous map $f : X \to Y$ of compact metric spaces induces a Scott-continuous map $\mathbf{U}f : \mathbf{U}X \to \mathbf{U}Y$ defined by $\mathbf{U}f(C) = f(C)$; to keep the notation simple, we will write $\mathbf{U}f$ simply as $f$. If $X$ is in fact a compact metric space, then $\mathbf{U}X$ is an $\omega$-continuous dcpo and has a countable basis consisting of finite unions of closures of relatively compact open sets of $X$. Note that the two topological spaces $\mathbf{U}X$ and $(\mathbf{H}X, d_H)$ have the same elements (the non-empty compact subsets) but different topologies.

Prop osition If fX f f f g is an IFS on a compact space X

N

then the map

F UX UX

A f A f A f A

N

is Scottcontinuous and has therefore a least xed point namely

G

 n n

A F X F X

n n

For convenience we use the same notation for the map F HX HX as

in Equation and the map F UX UX ab ove as they are dened in



exactly the same way Since the ordering in UX is reverse inclusion A is the

 

largest compact subset of X with F A A However in order to obtain

a satisfactory result on the uniqueness of this xed p oint and in order to

formulate a suitable theory of IFS with probabilities we need to assume that

X is a metric space

On the other hand if X is a lo cally compact complete metric space

and fX f f g an hyp erb olic IFS then there exists a nonempty regular

N

 

compact set A such that F A f A f A f A A where A is

N

the interior of A see Lemma The unique attractor of the IFS will

then lie in A and therefore we can simply work with the IFS fA f f g

N

n

In particular if X is R with the Euclidean metric and s s is the

i i

contractivity factor of f i N then it is easy to check that we have

i

F A A where A is any closed ball of radius R centred at the origin O with

dO f O

i

R max

i

s

i

where d is the Euclidean metric Therefore as far as an hyp erb olic IFS on

a lo cally compact complete metric space is concerned there is no loss of

generality if we assume that the underlying space X is a compact metric space

We will make this assumption from now on

1

A regular closed set is one which is equal to the closure of its interior
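For a concrete instance of the bound on $R$, the following Python sketch computes $\max_i d(O, f_i(O)) / (1 - s_i)$ for contracting affine maps $f_i(z) = W_i z + t_i$, taking $s_i$ to be the operator norm of $W_i$; the maps are made up for the example.

```python
import numpy as np

def bounding_radius(maps):
    """Radius R of a closed ball around the origin O with F(A) ⊆ A,
    using R = max_i d(O, f_i(O)) / (1 - s_i) for contracting affine maps
    f_i(z) = W_i z + t_i with contractivity s_i = ||W_i||_2 < 1."""
    R = 0.0
    for W, t in maps:
        s = np.linalg.norm(W, 2)                   # operator norm = contractivity
        assert s < 1, "map must be contracting"
        R = max(R, np.linalg.norm(t) / (1 - s))    # d(O, f_i(O)) = ||t_i||
    return R

# Illustrative affine IFS on R^2 (matrices/translations are made up).
maps = [(np.array([[0.5, 0.0], [0.0, 0.5]]), np.array([1.0, 2.0])),
        (np.array([[0.4, 0.1], [0.0, 0.3]]), np.array([-1.0, 0.5]))]
print(bounding_radius(maps))
```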

Let $X$ be a compact metric space and let $\{X; f_1, \ldots, f_N\}$ be an IFS. The IFS generates a finitely branching tree, depicted below, which we call the IFS tree: the root is $X$, its children are $f_1(X), \ldots, f_N(X)$, and, in general, the children of a node $f_{i_1} \cdots f_{i_n}(X)$ are the nodes $f_{i_1} \cdots f_{i_n} f_i(X)$, $1 \le i \le N$. Note that each node is a subset of its parent node, and therefore the diameters of the nodes decrease along each infinite branch of the tree. The IFS tree plays a fundamental role in the domain-theoretic framework for IFSs. As we will see, all the results in this paper are based on various properties of this tree; these include the existence and uniqueness of the attractor of a weakly hyperbolic IFS, the algorithm to obtain a discrete approximation to the attractor, the existence and uniqueness of the invariant measure of a weakly hyperbolic IFS with probabilities, the algorithm to generate this measure on a digitised screen, the corresponding results for recurrent IFSs, and the formula for the expected value of an almost everywhere continuous function with respect to the invariant distribution of a weakly hyperbolic recurrent IFS.

Figure: The IFS tree, with root $X$, children $f_1(X), \ldots, f_N(X)$, grandchildren $f_1 f_1(X), \ldots, f_N f_N(X)$, and so on.

We will now use this tree to obtain some equivalent characterisations of a weakly hyperbolic IFS, as given in the definition above.

Proposition. For an IFS $\{X; f_1, \ldots, f_N\}$ on a compact metric space $X$, the following are equivalent:

(i) The IFS is weakly hyperbolic.

(ii) For each infinite sequence $i_1 i_2 \cdots \in \Sigma^\omega$, the intersection $\bigcap_n f_{i_1} f_{i_2} \cdots f_{i_n}(X)$ is a singleton set.

(iii) For all $\epsilon > 0$ there exists $n \ge 1$ such that $|f_{i_1} f_{i_2} \cdots f_{i_n}(X)| < \epsilon$ for all finite sequences $i_1 i_2 \cdots i_n$ of length $n$.

Proof. The implications (i) $\Rightarrow$ (ii), (ii) $\Rightarrow$ (i) and (iii) $\Rightarrow$ (i) are all straightforward. It remains to show (i) $\Rightarrow$ (iii). Assume that the IFS does not satisfy (iii). Then there exists $\epsilon > 0$ such that for all $n \ge 1$ there is a node on level $n$ of the IFS tree with diameter at least $\epsilon$. Since the parent of any such node will also have diameter at least $\epsilon$, we obtain a finitely branching infinite subtree all of whose nodes have diameter at least $\epsilon$. By König's lemma, this subtree will have an infinite branch $\langle f_{i_1} f_{i_2} \cdots f_{i_n}(X) \rangle_n$. Therefore the sequence $\langle |f_{i_1} f_{i_2} \cdots f_{i_n}(X)| \rangle_n$ does not converge to zero as $n \to \infty$, and the IFS is not weakly hyperbolic. □
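Condition (iii) can also be checked numerically for small examples. The following Python sketch computes the maximum diameter of the level-$n$ nodes of the IFS tree for increasing maps on $[0, 1]$ (for which the image of an interval is the interval between the images of its endpoints); the maps used are illustrative only.

```python
from itertools import product

def max_level_diameter(maps, n, lo=0.0, hi=1.0):
    """Maximum diameter of the level-n nodes f_{i_1}...f_{i_n}([lo, hi]) of the
    IFS tree, assuming each map is increasing on the interval."""
    worst = 0.0
    for seq in product(maps, repeat=n):
        a, b = lo, hi
        for f in reversed(seq):          # innermost map is applied first
            a, b = f(a), f(b)
        worst = max(worst, b - a)
    return worst

# Illustrative maps on [0, 1] that are non-expansive but not contractions
# (made up for the sketch).
maps = [lambda x: x / (1 + x), lambda x: (x + 1) / 2]
for n in (2, 4, 8, 12):
    print(n, max_level_diameter(maps, n))   # tends to 0 for a weakly hyperbolic IFS
```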

Corollary. If the IFS is weakly hyperbolic, then for any sequence $i_1 i_2 \cdots \in \Sigma^\omega$ the sequence $\langle f_{i_1} f_{i_2} \cdots f_{i_n}(x) \rangle_n$ converges for any $x \in X$, and the limit is independent of $x$. Moreover, the mapping
\[ \pi : \Sigma^\omega \to X, \qquad i_1 i_2 \cdots \mapsto \lim_{n \to \infty} f_{i_1} f_{i_2} \cdots f_{i_n}(x), \]
is continuous, and its image is $A^\dagger = \bigcap_n F^n(X)$. □

An IFS $\{X; f_1, \ldots, f_N\}$ also generates another finitely branching tree, which we call the action tree. Here a child of a node is the image of the node under the action of some $f_i$: the root is $X$, its children are $f_1(X), \ldots, f_N(X)$, and the children of a node $A$ are $f_1(A), \ldots, f_N(A)$, so that the level-$n$ nodes are the sets $f_{i_n} f_{i_{n-1}} \cdots f_{i_1}(X)$.

Figure: The action tree.

Note that the IFS tree and the action tree have the same set of nodes on any level $n$.

Corollary. If the IFS is weakly hyperbolic, then $\lim_{n \to \infty} |f_{i_n} f_{i_{n-1}} \cdots f_{i_1}(X)| = 0$ for all infinite sequences $i_1 i_2 \cdots \in \Sigma^\omega$.

Conversely, we have the following.

Proposition. If each mapping $f_i$ in an IFS is non-expansive and $\lim_{n \to \infty} |f_{i_n} f_{i_{n-1}} \cdots f_{i_1}(X)| = 0$ for all infinite sequences $i_1 i_2 \cdots \in \Sigma^\omega$, then the IFS is weakly hyperbolic.

Proof. Assume that the IFS is not weakly hyperbolic. Then, by condition (iii) of the proposition above, there exists $\epsilon > 0$ such that for each $n \ge 1$ there is a node $f_{i_n} f_{i_{n-1}} \cdots f_{i_1}(X)$ on level $n$ of the action tree with diameter at least $\epsilon$. Since, by assumption, $f_{i_n}$ is non-expansive, it follows that the parent node $f_{i_{n-1}} \cdots f_{i_1}(X)$ also has diameter at least $\epsilon$. We then have a finitely branching infinite subtree with nodes of diameter at least $\epsilon$. Therefore, by König's lemma, the action tree has an infinite branch with nodes of diameter at least $\epsilon$, which gives a contradiction. □

By the proposition above, we already know that a weakly hyperbolic IFS has a fixed point given by $A^\dagger = \bigsqcup_n F^n(X) = \bigcap_n F^n(X)$. Note that $F^n(X)$ is the union of the nodes of the IFS tree on level $n$, and that $A^\dagger$ is the set of lubs of all infinite branches of this tree. Such a set is an example of a finitely generable subset of the continuous dcpo $\mathbf{U}X$, as it is obtained from a finitely branching tree of elements of $\mathbf{U}X$. This gives us the motivation to study the Plotkin power domain of $\mathbf{U}X$, which can be presented precisely by the set of finitely generable subsets of $\mathbf{U}X$. We will then use the Plotkin power domain to prove the uniqueness of the fixed point of a weakly hyperbolic IFS and deduce its other properties.

Finitely Generable Sets

The following construction of the Plotkin power domain of an $\omega$-continuous dcpo and the subsequent properties are a straightforward generalisation of those for an $\omega$-algebraic cpo presented in the literature. Suppose $(D, \sqsubseteq)$ is any $\omega$-continuous dcpo and $B \subseteq D$ a countable basis for it. Consider any finitely branching tree whose branches are all infinite, whose nodes are elements of $D$, and in which each child $y$ of any parent node $x$ satisfies $x \sqsubseteq y$. The set of lubs of all branches of the tree is called a finitely generable subset of $D$. It can be shown that any finitely generable subset of $D$ can also be generated, in the above way, by a finitely branching tree of elements of the basis $B$ such that each node is way-below its children. We denote the set of finitely generable subsets of $D$ by $F(D)$. It is easily seen that $P_f(B) \subseteq P_f(D) \subseteq F(D)$, where $P_f(S)$ denotes the set of all finite subsets of the set $S$. For $A \in P_f(B)$ and $C \in F(D)$, the Egli-Milner order $\sqsubseteq_{EM}$ is defined by $A \sqsubseteq_{EM} C$ iff
\[ \forall a \in A\ \exists c \in C.\ a \sqsubseteq c \quad \text{and} \quad \forall c \in C\ \exists a \in A.\ a \sqsubseteq c. \]
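For finite sets, the Egli-Milner condition is directly checkable. The following Python sketch tests $A \sqsubseteq_{EM} C$ for an arbitrary order relation, here instantiated with compact intervals of $[0, 1]$ ordered by reverse inclusion (the sets are made up for the example).

```python
def egli_milner_leq(A, C, leq):
    """Egli-Milner order on finite sets: A ⊑_EM C iff every a ∈ A is below some
    c ∈ C and every c ∈ C is above some a ∈ A, for a given order 'leq'."""
    return (all(any(leq(a, c) for c in C) for a in A) and
            all(any(leq(a, c) for a in A) for c in C))

# Illustrative check in U([0,1]): compact intervals ordered by reverse inclusion.
leq = lambda a, c: a[0] <= c[0] and c[1] <= a[1]      # a ⊑ c iff c ⊆ a
A = [(0.0, 0.6), (0.4, 1.0)]
C = [(0.0, 0.3), (0.1, 0.5), (0.5, 1.0)]
print(egli_milner_leq(A, C, leq))   # True
```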

It is easily seen that $\sqsubseteq_{EM}$ is a preorder (reflexive and transitive, but not necessarily antisymmetric). It can be extended to a preorder on $F(D)$ by defining $C_1 \sqsubseteq_{EM} C_2$ iff, for all $A \in P_f(B)$, whenever $A \sqsubseteq_{EM} C_1$ holds, we have $A \sqsubseteq_{EM} C_2$. Then $(F(D), \sqsubseteq_{EM})$ becomes an $\omega$-continuous dcpo, except that $\sqsubseteq_{EM}$ is a preorder rather than a partial order. A basis is given by $(P_f(D), \sqsubseteq_{EM})$ and a countable basis by $(P_f(B), \sqsubseteq_{EM})$. The Plotkin power domain, or convex power domain, $CD$ of $D$ is then defined to be the quotient $(F(D), \sqsubseteq_{EM})/\!\approx$, where the equivalence relation $\approx$ on $F(D)$ is given by $C_1 \approx C_2$ iff $C_1 \sqsubseteq_{EM} C_2$ and $C_2 \sqsubseteq_{EM} C_1$. If $A \in F(D)$ and $A$ consists of maximal elements of $D$, then $A$ will be a maximal element of $(F(D), \sqsubseteq_{EM})$, and its equivalence class will consist of $A$ only. If $D$ has a bottom element $\bot$, then $(F(D), \sqsubseteq_{EM})$ has a bottom element, namely $\{\bot\}$, and its equivalence class consists of $\{\bot\}$ only. Finally, we note that for any dcpo $E$, any monotone map $g : P_f(D) \to E$ has a unique extension to a Scott-continuous map $\bar{g} : CD \to E$, which for convenience we denote by $g$ as well.

Now let $D$ be $\mathbf{U}X$, where $X$ is, as before, a compact metric space and $\{X; f_1, \ldots, f_N\}$ an IFS. Let $F : \mathbf{U}X \to \mathbf{U}X$ be as before, and consider the Scott-continuous map $\hat{f} : C\mathbf{U}X \to C\mathbf{U}X$ which is defined on the basis $P_f(\mathbf{U}X)$ by the monotone map
\[ \hat{f} : P_f(\mathbf{U}X) \to C\mathbf{U}X, \qquad \{A_j \mid 1 \le j \le M\} \mapsto \{f_i(A_j) \mid 1 \le j \le M,\ 1 \le i \le N\}. \]
The set of nodes at level $n$ of the IFS tree is then represented by $\hat{f}^n\{X\}$. We also consider the Scott-continuous map $U : C\mathbf{U}X \to \mathbf{U}X$ defined on the above basis by the monotone map
\[ U : P_f(\mathbf{U}X) \to \mathbf{U}X, \qquad \{A_j \mid 1 \le j \le M\} \mapsto \bigcup_{1 \le j \le M} A_j. \]
The following properties were shown in earlier work; for the sake of completeness, we reiterate them here in the context of our presentation of the Plotkin power domain in terms of finitely generable subsets.

The diagram
\[
\begin{array}{ccc}
C\mathbf{U}X & \stackrel{U}{\longrightarrow} & \mathbf{U}X \\
\big\downarrow \hat{f} & & \big\downarrow F \\
C\mathbf{U}X & \stackrel{U}{\longrightarrow} & \mathbf{U}X
\end{array}
\]
commutes, i.e. $U \circ \hat{f} = F \circ U$, as can easily be seen by considering the restriction to the basis $P_f(\mathbf{U}X)$. It follows that $U$ maps any fixed point of $\hat{f}$ to a fixed point of $F$. Moreover, it maps the least fixed point of $\hat{f}$ to the least fixed point of $F$, since for each $n$ we have $U(\hat{f}^n\{X\}) = F^n(U\{X\}) = F^n(X)$, and therefore
\[ U\Big(\bigsqcup_n \hat{f}^n\{X\}\Big) = \bigsqcup_n U\big(\hat{f}^n\{X\}\big) = \bigsqcup_n F^n(X). \]

On the other hand, for $A \in \mathbf{U}X$, let
\[ S(A) = \{s(x) \mid x \in A\} = \{\{x\} \mid x \in A\} \subseteq \mathbf{U}X. \]
It is easy to see that $S(A)$ is a finitely generable subset of $\mathbf{U}X$. This can be shown, for example, by constructing a finitely branching tree such that the set of nodes at level $n$ consists of closures of open subsets with diameters less than $1/2^n$. It follows that $S(A)$ is an element of $C\mathbf{U}X$ and, by the above remark, it is a maximal element. Furthermore, the Scott-continuity of $\hat{f}$ implies that the following diagram commutes:
\[
\begin{array}{ccc}
\mathbf{U}X & \stackrel{S}{\longrightarrow} & C\mathbf{U}X \\
\big\downarrow F & & \big\downarrow \hat{f} \\
\mathbf{U}X & \stackrel{S}{\longrightarrow} & C\mathbf{U}X
\end{array}
\]
Therefore $S$ maps any fixed point of $F$ to a fixed point of $\hat{f}$. Note also that $S$ is one-to-one.

Proposition. If the IFS $\{X; f_1, \ldots, f_N\}$ is weakly hyperbolic, then the two maps $F : \mathbf{U}X \to \mathbf{U}X$ and $\hat{f} : C\mathbf{U}X \to C\mathbf{U}X$ have unique fixed points $A^\dagger = \bigcap_n F^n(X)$ and $S(A^\dagger)$, respectively.

Proof. For each $n \ge 0$, we have
\[ \hat{f}^n\{X\} = \{f_{i_1} f_{i_2} \cdots f_{i_n}(X) \mid i_1 i_2 \cdots i_n \in \Sigma^n\} \sqsubseteq_{EM} S\big(F^n(X)\big). \]
It follows that the least fixed point of $\hat{f}$ is given by
\[ \bigsqcup_n \hat{f}^n\{X\} = \Big\{\lim_{n \to \infty} f_{i_1} f_{i_2} \cdots f_{i_n}(X) \;\Big|\; i_1 i_2 \cdots \in \Sigma^\omega\Big\}. \]
Since the IFS is weakly hyperbolic, this set consists of singleton sets; in fact, we have
\[ \bigsqcup_n \hat{f}^n\{X\} = S\Big(\bigcap_n F^n(X)\Big) = S(A^\dagger). \]
But $S(A^\dagger)$ is maximal in $C\mathbf{U}X$, so this least fixed point is indeed the unique fixed point of $\hat{f}$. On the other hand, since $S$ is one-to-one and takes any fixed point of $F$ to a fixed point of $\hat{f}$, it follows that $A^\dagger$ is the unique fixed point of $F$. □

In order to obtain the generalisation of the convergence property $A^* = \lim_{n\to\infty} F^n(A)$, we need the following lemma, whose straightforward proof is omitted.

Lemma. Let $\{B_i \mid 1 \le i \le M\}$, $\{C_i \mid 1 \le i \le M\}$ and $\{D_i \mid 1 \le i \le M\}$ be three finite collections of non-empty compact subsets of the metric space $X$. If $C_i, D_i \subseteq B_i$ and $|B_i| \le \epsilon$ for $1 \le i \le M$, then $d_H\big(\bigcup_i C_i, \bigcup_i D_i\big) \le \epsilon$.

Theorem. If the IFS $\{X; f_1, \ldots, f_N\}$ is weakly hyperbolic, then the map $F : \mathbf{H}X \to \mathbf{H}X$ has a unique fixed point $A^*$, the attractor of the IFS. Moreover, for any $A \in \mathbf{H}X$, we have $F^n(A) \to A^*$ in the Hausdorff metric as $n \to \infty$.

Proof. Since the set of fixed points of $F : \mathbf{H}X \to \mathbf{H}X$ is precisely the set of fixed points of $F : \mathbf{U}X \to \mathbf{U}X$, the first part follows immediately from the proposition above, and $A^* = A^\dagger = \bigcap_n F^n(X)$ is indeed the unique fixed point of $F : \mathbf{H}X \to \mathbf{H}X$. Let $A \subseteq X$ be any non-empty compact subset and let $\epsilon > 0$ be given. By part (iii) of the earlier proposition, there exists $m \ge 1$ such that for all $n \ge m$ the diameters of all the subsets in the collection $\hat{f}^n\{X\} = \{f_{i_1} f_{i_2} \cdots f_{i_n}(X) \mid i_1 i_2 \cdots i_n \in \Sigma^n\}$ are less than $\epsilon$. Clearly $f_{i_1} f_{i_2} \cdots f_{i_n}(A) \subseteq f_{i_1} f_{i_2} \cdots f_{i_n}(X)$ and $A^* \cap f_{i_1} f_{i_2} \cdots f_{i_n}(X) \subseteq f_{i_1} f_{i_2} \cdots f_{i_n}(X)$ for all $i_1 i_2 \cdots i_n \in \Sigma^n$, while $F^n(A)$ and $A^*$ are the unions of these two families of subsets respectively. Therefore, by the lemma, $d_H(F^n(A), A^*) \le \epsilon$. □

Plotkin Power Domain Algorithm

Given a weakly hyperbolic IFS $\{X; f_1, \ldots, f_N\}$, we want to formulate an algorithm to obtain a finite subset $A_\epsilon$ of $X$ which approximates the attractor $A^*$ of the IFS up to a given threshold $\epsilon > 0$ with respect to the Hausdorff metric.

We will make the assumption that for each node of the IFS tree it is decidable whether or not the diameter of the node is less than $\epsilon$. For a hyperbolic IFS we have
\[ |f_{i_1} f_{i_2} \cdots f_{i_n}(X)| \le s_{i_1} s_{i_2} \cdots s_{i_n} |X|, \]
where $s_i$ is the contractivity factor of $f_i$, and therefore the above relation is clearly decidable. However, there are other interesting cases in applications where this relation is also decidable. For example, if $X = [0, 1]^n \subseteq \mathbb{R}^n$ and if, for every $i$, each of the coordinates of the map $f_i$ is monotonically increasing in each of its arguments, then the diameter of any node is easily computed as
\[ |f_{i_1} \cdots f_{i_n}(X)| = d\big(f_{i_1} \cdots f_{i_n}(\mathbf{0}),\ f_{i_1} \cdots f_{i_n}(\mathbf{1})\big), \]
where $d$ is the Euclidean distance and $\mathbf{0}$, $\mathbf{1}$ are the corners of $X$ with all coordinates $0$ and $1$ respectively. It is then clear that the above relation is decidable in this case.

Let $\epsilon > 0$ be given and fix $x_0 \in X$. We construct a finite subtree of the IFS tree as follows. For any infinite sequence $i_1 i_2 \cdots \in \Sigma^\omega$, the sequence $\langle |f_{i_1} f_{i_2} \cdots f_{i_n}(X)| \rangle_n$ is decreasing and tends to zero, and therefore there is a least integer $m \ge 1$ such that $|f_{i_1} f_{i_2} \cdots f_{i_m}(X)| < \epsilon$. We truncate the infinite branch $\langle f_{i_1} f_{i_2} \cdots f_{i_n}(X) \rangle_n$ of the IFS tree at the node $f_{i_1} f_{i_2} \cdots f_{i_m}(X)$, which is then a leaf of the truncated tree, as depicted below, and which contains the distinguished point $f_{i_1} f_{i_2} \cdots f_{i_m}(x_0) \in f_{i_1} f_{i_2} \cdots f_{i_m}(X)$.

Figure: A branch of the truncated IFS tree, $X \supseteq f_{i_1}(X) \supseteq f_{i_1} f_{i_2}(X) \supseteq \cdots \supseteq f_{i_1} f_{i_2} \cdots f_{i_m}(X)$.

By the proposition above, the truncated tree will have finite depth. Let $L$ denote the set of all leaves of this finite tree, and let $A_\epsilon \subseteq X$ be the set of all distinguished points of the leaves. For each leaf $l \in L$, the attractor satisfies $l \cap A^* \ne \emptyset$ and $A^* = \bigcup_{l \in L} (l \cap A^*)$. On the other hand, for each leaf $l \in L$ we have $l \cap A_\epsilon \ne \emptyset$ and $A_\epsilon = \bigcup_{l \in L} (l \cap A_\epsilon)$. It follows by the lemma above that $d_H(A_\epsilon, A^*) \le \epsilon$. The algorithm therefore traverses the IFS tree in some specific order to obtain the set of leaves $L$ and hence the finite set $A_\epsilon$, which is the required discrete approximation.
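For the hyperbolic affine case treated in the complexity analysis below, the algorithm admits a very direct implementation. The following Python sketch traverses the IFS tree depth-first, using the bound $s_{i_1} \cdots s_{i_n}|X|$ on the node diameters, and returns the finite set $A_\epsilon$ of distinguished points; the particular maps are made up for the example.

```python
import numpy as np

def attractor_approx(maps, contractivities, diam_X, eps, x0):
    """Plotkin power domain algorithm for a hyperbolic affine IFS (a sketch).
    The IFS tree is traversed depth-first; a branch i_1 ... i_n is truncated as
    soon as the diameter bound s_{i_1}...s_{i_n}|X| drops below eps, and the
    distinguished point f_{i_1}...f_{i_n}(x0) of that leaf is recorded.  The
    finite set returned is within eps of the attractor in the Hausdorff metric."""
    x0 = np.asarray(x0, dtype=float)
    leaves = []

    def visit(M, v, bound):
        # (M, v) represents the composition f_{i_1} ∘ ... ∘ f_{i_k} : z ↦ M z + v.
        if bound < eps:
            leaves.append(M @ x0 + v)
            return
        for (W, t), s in zip(maps, contractivities):
            visit(M @ W, M @ t + v, s * bound)   # compose with f_i on the inside

    visit(np.eye(len(x0)), np.zeros(len(x0)), diam_X)
    return np.array(leaves)

# Illustrative hyperbolic affine IFS on the unit square (maps are made up).
maps = [(np.eye(2) * 0.5, np.array([0.0, 0.0])),
        (np.eye(2) * 0.5, np.array([0.5, 0.0])),
        (np.eye(2) * 0.5, np.array([0.25, 0.5]))]
A_eps = attractor_approx(maps, [0.5, 0.5, 0.5], diam_X=np.sqrt(2), eps=0.05, x0=[0.5, 0.5])
print(len(A_eps), "points within 0.05 of the attractor")
```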



For a hyperbolic IFS and for $X = A^*$, this algorithm reduces to that of Hepting et al. We will here obtain an upper bound for the complexity of the algorithm when the maps $f_i$ are contracting affine transformations, as is always the case in image compression. First we note that there is a simple formula for the contractivity of an affine map. In fact, suppose the map $f_t : \mathbb{R}^2 \to \mathbb{R}^2$ is given at the point $z \in \mathbb{R}^2$, in matrix notation, by $f_t(z) = Wz + t$, where the matrix $W$ is the linear part and $t \in \mathbb{R}^2$ is the translation part of $f_t$. Then the infimum of numbers $c \ge 0$ with
\[ |f_t(z) - f_t(z')| \le c\,|z - z'| \]
is the square root of the greatest eigenvalue (in absolute value) of the matrix $W^{\mathsf{T}}W$, where $W^{\mathsf{T}}$ is the transpose of $W$. This greatest eigenvalue is easily calculated for the matrix
\[ W = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \]
to be given by
\[ \frac{\alpha + \beta + \sqrt{(\alpha - \beta)^2 + 4\gamma^2}}{2}, \]
where $\alpha = a^2 + c^2$, $\beta = b^2 + d^2$ and $\gamma = ab + cd$. If $f_t$ is contracting, then this number is strictly less than one, and its square root is the contractivity of $f_t$.
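The formula above translates directly into code. The following Python sketch computes the contractivity of a planar affine map from the entries of its linear part.

```python
import math

def affine_contractivity(a, b, c, d):
    """Contractivity (Lipschitz constant) of z ↦ W z + t with W = [[a, b], [c, d]]:
    the square root of the largest eigenvalue of W^T W."""
    alpha, beta, gamma = a * a + c * c, b * b + d * d, a * b + c * d
    largest_eigenvalue = (alpha + beta + math.sqrt((alpha - beta) ** 2 + 4 * gamma ** 2)) / 2
    return math.sqrt(largest_eigenvalue)

print(affine_contractivity(0.5, 0.0, 0.0, 0.5))   # 0.5 for a uniform scaling by 1/2
```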

While traversing the tree, the algorithm recursively computes $f_{i_1} f_{i_2} \cdots f_{i_n}(x_0)$ and $s_{i_1} s_{i_2} \cdots s_{i_n} |X|$, and if $s_{i_1} s_{i_2} \cdots s_{i_n} |X| < \epsilon$ then the point $f_{i_1} f_{i_2} \cdots f_{i_n}(x_0)$ is taken to belong to $A_\epsilon$. An upper bound for the height of the truncated tree is obtained as follows. We have $s_{i_1} s_{i_2} \cdots s_{i_n} \le s^n$, where $s = \max_{1 \le i \le N} s_i$ is the contractivity of the IFS. Therefore the least integer $h$ with $s^h |X| < \epsilon$ is an upper bound, i.e. $h = \lceil \log(\epsilon / |X|) / \log s \rceil$, where $\lceil a \rceil$ is the least non-negative integer greater than or equal to $a$. A simple counting shows that there is at most a fixed constant number of arithmetic computations at each node. Therefore the total number of computations is at most of the order of $N + N^2 + \cdots + N^h = (N^{h+1} - N)/(N - 1)$, which is $O(N^h)$.

Next, we consider the problem of plotting on the computer screen the discrete approximation to the attractor of a weakly hyperbolic IFS in $\mathbb{R}^2$. The digitisation of the discrete approximation $A_\epsilon$ inevitably produces a further error in approximating the attractor $A^*$. We will obtain a bound for this error. Suppose we have a weakly hyperbolic IFS $\{X; f_1, \ldots, f_N\}$ with $X \subseteq \mathbb{R}^2$. By a translation of origin and rescaling, if necessary, we can assume that $X \subseteq [0, 1] \times [0, 1]$. Suppose, furthermore, that the computer screen with resolution $r \times r$ is represented by the unit square digitised into a two-dimensional array of $r \times r$ pixels. We regard each pixel as a point, so that the distance between nearest pixels is of the order of $1/r$. We assume that for each point in $A_\epsilon$ the nearest pixel is plotted on the screen. Let $A_d$ be the set of pixels plotted. Since any point in $[0, 1] \times [0, 1]$ is at most $\sqrt{2}/(2r)$ from its nearest pixel, it is easy to see that $d_H(A_\epsilon, A_d) \le \sqrt{2}/(2r)$. It follows that
\[ d_H(A^*, A_d) \le d_H(A^*, A_\epsilon) + d_H(A_\epsilon, A_d) \le \epsilon + \frac{\sqrt{2}}{2r}. \]
In the worst case, the error in the digitisation process for a given resolution $r \times r$ of the screen is at least $\sqrt{2}/(2r)$, whatever the value of the discrete threshold $\epsilon$. On the other hand, even in the case of a hyperbolic IFS, the complexity of the algorithm grows as $N^h$, with $h \sim \log \epsilon / \log s$, as $\epsilon \to 0$. In practice, the optimal balance between accuracy and complexity is reached by taking $\epsilon$ to be of the order of $1/r$.

Invariant Measure of an IFS with Probabilities

We prove the existence and uniqueness of the invariant measure of a weakly hyperbolic IFS with probabilities by generalising the corresponding result for a hyperbolic IFS, which is based on the normalised probabilistic power domain. We first recall the basic definitions.

Probabilistic power domain

A valuation on a topological space $Y$ is a map $\nu : \Omega(Y) \to [0, \infty)$ which satisfies:

(i) $\nu(a) + \nu(b) = \nu(a \cup b) + \nu(a \cap b)$;

(ii) $\nu(\emptyset) = 0$; and

(iii) $a \subseteq b \Rightarrow \nu(a) \le \nu(b)$.

A continuous valuation is a valuation such that, whenever $A \subseteq \Omega(Y)$ is a directed set (with respect to $\subseteq$) of open sets of $Y$, then
\[ \nu\Big(\bigcup_{O \in A} O\Big) = \sup_{O \in A} \nu(O). \]
For any $b \in Y$, the point valuation based at $b$ is the valuation $\eta_b : \Omega(Y) \to [0, \infty)$ defined by
\[ \eta_b(O) = \begin{cases} 1 & \text{if } b \in O, \\ 0 & \text{otherwise.} \end{cases} \]
Any finite linear combination
\[ \sum_{i=1}^{n} r_i \eta_{b_i} \]
of point valuations $\eta_{b_i}$ with constant coefficients $r_i$, $1 \le i \le n$, is a continuous valuation on $Y$; we call it a simple valuation.

The normalised probabilistic power domain $P^1Y$ of a topological space $Y$ consists of the set of continuous valuations $\nu$ on $Y$ with $\nu(Y) = 1$, and is ordered as follows:
\[ \mu \sqsubseteq \nu \quad \text{iff for all open sets } O \text{ of } Y,\ \mu(O) \le \nu(O). \]
The partial order $(P^1Y, \sqsubseteq)$ is a dcpo (with bottom $\eta_\bot$ whenever $Y$ has a bottom element $\bot$), in which the lub of a directed set $\langle \mu_i \rangle_{i \in I}$ is given by $\bigsqcup_i \mu_i = \mu$, where for $O \in \Omega(Y)$ we have
\[ \mu(O) = \sup_{i \in I} \mu_i(O). \]
Moreover, if $Y$ is an $\omega$-continuous dcpo with a bottom element, then $P^1Y$ is also an $\omega$-continuous dcpo with a bottom element $\eta_\bot$, and has a basis consisting of simple valuations. Therefore, any $\mu \in P^1Y$ is the lub of an $\omega$-chain of normalised simple valuations and hence, by a lemma of Saheb-Djahromi, can be uniquely extended to a Borel measure on $Y$, which we denote, for convenience, by $\mu$ as well. For $c > 0$, let $P^cY$ denote the dcpo of valuations $\nu$ with total mass $c$, i.e. $P^cY = \{\nu \in PY \mid \nu(Y) = c\}$. Since $P^cY$ is obtained from $P^1Y$ by a simple rescaling, it shares the above properties of $P^1Y$; the case $c = 0$ is of course trivial.

For two simple valuations
\[ \mu = \sum_{b \in B} r_b \eta_b, \qquad \nu = \sum_{c \in C} s_c \eta_c \]
in $P^1Y$, where $B, C \in P_f(Y)$, we have, by the splitting lemma, $\mu \sqsubseteq \nu$ iff, for all $b \in B$ and all $c \in C$, there exists a non-negative number $t_{b,c}$ such that
\[ \sum_{c \in C} t_{b,c} = r_b, \qquad \sum_{b \in B} t_{b,c} = s_c, \]
and $t_{b,c} \ne 0$ implies $b \sqsubseteq c$. We can consider any $b \in B$ as a source with mass $r_b$, any $c \in C$ as a sink with mass $s_c$, and the number $t_{b,c}$ as the flow of mass from $b$ to $c$. Then the above property can be regarded as conservation of total mass.

Model for IFS with probabilities

Now let $X$ be a compact metric space, so that $\mathbf{U}X$ is an $\omega$-continuous dcpo with bottom $X$. Therefore $P^1\mathbf{U}X$ is an $\omega$-continuous dcpo with bottom $\eta_X$. Recall that the singleton map $s : X \to \mathbf{U}X$ with $s(x) = \{x\}$ embeds $X$ onto the set $sX$ of maximal elements of $\mathbf{U}X$. For any open subset $a \subseteq X$, the set $s(a) = \{\{x\} \mid x \in a\} \subseteq \mathbf{U}X$ is a $G_\delta$ subset, and hence a Borel subset. A valuation $\mu \in P^1\mathbf{U}X$ is said to be supported in $sX$ if $\mu(\mathbf{U}X \setminus sX) = 0$. If $\mu$ is supported in $sX$, then the support of $\mu$ is the set of points $y \in sX$ such that $\mu(O) > 0$ for any Scott-neighbourhood $O \subseteq \mathbf{U}X$ of $y$. Any element of $P^1\mathbf{U}X$ which is supported in $sX$ is a maximal element of $P^1\mathbf{U}X$; we denote the set of all valuations which are supported in $sX$ by $S^1X$. We can identify $S^1X$ with the set $M^1X$ of normalised Borel measures on $X$ as follows. Let
\[ e : M^1X \to S^1X, \qquad \mu \mapsto \mu \circ s^{-1}, \]
and
\[ j : S^1X \to M^1X, \qquad \nu \mapsto \nu \circ s. \]

Theorem. The maps $e$ and $j$ are well-defined and induce an isomorphism between $S^1X$ and $M^1X$. □

For $\mu \in M^1X$ and an open subset $a \subseteq X$, we have
\[ \mu(a) = e(\mu)\big(s(a)\big) = e(\mu)(\Box a). \]

Let $\{X; f_1, \ldots, f_N; p_1, \ldots, p_N\}$ be an IFS with probabilities on the compact metric space $X$. Define
\[ H : P^1\mathbf{U}X \to P^1\mathbf{U}X \]
by $H(\nu)(O) = \sum_{i=1}^{N} p_i\, \nu(f_i^{-1}O)$. Note that $H$ is defined in the same way as the Markov operator $T$ above. Then $H$ is Scott-continuous and therefore has a least fixed point given by $\nu^* = \bigsqcup_m H^m(\eta_X)$, where
\[ H^m(\eta_X) = \sum_{i_1 i_2 \cdots i_m \in \Sigma^m} p_{i_1} p_{i_2} \cdots p_{i_m}\, \eta_{f_{i_1} f_{i_2} \cdots f_{i_m}(X)}. \]
Furthermore, we have:

Theorem. For a weakly hyperbolic IFS with probabilities, the least fixed point $\nu^*$ of $H$ is in $S^1X$. Hence it is a maximal element of $P^1\mathbf{U}X$ and therefore the unique fixed point of $H$. The support of $\nu^*$ is given by $S(A^*) = \{\{x\} \mid x \in A^*\}$, where $A^* \subseteq X$ is the attractor of the IFS.

 

Proof. To show that $\nu^* \in S^1X$, it is sufficient to show that $\nu^*(sX) = 1$. For each integer $k \ge 1$, let $\langle b_i \rangle_{i \in I_k}$ be the collection of all open balls $b_i \subseteq X$ of radius less than $1/k$, and let $O_k = \bigcup_{i \in I_k} \Box b_i$. Then $\langle O_k \rangle_{k \ge 1}$ is a decreasing sequence of open subsets of $\mathbf{U}X$ and $sX = \bigcap_k O_k$. Therefore $\nu^*(sX) = \inf_{k \ge 1} \nu^*(O_k)$. By part (iii) of the proposition on weak hyperbolicity, for each $k \ge 1$ there exists some integer $n_k$ such that all the nodes of the IFS tree on level $n_k$ have diameter strictly less than $1/k$. Hence, for all finite sequences $i_1 i_2 \cdots i_m$ with $m \ge n_k$, we have $f_{i_1} f_{i_2} \cdots f_{i_m}(X) \in O_k$. Therefore, for all $m \ge n_k$,
\[ H^m(\eta_X)(O_k) = \sum_{f_{i_1} f_{i_2} \cdots f_{i_m}(X) \in O_k} p_{i_1} p_{i_2} \cdots p_{i_m} = 1. \]
It follows that $\nu^*(O_k) = \sup_m H^m(\eta_X)(O_k) = 1$, and therefore $\nu^*(sX) = \inf_{k \ge 1} \nu^*(O_k) = 1$, as required.

To show that $S(A^*)$ is the support of $\nu^*$, let $x \in A^*$ and, for any integer $k \ge 1$, let $B_{1/k}(x) \subseteq X$ be the open ball of radius $1/k$ centred at $x$. Then $\{x\}$ is the lub of some infinite branch of the IFS tree: $\{x\} = \bigcap_n f_{i_1} \cdots f_{i_n}(X)$ for some $i_1 i_2 \cdots \in \Sigma^\omega$. As above, let $n_k$ be such that the diameters of all nodes of the IFS tree on level $n_k$ are strictly less than $1/k$. Then
\[ \nu^*\big(\Box B_{1/k}(x)\big) = \sup_m H^m(\eta_X)\big(\Box B_{1/k}(x)\big) \ge H^{n_k}(\eta_X)\big(\Box B_{1/k}(x)\big) \ge p_{i_1} \cdots p_{i_{n_k}} > 0. \]
Since $\langle \Box B_{1/k}(x) \rangle_{k \ge 1}$ is a neighbourhood basis of $\{x\}$ in $\mathbf{U}X$, it follows that $\{x\}$ is in the support of $\nu^*$. On the other hand, if $x \notin A^*$, then there is an open ball $B_\epsilon(x) \subseteq X$ which does not intersect $A^*$. Let $n$ be such that the nodes on level $n$ of the IFS tree have diameters strictly less than $\epsilon$. Then for all $m \ge n$ we have $H^m(\eta_X)(\Box B_\epsilon(x)) = 0$, and it follows that
\[ \nu^*\big(\Box B_\epsilon(x)\big) = \sup_m H^m(\eta_X)\big(\Box B_\epsilon(x)\big) = 0, \]
and $\{x\}$ is not in the support of $\nu^*$. □

Corollary. For a weakly hyperbolic IFS with probabilities, the normalised measure $\mu^* = j(\nu^*) \in M^1X$ is the unique fixed point of the Markov operator $T : M^1X \to M^1X$. Its support is the unique attractor $A^*$ of the IFS. □

Probabilistic power domain algorithm

Since the Plotkin power domain algorithm above provides a digitised discrete approximation $A_d$ to the attractor $A^*$, the question is how to render the pixels in $A_d$ so as to obtain an approximation to the invariant measure $\mu^*$. We now describe an algorithm to do this which extends that of Hepting et al. for a hyperbolic IFS with probabilities. Assume again that the unit square represents the digitised screen with $r \times r$ pixels. Suppose $\{X; f_1, \ldots, f_N; p_1, \ldots, p_N\}$ is a weakly hyperbolic IFS with probabilities, with $X \subseteq [0, 1] \times [0, 1]$, and $\epsilon > 0$ is the discrete threshold. Fix $x_0 \in X$. The simple valuation $H^m(\eta_X)$ given above can be depicted by the $m$th level of the IFS tree labelled with transitional probabilities, as in the figure below.

Figure: The IFS tree with transitional probabilities; the edges from $X$ to $f_1(X), \ldots, f_N(X)$ carry labels $p_1, \ldots, p_N$, and, in general, the edge from a node $t(X)$ to its child $t f_i(X)$ carries label $p_i$.

The root $X$ of the tree has mass one and represents $\eta_X$. Any edge going from a node $t(X)$, where $t = f_{i_1} f_{i_2} \cdots f_{i_m}$ is a finite composition of the maps $f_i$, to its child $t f_i(X)$ is labelled with transitional probability $p_i$, for $1 \le i \le N$. The transitional probability label on each edge gives the flow of mass from the parent node (source) to the child node (sink), in the sense of the splitting lemma. The total mass of the node $f_{i_1} f_{i_2} \cdots f_{i_m}(X)$ on level $m$ is therefore the product $p_{i_1} p_{i_2} \cdots p_{i_m}$ of the labels of all the edges leading from the root to the node, in agreement with the expansion of $H^m(\eta_X)$ above. We again make the assumption that it is decidable whether or not the diameter of any node is less than $\epsilon$. The algorithm then proceeds as in the deterministic case to find all the leaves of the IFS tree, and this time computes the mass of each leaf. The set of all weighted leaves of the truncated IFS tree represents a simple valuation which is a discrete approximation to the invariant measure $\mu^*$. Then the total mass given to each pixel in $A_d$ is the sum of the masses of all leaves corresponding to that pixel.

In the hyperbolic case, the probabilistic algorithm traverses the finite tree and recursively computes $f_{i_1} f_{i_2} \cdots f_{i_n}(x_0)$, $s_{i_1} s_{i_2} \cdots s_{i_n} |X|$ and $p_{i_1} p_{i_2} \cdots p_{i_n}$; if $s_{i_1} s_{i_2} \cdots s_{i_n} |X| < \epsilon$, then the weight of the pixel for $f_{i_1} f_{i_2} \cdots f_{i_n}(x_0)$ is incremented by $p_{i_1} p_{i_2} \cdots p_{i_n}$. A simple counting shows that this takes at most a fixed constant number of arithmetic computations at each node. Therefore the total number of computations is at most of the order of $N + N^2 + \cdots + N^h$, which is $O(N^h)$ as before.
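A sketch of the probabilistic algorithm for the hyperbolic affine case is given below in Python: the tree traversal is the same as before, but each leaf now carries the mass $p_{i_1} \cdots p_{i_n}$, which is added to the pixel containing its distinguished point. All numerical values are made up for the example.

```python
import numpy as np
from collections import defaultdict

def invariant_measure_pixels(maps, probs, contractivities, diam_X, eps, x0, r):
    """Probabilistic power domain algorithm (a sketch): traverse the IFS tree,
    truncating a branch when s_{i_1}...s_{i_n}|X| < eps; the leaf's mass
    p_{i_1}...p_{i_n} is added to the pixel containing its distinguished point
    f_{i_1}...f_{i_n}(x0) on an r x r grid over the unit square."""
    x0 = np.asarray(x0, dtype=float)
    pixels = defaultdict(float)

    def visit(M, v, bound, mass):
        if bound < eps:
            x, y = M @ x0 + v
            pixels[(min(int(x * r), r - 1), min(int(y * r), r - 1))] += mass
            return
        for (W, t), s, p in zip(maps, contractivities, probs):
            visit(M @ W, M @ t + v, s * bound, p * mass)

    visit(np.eye(2), np.zeros(2), diam_X, 1.0)
    return pixels

# Illustrative IFS with probabilities on the unit square (all values made up).
maps = [(np.eye(2) * 0.5, np.array([0.0, 0.0])),
        (np.eye(2) * 0.5, np.array([0.5, 0.0])),
        (np.eye(2) * 0.5, np.array([0.25, 0.5]))]
pix = invariant_measure_pixels(maps, [0.2, 0.3, 0.5], [0.5] * 3, np.sqrt(2), 0.02, [0.5, 0.5], r=64)
print(sum(pix.values()))   # total mass 1
```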

A Model for Recurrent IFSs

In this section, we will construct a domain-theoretic model for weakly hyperbolic recurrent IFSs. Assume that $\{X; f_1, \ldots, f_N\}$ is an IFS and $(p_{ij})$, $1 \le i, j \le N$, is an indecomposable row-stochastic matrix. Then $\{X; f_j, p_{ij} \mid i, j = 1, \ldots, N\}$ is a recurrent IFS. We will see below that this gives rise to a Markov process on the coproduct of $N$ copies of $X$. (See the references for an introduction to Markov chains.)

For a topological space $Y$, we let $\sum Y = \sum_{j=1}^{N} Y \times \{j\}$ denote the coproduct (disjoint sum) of $N$ copies of $Y$, i.e.
\[ \sum Y = \sum_{j=1}^{N} Y \times \{j\} = \{(y, j) \mid y \in Y,\ 1 \le j \le N\}, \]
with its frame of open sets given by $\Omega(\sum Y) \cong \prod_{j=1}^{N} \Omega(Y)$ and its Borel subsets by $B(\sum Y) \cong \prod_{j=1}^{N} B(Y)$, where $B(Y)$ is the set of Borel subsets of $Y$. Any normalised Borel measure $\mu \in M^1(\sum Y)$ is a mapping
\[ \mu : B\Big(\sum Y\Big) \to [0, 1], \]
which can be written as $\mu = \langle \mu_1, \mu_2, \ldots, \mu_N \rangle$ with $\mu_j \in M^{c_j}Y$ for some $c_j \ge 0$, $1 \le j \le N$, with $\sum_{j=1}^{N} c_j = 1$, such that for $B = B_1 + B_2 + \cdots + B_N \in B(\sum Y)$ we have $\mu(B) = \sum_{j=1}^{N} \mu_j(B_j)$.

The Generalised Markov Operator

A recurrent IFS induces a Markov process on $\sum X = \sum_{j=1}^{N} X \times \{j\}$ as follows. Let $i_1 i_2 i_3 \cdots$ be a Markov chain on $\{1, 2, \ldots, N\}$ with transition probability matrix $(p_{ij})$. Let $x_0 \in X$ and consider the process
\[ Z_0 = x_0, \qquad Z_{n+1} = f_{i_{n+1}}(Z_n), \]
which gives us the random walk described earlier. Then $(Z_n, i_n)$ is a Markov process on $\sum X = \sum_{j=1}^{N} X \times \{j\}$ with the Markov transition probability function
\[ K\big((x, i), B\big) = \sum_{j=1}^{N} p_{ij}\, \chi_{B_j}\big(f_j(x)\big), \]
which is the probability of transition from $(x, i)$ into the Borel set $B = B_1 + \cdots + B_N \subseteq \sum X$; here $\chi_{B_j}$ is the characteristic function of the set $B_j$. This transitional probability induces the generalised Markov operator
\[ T : M^1\Big(\sum X\Big) \to M^1\Big(\sum X\Big) \]
defined by
\[ (T\mu)(B) = \int_{\sum X} K\big((x, i), B\big)\, d\mu = \int_{\sum X} \sum_{j=1}^{N} p_{ij}\, \chi_{B_j}\big(f_j(x)\big)\, d\mu = \sum_{i=1}^{N} \sum_{j=1}^{N} \int_X p_{ij}\, \chi_{B_j}\big(f_j(x)\big)\, d\mu_i = \sum_{i=1}^{N} \sum_{j=1}^{N} p_{ij}\, \mu_i\big(f_j^{-1}(B_j)\big). \]
In other words,
\[ (T\mu)_j = \sum_{i=1}^{N} p_{ij}\, \mu_i \circ f_j^{-1}. \]
Note that $T$ is well-defined, since
\[ (T\mu)\Big(\sum X\Big) = \sum_{j=1}^{N} \sum_{i=1}^{N} p_{ij}\, \mu_i\big(f_j^{-1}(X)\big) = \sum_{i=1}^{N} \mu_i(X) \sum_{j=1}^{N} p_{ij} = \sum_{i=1}^{N} \mu_i(X) = 1, \]
using $\sum_{j=1}^{N} p_{ij} = 1$ for $1 \le i \le N$. For a hyperbolic recurrent IFS, Barnsley defines the generalised Hutchinson metric $r_H$ on $M^1(\sum X)$ by
\[ r_H(\mu, \nu) = \sup\Big\{ \sum_{i=1}^{N} \Big( \int_X f_i\, d\mu_i - \int_X f_i\, d\nu_i \Big) \;\Big|\; f_i : X \to \mathbb{R},\ |f_i(x) - f_i(y)| \le d(x, y),\ 1 \le i \le N \Big\}, \]
and then states that one expects the generalised Markov operator $T$ to be a contracting map and therefore to have a unique fixed point. But he notes that the contractivity factor will depend not only on the contractivity of the IFS but also on the matrix $(p_{ij})$. However, no proof is given that $T$ is indeed contracting for a given $(p_{ij})$; subsequently, the existence and uniqueness of a fixed point is not verified. On the other hand, it is shown, by proving the convergence in distribution of the expected values of real-valued functions on $X$, that a hyperbolic recurrent IFS does have a unique stationary distribution. We will show here, more generally, that for a weakly hyperbolic recurrent IFS the generalised Markov operator has indeed a unique fixed point.

The Unique Fixed Point of the Markov Operator

We will achieve the above task, without any need for a metric, by extending the generalised Markov operator to $P^1(\sum \mathbf{U}X)$, where $\sum \mathbf{U}X = \sum_{j=1}^{N} \mathbf{U}X \times \{j\}$ is the coproduct of $N$ copies of $\mathbf{U}X$.

If $Y$ is a topological space, a valuation $\nu \in P^1(\sum Y)$ is a mapping $\nu : \Omega(\sum Y) \to [0, 1]$ which can be written as $\nu = \langle \nu_1, \nu_2, \ldots, \nu_N \rangle$ with $\nu_j \in P^{c_j}Y$ for some $c_j \ge 0$, $1 \le j \le N$, with $\sum_{j=1}^{N} c_j = 1$, such that for $O = O_1 + O_2 + \cdots + O_N \in \Omega(\sum Y)$ we have $\nu(O) = \sum_{j=1}^{N} \nu_j(O_j)$. We will work in a sub-dcpo of $P^1(\sum \mathbf{U}X)$, which is defined below.

Note that our assumptions imply that $(p_{ij})$ is the transition matrix of an ergodic finite Markov chain, and therefore we have:

Proposition. There exists a unique probability vector $(m_j)$, with $m_j > 0$ for $1 \le j \le N$ and $\sum_{j=1}^{N} m_j = 1$, which satisfies $m_j = \sum_{i=1}^{N} m_i p_{ij}$. □

Let $\nu_X \in P^1(\sum \mathbf{U}X)$ be given by $\nu_X = \langle m_1 \eta_X, m_2 \eta_X, \ldots, m_N \eta_X \rangle$, where $(m_j)$, $1 \le j \le N$, is the unique probability vector of the proposition above. Put
\[ P_\nu\Big(\sum \mathbf{U}X\Big) = \Big\{\mu \in P^1\Big(\sum \mathbf{U}X\Big) \;\Big|\; \nu_X \sqsubseteq \mu\Big\}. \]
Note that, for $\mu = \langle \mu_1, \ldots, \mu_N \rangle \in P^1(\sum \mathbf{U}X)$, we have $\mu \in P_\nu(\sum \mathbf{U}X)$ iff $\mu_j(\mathbf{U}X) = m_j$ for $1 \le j \le N$: for, $\nu_X \sqsubseteq \mu$ implies $m_j \le \mu_j(\mathbf{U}X)$, and we have
\[ \sum_{j=1}^{N} \mu_j(\mathbf{U}X) = 1 = \sum_{j=1}^{N} m_j, \]
which implies $\mu_j(\mathbf{U}X) = m_j$. It also follows that $P_\nu(\sum \mathbf{U}X) \cong \prod_{j=1}^{N} P^{m_j}\mathbf{U}X$. Therefore $P_\nu(\sum \mathbf{U}X)$ is an $\omega$-continuous dcpo with bottom $\nu_X$, and any $\mu = \langle \mu_1, \ldots, \mu_N \rangle \in P_\nu(\sum \mathbf{U}X)$ extends uniquely to a Borel measure on $\sum \mathbf{U}X$, as each $\mu_j$ extends uniquely to a Borel measure on $\mathbf{U}X$.

Let
\[ s : \sum X \to \sum \mathbf{U}X, \qquad (x, j) \mapsto (\{x\}, j), \]
be the embedding of $\sum X$ onto the set of maximal elements of $\sum \mathbf{U}X$. Any Borel subset $B = B_1 + \cdots + B_N$ of $\sum X$ induces a Borel subset $s(B) = s(B_1) + \cdots + s(B_N)$ of $\sum \mathbf{U}X$, since each $s(B_j)$ is a Borel subset of $\mathbf{U}X$. Let $M_\nu(\sum X) = \{\mu \in M^1(\sum X) \mid \mu_j(X) = m_j,\ 1 \le j \le N\}$, let
\[ S_\nu\Big(\sum \mathbf{U}X\Big) = \Big\{\mu \in P_\nu\Big(\sum \mathbf{U}X\Big) \;\Big|\; \mu\big(s(\textstyle\sum X)\big) = 1\Big\}, \]
and define the two maps
\[ e : M_\nu\Big(\sum X\Big) \to S_\nu\Big(\sum \mathbf{U}X\Big), \quad \mu \mapsto \mu \circ s^{-1}, \qquad j : S_\nu\Big(\sum \mathbf{U}X\Big) \to M_\nu\Big(\sum X\Big), \quad \nu \mapsto \nu \circ s. \]
We then have the following generalisation of the earlier isomorphism theorem.

Theorem. The two maps $e$ and $j$ are well-defined and give an isomorphism between $M_\nu(\sum X)$ and $S_\nu(\sum \mathbf{U}X)$. □

Given a recurrent IFS $\{X; f_i, p_{ij} \mid i, j = 1, \ldots, N\}$, we extend the generalised Markov operator to $P^1(\sum \mathbf{U}X)$ by
\[ H : P^1\Big(\sum \mathbf{U}X\Big) \to P^1\Big(\sum \mathbf{U}X\Big), \]
where $H(\mu)(O) = \sum_{j=1}^{N} \sum_{i=1}^{N} p_{ij}\, \mu_i(f_j^{-1}O_j)$; in other words, we have
\[ (H\mu)_j = \sum_{i=1}^{N} p_{ij}\, \mu_i \circ f_j^{-1}, \]
as in the definition of $T$ above. If $\mu \in P_\nu(\sum \mathbf{U}X)$, then $H\mu \in P_\nu(\sum \mathbf{U}X)$, since
\[ (H\mu)_j(\mathbf{U}X) = \sum_{i=1}^{N} p_{ij}\, \mu_i\big(f_j^{-1}(\mathbf{U}X)\big) = \sum_{i=1}^{N} p_{ij}\, \mu_i(\mathbf{U}X) = \sum_{i=1}^{N} p_{ij}\, m_i = m_j. \]

Proposition. Any fixed point of $H$ (respectively, $T$) is in $P_\nu(\sum \mathbf{U}X)$ (respectively, $M_\nu(\sum X)$).

Proof. Let $\mu = \langle \mu_1, \ldots, \mu_N \rangle \in P^1(\sum \mathbf{U}X)$ be a fixed point of $H$. Then, for each $j \in \{1, 2, \ldots, N\}$, we have
\[ \mu_j(\mathbf{U}X) = (H\mu)_j(\mathbf{U}X) = \sum_{i=1}^{N} p_{ij}\, \mu_i\big(f_j^{-1}(\mathbf{U}X)\big) = \sum_{i=1}^{N} p_{ij}\, \mu_i(\mathbf{U}X). \]
By the uniqueness of the probability vector $(m_j)$ in the proposition above, we have $\mu_j(\mathbf{U}X) = m_j$, as required. The proof for $T$ is similar. □

The following lemma shows that, for any recurrent IFS, the generalised Markov operator has a least fixed point.

Lemma. The mapping $H : P_\nu(\sum \mathbf{U}X) \to P_\nu(\sum \mathbf{U}X)$ is Scott-continuous.

Proof. It is immediately seen from the definition that $H$ is monotone. Let $\langle \mu^k \rangle_{k \ge 0}$ be an increasing chain in $P_\nu(\sum \mathbf{U}X)$. Then, for any open set $O = O_1 + \cdots + O_N \in \Omega(\sum \mathbf{U}X)$, we have
\[ H\Big(\bigsqcup_k \mu^k\Big)(O) = \sum_{j=1}^{N} \sum_{i=1}^{N} p_{ij} \Big(\bigsqcup_k \mu^k\Big)_i\big(f_j^{-1}O_j\big) = \sum_{j=1}^{N} \sum_{i=1}^{N} p_{ij} \sup_k \mu^k_i\big(f_j^{-1}O_j\big) = \sup_k \sum_{j=1}^{N} \sum_{i=1}^{N} p_{ij}\, \mu^k_i\big(f_j^{-1}O_j\big) = \sup_k H(\mu^k)(O) = \Big(\bigsqcup_k H(\mu^k)\Big)(O). \]
The Scott-continuity of $H$ follows. □

Let us find an explicit formula for the least fixed point $\nu^* = \bigsqcup_n H^n(\nu_X)$ of $H$. It is convenient to use the inverse transitional probability matrix $(q_{ij})$, which is defined as follows:
\[ q_{ij} = \frac{m_j\, p_{ji}}{m_i}. \]
Note that, by the proposition above, $m_j > 0$ for $1 \le j \le N$, and therefore $(q_{ij})$ is well-defined; it is again row-stochastic and irreducible, and satisfies $\sum_{i=1}^{N} m_i q_{ij} = m_j$ for $1 \le j \le N$. We can now show by induction that
\[ \big(H^n(\nu_X)\big)_j = \sum_{i_1 i_2 \cdots i_{n-1}} m_j\, q_{j i_1} q_{i_1 i_2} \cdots q_{i_{n-2} i_{n-1}}\, \eta_{f_j f_{i_1} f_{i_2} \cdots f_{i_{n-1}}(X)}. \]
In fact,
\[ \big(H(\nu_X)\big)_j = \sum_{i=1}^{N} p_{ij}\, m_i\, \big(\eta_X \circ f_j^{-1}\big) = \Big(\sum_{i=1}^{N} p_{ij}\, m_i\Big) \eta_{f_j(X)} = m_j\, \eta_{f_j(X)}. \]
Assuming the result holds for $n$, we have
\[ \big(H^{n+1}(\nu_X)\big)_j = \big(H(H^n \nu_X)\big)_j = \sum_{i=1}^{N} \sum_{i_1 \cdots i_{n-1}} p_{ij}\, m_i\, q_{i i_1} \cdots q_{i_{n-2} i_{n-1}}\, \eta_{f_j f_i f_{i_1} \cdots f_{i_{n-1}}(X)} = \sum_{i\, i_1 \cdots i_{n-1}} m_j\, q_{j i}\, q_{i i_1} \cdots q_{i_{n-2} i_{n-1}}\, \eta_{f_j f_i f_{i_1} \cdots f_{i_{n-1}}(X)}, \]
as required, where we have used $m_i\, p_{ij} = m_j\, q_{ji}$.

Theorem. For a weakly hyperbolic recurrent IFS, the extended generalised Markov operator $H : P^1(\sum \mathbf{U}X) \to P^1(\sum \mathbf{U}X)$ has a unique fixed point $\nu^* \in S_\nu(\sum \mathbf{U}X)$, with support $\sum_{j=1}^{N} S(A^*) \times \{j\}$, where $A^*$ is the unique attractor of the IFS.

Proof. We know, by the proposition above, that any fixed point of $H$ is in $P_\nu(\sum \mathbf{U}X)$. Therefore it is sufficient to show that the least fixed point $\nu^*$ of
\[ H : P_\nu\Big(\sum \mathbf{U}X\Big) \to P_\nu\Big(\sum \mathbf{U}X\Big) \]
is unique. Using the explicit form of $\nu^*$ given above, we can show, as in the corresponding proof for a weakly hyperbolic IFS with probabilities, that $\nu^* \in S_\nu(\sum \mathbf{U}X)$. It then follows that $\nu^*$ is maximal in $P_\nu(\sum \mathbf{U}X)$ and hence is the unique fixed point. By the explicit formula, the support of $\nu^*$ is indeed $\sum_{j=1}^{N} S(A^*) \times \{j\}$. □

It then follows, similarly to the case of an IFS with probabilities, that $\mu^* = j(\nu^*)$ is the unique stationary distribution of the generalised Markov operator $T : M^1(\sum X) \to M^1(\sum X)$ defined earlier, and that the support of $\mu^*$ is $\sum_{j=1}^{N} A^* \times \{j\}$.

The Recurrent Probabilistic Power Domain Algorithm

The theorem above provides us with the recurrent algorithm to generate the stationary distribution of a recurrent IFS on the digitised screen. Given the recurrent IFS $\{X; f_j, p_{ij} \mid i, j = 1, \ldots, N\}$, where $X$ is contained in the unit square, consider the recurrent IFS tree with transitional probabilities depicted below. Let $\epsilon > 0$ be the discrete threshold.

Figure: The recurrent IFS tree with transitional probabilities; the roots $X \times \{1\}, \ldots, X \times \{N\}$ carry masses $m_1, \ldots, m_N$, and the edges carry the inverse transitional probabilities $q_{ij}$.

Initially, the set $X \times \{j\}$ is given mass $m_j$, which is then distributed amongst the nodes of the tree according to the inverse transitional probability matrix $(q_{ij})$. The algorithm first computes the unique stationary initial distribution $(m_j)$, by solving the equations $\sum_{i=1}^{N} m_i p_{ij} = m_j$, $1 \le j \le N$, together with $\sum_j m_j = 1$, with the Gaussian elimination method, and then determines the inverse transition probability matrix $(q_{ij})$ given by the formula above. The number of arithmetic computations for this is $O(N^3)$. Then the algorithm proceeds exactly as the probabilistic algorithm, computing, for each pixel, the sum of the weights $m_{i_1} q_{i_1 i_2} \cdots q_{i_{n-1} i_n}$ of the leaves $f_{i_1} f_{i_2} \cdots f_{i_n}(X)$ of the IFS tree which occupy that pixel. The number of computations for the latter is $O(N^h)$ as before, where $h = \lceil \log(\epsilon / |X|) / \log s \rceil$. Therefore the complexity of the algorithm is $O(N^{h'})$, where $h' = \max(h, 3)$.
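The preliminary linear-algebra step of the recurrent algorithm is illustrated below in Python: the stationary vector $(m_j)$ is obtained by solving $\sum_i m_i p_{ij} = m_j$, $\sum_j m_j = 1$, and the inverse transition matrix $(q_{ij})$ is computed from it; the tree traversal then proceeds exactly as in the probabilistic algorithm, with the leaf $f_{i_1} \cdots f_{i_n}(X)$ weighted by $m_{i_1} q_{i_1 i_2} \cdots q_{i_{n-1} i_n}$. The matrix used is made up for the example, and a least-squares solve stands in for Gaussian elimination.

```python
import numpy as np

def stationary_vector(P):
    """Solve m P = m, sum(m) = 1 for the row-stochastic matrix P."""
    N = len(P)
    A = np.vstack([np.asarray(P).T - np.eye(N), np.ones(N)])
    b = np.zeros(N + 1); b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

def inverse_transition(P, m):
    """Inverse transition matrix q_ij = m_j p_ji / m_i."""
    P = np.asarray(P)
    return np.array([[m[j] * P[j][i] / m[i] for j in range(len(m))] for i in range(len(m))])

# Illustrative 2-state recurrent IFS data (the matrix is made up).
P = [[0.9, 0.1],
     [0.4, 0.6]]
m = stationary_vector(P)
q = inverse_transition(P, m)
print(m, q.sum(axis=1))   # q is again row-stochastic
```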

The Expected Value of Continuous Functions

In this section, we will use the theory of generalised Riemann integration, developed in earlier work, to obtain the expected value of a continuous real-valued function with respect to the stationary distribution of a recurrent IFS. We first recall the basic notions from that work.

Let $X$ be a compact metric space and $g : X \to \mathbb{R}$ be a bounded real-valued function which is continuous almost everywhere with respect to a given normalised Borel measure $\mu$ on $X$. Then $g$ is R-integrable, and its R-integral $R\int g \, d\mu$ coincides with its Lebesgue integral $L\int g \, d\mu$. The R-integral can be computed as follows. We know that $\mu$ corresponds to a unique valuation $e(\mu) = \mu \circ s^{-1} \in S^1X \subseteq P^1\mathbf{U}X$ which is supported in $sX$. For any simple valuation $\nu = \sum_{b \in B} r_b \eta_b \in P^1\mathbf{U}X$, the lower sum of $g$ with respect to $\nu$ is
\[ S^\ell(g, \nu) = \sum_{b \in B} r_b \inf g(b), \]
and the upper sum of $g$ with respect to $\nu$ is
\[ S^u(g, \nu) = \sum_{b \in B} r_b \sup g(b). \]
Let $\langle \nu_n \rangle_{n \ge 0}$ be an $\omega$-chain of simple valuations in $P^1\mathbf{U}X$ with $\bigsqcup_n \nu_n = e(\mu)$. Then it follows that $S^\ell(g, \nu_n)$ is an increasing sequence of $n$ with limit $R\int_X g \, d\mu$, and $S^u(g, \nu_n)$ is a decreasing sequence with limit $R\int_X g \, d\mu$. We can also compute the R-integral by generalised Riemann sums as follows. For each $n \ge 0$, assume $B_n$ is a finite subset of $\mathbf{U}X$ and, for each $b \in B_n$, $r_{n,b} > 0$ with $\sum_{b \in B_n} r_{n,b} = 1$; let
\[ \nu_n = \sum_{b \in B_n} r_{n,b}\, \eta_b, \]
with, as above, $\bigsqcup_n \nu_n = e(\mu)$, and choose $\xi_{n,b} \in b$ for $b \in B_n$ and $n \ge 0$. Put
\[ S(g, \nu_n) = \sum_{b \in B_n} r_{n,b}\, g(\xi_{n,b}). \]
Then we have $S^\ell(g, \nu_n) \le S(g, \nu_n) \le S^u(g, \nu_n)$, and therefore
\[ \lim_{n \to \infty} S(g, \nu_n) = R\int g \, d\mu. \]

Now consider a weakly hyperbolic recurrent IFS $\{X; f_j, p_{ij} \mid i, j = 1, \ldots, N\}$. We would like to compute
\[ \int g \, d\mu^* = \sum_{j=1}^{N} \int g \, d\mu^*_j, \]
where $\mu^* = \langle \mu^*_1, \ldots, \mu^*_N \rangle$ is the unique stationary distribution and $g : X \to \mathbb{R}$ is a bounded real-valued function which is continuous almost everywhere with respect to each component $\mu^*_j$. We know that $e(\mu^*) = \nu^* = \bigsqcup_n \nu_n$, where $\nu_n = H^n(\nu_X)$ and each component $(H^n(\nu_X))_j$ is given by the explicit formula above. Fix an arbitrary point $x_0 \in X$ and, for each component $j$, $1 \le j \le N$, select $f_j f_{i_1} \cdots f_{i_{n-1}}(x_0) \in f_j f_{i_1} \cdots f_{i_{n-1}}(X)$, and define the Riemann sum for the $j$th component by
\[ S_j(g, n) = \sum_{i_1 i_2 \cdots i_{n-1}} m_j\, q_{j i_1} q_{i_1 i_2} \cdots q_{i_{n-2} i_{n-1}}\, g\big(f_j f_{i_1} \cdots f_{i_{n-1}}(x_0)\big). \]
Put $S(g, n) = \sum_{j=1}^{N} S_j(g, n)$. Then, by the results above, we have
\[ R\int g \, d\mu^* = \lim_{n \to \infty} S(g, n). \]
For an IFS with probabilities, we have $p_{ij} = p_j$ for $1 \le i, j \le N$, which implies $m_j = p_j$ and $q_{ij} = p_j$ for all $i, j$; the invariant measure $\mu^*$ of the IFS with probabilities can be expressed in terms of the stationary distribution $\langle \mu^*_1, \ldots, \mu^*_N \rangle$ of the recurrent IFS by $\mu^* = \sum_{j=1}^{N} \mu^*_j$, and we obtain
\[ R\int g \, d\mu^* = \lim_{n \to \infty} \sum_{i_1 i_2 \cdots i_n} p_{i_1} \cdots p_{i_n}\, g\big(f_{i_1} \cdots f_{i_n}(x_0)\big). \]
Compare this formula with Elton's ergodic formula above, which converges with probability one. For a hyperbolic IFS and a continuous function $g$, the above formula reduces to that of Hepting et al.
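For an IFS with probabilities, the Riemann-sum formula above can be evaluated directly. The following Python sketch computes $\sum_{i_1 \cdots i_n} p_{i_1} \cdots p_{i_n}\, g(f_{i_1} \cdots f_{i_n}(x_0))$ for a made-up one-dimensional example; by the preceding discussion this converges to $R\int g \, d\mu^*$ as $n \to \infty$.

```python
from itertools import product

def expected_value(maps, probs, g, n, x0=0.0):
    """Generalised Riemann sum approximation of ∫ g dμ* for an IFS with
    probabilities: sum over all length-n sequences of
    p_{i_1}...p_{i_n} * g(f_{i_1}...f_{i_n}(x0))."""
    total = 0.0
    for seq in product(range(len(maps)), repeat=n):
        x, w = x0, 1.0
        for i in reversed(seq):          # innermost map applied first
            x = maps[i](x); w *= probs[i]
        total += w * g(x)
    return total

# Illustrative IFS with probabilities on [0, 1] (maps/probabilities made up).
maps = [lambda x: x / 2, lambda x: x / 2 + 0.5]
probs = [0.3, 0.7]
print(expected_value(maps, probs, lambda x: x, n=12))   # ≈ mean of μ*
```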

For a hyperbolic recurrent IFS and a Lipschitz (or, more generally, Hölder continuous) map $g$, we can do better: we can obtain a polynomial algorithm to calculate the integral to any given accuracy. Suppose there exist $k > 0$ and $c > 0$ such that $g$ satisfies
\[ |g(x) - g(y)| \le c\, d(x, y)^k \]
for all $x, y \in X$. Let $\epsilon > 0$ be given. Then we have $|g(x) - g(y)| \le \epsilon$ if $d(x, y) \le (\epsilon / c)^{1/k}$. Put $n = \lceil \log\big((\epsilon / c)^{1/k} / |X|\big) / \log s \rceil$, where $s$ is the contractivity of the IFS. Then the diameter of the subset $f_{i_1} \cdots f_{i_n}(X)$ is at most $s^n |X| \le (\epsilon / c)^{1/k}$ for all sequences $i_1 i_2 \cdots i_n$, and hence the variation of $g$ on this subset is at most $\epsilon$. This implies that
\[ S^u(g, \nu_n) - S^\ell(g, \nu_n) = \sum_{i_1 \cdots i_n} m_{i_1} q_{i_1 i_2} \cdots q_{i_{n-1} i_n} \Big( \sup g\big(f_{i_1} \cdots f_{i_n}(X)\big) - \inf g\big(f_{i_1} \cdots f_{i_n}(X)\big) \Big) \le \epsilon \sum_{i_1 \cdots i_n} m_{i_1} q_{i_1 i_2} \cdots q_{i_{n-1} i_n} = \epsilon. \]
Since $S(g, n)$ and $\int g \, d\mu^*$ both lie between $S^\ell(g, \nu_n)$ and $S^u(g, \nu_n)$, we conclude that
\[ \Big| S(g, n) - \int g \, d\mu^* \Big| \le \epsilon. \]
Therefore $S(g, n)$, with $n = \lceil \log\big((\epsilon / c)^{1/k} / |X|\big) / \log s \rceil$, is the required approximation, and the complexity is $O(N^n)$.

References

M. F. Barnsley. Fractals Everywhere. Academic Press, second edition.

M. F. Barnsley and S. Demko. Iterated function systems and the global construction of fractals. Proceedings of the Royal Society of London A.

M. F. Barnsley, J. H. Elton and D. P. Hardin. Recurrent iterated function systems. Constructive Approximation.

M. F. Barnsley, V. Ervin, D. Hardin and J. Lancaster. Solution of an inverse problem for fractals and other sets. Proceedings of the National Academy of Sciences, April.

M. F. Barnsley and L. P. Hurd. Fractal Image Compression. AK Peters Ltd.

M. F. Barnsley, A. Jacquin, L. Reuter and A. D. Sloan. Harnessing chaos for image synthesis. Computer Graphics (SIGGRAPH proceedings).

M. F. Barnsley and A. D. Sloan. A better way to compress images. Byte Magazine, January.

U. Behn, J. L. van Hemmen, R. Kühn, A. Lange and V. A. Zagrebnov. Multifractality in forgetful memories. Physica D.

U. Behn and V. Zagrebnov. One-dimensional Markovian-field Ising model: physical properties and characteristics of the discrete stochastic mapping. J. Phys. A: Math. Gen.

P. C. Bressloff and J. Stark. Neural networks, learning automata and iterated function systems. In A. J. Crilly, R. A. Earnshaw and H. Jones, editors, Fractals and Chaos. Springer-Verlag.

R. L. Burden and J. D. Faires. Numerical Analysis. Addison-Wesley, 4th edition.

P. M. Diaconis and M. Shahshahani. Products of random matrices and computer image generation. Contemporary Mathematics.

S. Dubuc and A. Elqortobi. Approximations of fractal sets. Journal of Computational and Applied Mathematics.

A. Edalat. Dynamical systems, measures and fractals via domain theory (extended abstract). In G. L. Burn, S. J. Gay and M. D. Ryan, editors, Theory and Formal Methods. Springer-Verlag. Full paper to appear in Information and Computation.

A. Edalat. Domain theory and integration (extended abstract). In Logic in Computer Science, Ninth Annual IEEE Symposium, July, Paris, France. IEEE Computer Society Press. Full paper to appear in Theoretical Computer Science.

A. Edalat. Domain theory in learning processes. In S. Brookes, M. Main, A. Melton and M. Mislove, editors, Proceedings of the Eleventh International Conference on Mathematical Foundations of Programming Semantics, volume one of Electronic Notes in Theoretical Computer Science. Elsevier.

J. Elton. An ergodic theorem for iterated maps. Journal of Ergodic Theory and Dynamical Systems.

K. Falconer. Fractal Geometry. John Wiley & Sons.

W. Feller. An Introduction to Probability Theory and Its Applications. Wiley, London, 3rd edition.

Y. Fisher. Fractal Image Compression. Springer-Verlag.

S. Hayashi. Self-similar sets as Tarski's fixed points. Publications of the Research Institute for Mathematical Sciences.

J. L. van Hemmen, G. Keller and R. Kühn. Forgetful memories. Europhysics Letters.

D. Hepting, P. Prusinkiewicz and D. Saupe. Rendering methods for iterated function systems. In Proceedings IFIP Fractals.

J. E. Hutchinson. Fractals and self-similarity. Indiana University Mathematics Journal.

C. Jones. Probabilistic Non-determinism. PhD thesis, University of Edinburgh.

C. Jones and G. Plotkin. A probabilistic powerdomain of evaluations. In Logic in Computer Science. IEEE Computer Society Press.

J. G. Kemeny and J. L. Snell. Finite Markov Chains. D. Van Nostrand.

S. Kusuoka. A diffusion process on a fractal. In K. Ito and N. Ikeda, editors, Probabilistic Methods in Mathematical Physics (Proc. Taniguchi Intern. Symp., Katata and Kyoto). Kinokuniya.

S. Kusuoka. Dirichlet forms on fractals and products of random matrices. Publ. Res. Inst. Math. Sci.

J. D. Lawson. Valuations on continuous lattices. In Rudolf-Eberhard Hoffmann, editor, Continuous Lattices and Related Topics, Mathematik Arbeitspapiere. Universität Bremen.

B. Mandelbrot. The Fractal Geometry of Nature. W. H. Freeman and Co., San Francisco.

D. Monro, F. Dudbridge and A. Wilson. Deterministic rendering of self-affine fractals. In IEE Colloquium on Application of Fractal Techniques in Image Processing.

N. Saheb-Djahromi. CPOs of measures for nondeterminism. Theoretical Computer Science.

M. B. Smyth. Powerdomains. Journal of Computer and System Sciences, February.

V. Stoltenberg-Hansen, I. Lindström and E. R. Griffor. Mathematical Theory of Domains. Cambridge Tracts in Theoretical Computer Science. Cambridge University Press.

S. J. Vickers. Topology Via Logic. Cambridge Tracts in Theoretical Computer Science. Cambridge University Press.