SELF-SIMILARITY IN IMAGING, 20 YEARS AFTER “FRACTALS EVERYWHERE”

Mehran Ebrahimi and Edward R. Vrscay
Department of Applied Mathematics
Faculty of Mathematics, University of Waterloo
Waterloo, Ontario, Canada N2L 3G1
[email protected], [email protected]

ABSTRACT

Self-similarity has played an important role in mathematics and some areas of science, most notably physics. We briefly examine how it has become an important idea in imaging, with reference to fractal image coding and, more recently, other methods that rely on the local self-similarity of images. These include the more recent nonlocal methods of image processing.

1. INTRODUCTION

Since the appearance of B. Mandelbrot’s classic work, The Fractal Geometry of Nature [28], the idea of self-similarity has played a very important role in mathematics and physics. In the late 1980s, M. Barnsley of Georgia Tech, with coworkers and students, showed that sets of contractive maps with associated probabilities, called “Iterated Function Systems” (IFS), could be used not only to generate fractal sets and measures but also to approximate natural objects and images. This gave birth to fractal image compression, which would become a hotbed of research activity over the next decade. Indeed, twenty years have passed since Fractals Everywhere [4], Barnsley’s beautiful exposition of IFS theory and its applications, was first published.

Historically, most fractal image coding research focussed on its compression capabilities, i.e., obtaining the best possible accuracy with the smallest possible domain pool. As a result, these investigations would rarely venture beyond observing what “optimal” domain blocks could provide. By taking a step back, however, and examining the statistics of how well image subblocks are approximated by other subblocks, at either the same scale or different scales, one sees that natural images are generally quite self-similar. This actually explains why fractal image coding – a nonlocal image processing method – “works” (with the acknowledgment that it no longer furnishes a competitive method of image compression).

Furthermore, these observations have led to the formulation of a general model of local affine image self-similarity [2] that includes, as special cases, cross-scale fractal coding and same-scale nonlocal means denoising. In other words, a number of nonlocal image processing methods may be viewed under a common theme of self-similarity.

In this paper, we outline the evolution of the idea of self-similarity from Mandelbrot’s “generators” to Iterated Function Systems and fractal image coding. We then briefly examine the use of fractal-based methods for tasks other than image compression, e.g., denoising and superresolution. Finally, we examine some extensions and variations of nonlocal image processing that are inspired by fractal-based coding.

2. FROM SELF-SIMILARITY TO FRACTAL IMAGE CODING

2.1. Some history

In The Fractal Geometry of Nature, B. Mandelbrot showed how “fractal” sets could be viewed as limits of iterative schemes involving generators. In the simplest case, a generator G acts on a set S (for example, a line segment) to produce N contracted copies of S and then arranges these copies in space according to a prescribed rule. Starting with an appropriate “initiator set” S0, the iteration procedure Sn+1 = G(Sn) converges, in the limit n → ∞, to a fractal set S. Last, but certainly not least, the set S is “self-similar,” meaning that arbitrarily small pieces of S are scaled-down copies of S. Consequently, S can be expressed as a union of contracted copies of itself.

Some of Mandelbrot’s examples were certainly not new. For example, in the classical construction of the ternary Cantor set, the usual “middle-thirds” dissection procedure is represented by a generator G that, acting on a line segment of length l, produces two contracted copies of length l/3 which are separated by a distance l/3. From this, the Cantor set is viewed as a union of two contracted copies of itself. (And, in turn, we obtain its fractal dimension D to be ln 2 / ln 3.) These simple examples, however, led to more general constructions, including random fractals. The result was a rather unified treatment of fractal geometry.

The next leap in fractal analysis and construction came with the work of J. Hutchinson [24]. One of the main ingredients was a set of N contractive maps wi : X → X on a complete metric space (X, d). Associated with these maps was a set-valued mapping w, defined as the union of the set-valued actions of the wi. (As such, w performed the shrink-and-placement procedure of Mandelbrot’s generator G.) Hutchinson showed that w is contractive in (H(X), h), the complete metric space of non-empty compact subsets of X with Hausdorff metric h. From Banach’s theorem, there exists a unique set A ∈ H(X), the attractor of the IFS w, which satisfies the fixed point relation

    A = w(A) = ∪_{i=1}^N wi(A).   (1)

A is self-similar since it can be expressed as a union of copies of itself.

Hutchinson also considered a set of probabilities pi associated with the spatial maps wi, 1 ≤ i ≤ N. These are used to define an operator M on the space M(X) of Borel probability measures on X: very briefly, the action of M on a measure µ is to produce N spatially-contracted copies of µ (via the wi) which are then weighted according to the respective probabilities pi. The operator M is contractive in (M(X), dM), where dM denotes the Monge–Kantorovich metric. Therefore there exists a unique measure µ̄ ∈ M(X) satisfying the fixed point relation

    µ̄(S) = (Mµ̄)(S) = Σ_{i=1}^N pi µ̄(wi^{-1}(S)),  ∀S ∈ H(X).   (2)

Moreover, the support of µ̄ is the IFS attractor A. The invariant measure µ̄ satisfies a more generalized self-similarity or self-tiling property.

M. Barnsley and S. Demko [5] independently discovered the use of such systems of mappings and associated probabilities for the construction of fractal sets and measures, coining the term “Iterated Function Systems” (IFS). Their analysis was framed in a more probabilistic setting and gave rise to the well-known chaos game algorithm for generating pictures of attractors and invariant measures.

Even more significant is the following: in the Barnsley/Demko paper was the first suggestion that IFS might be useful for the approximation of natural objects. This was the seed for the inverse problem of fractal-based approximation: given a “target” set S, for example, a leaf, can one find an IFS with attractor A that approximates S to some desired degree of accuracy?

Subsequently, Barnsley and students [6] showed how the inverse problem could be reformulated/simplified in terms of the following “Collage Theorem,” a simple consequence of Banach’s fixed point theorem: given a contraction map T : X → X with contraction factor c ∈ [0, 1) and fixed point x̄, then

    d(x, x̄) ≤ (1 − c)^{-1} d(x, Tx),  ∀x ∈ X.

To approximate a target x, one therefore looks for a transform T that maps x as close as possible to itself. For IFS on sets, cf. Eq. (1), this amounts to looking for ways in which a set S can be approximated by a union of contracted and distorted copies of itself. This procedure was illustrated very nicely for a generic leaf-shape in [6].

In [6] was also presented the infamous “Barnsley fern” – the attractor of a four-map IFS with probabilities (IFSP). But this “fern” was more than an IFS attractor set A – it was an invariant measure which could be represented on a computer screen as a shaded image. In other words, the construction of sets by IFS now becomes the construction of images by IFSP.

Naturally, the next question was: “Can IFS(P) be used to approximate other natural objects?” But an even more ambitious question was: “Can IFS(P) be used to approximate images?” There were various attempts (which will not be reviewed here, for lack of space), and Barnsley himself announced success to the world in the January 1988 issue of BYTE magazine [3], claiming that astronomical rates of data compression could be achieved with IFS image coding. However, the details of the IFS compression method were not revealed at that time for proprietary reasons. (Barnsley had been granted a software patent and subsequently founded Iterated Systems Incorporated.)

The thesis of A. Jacquin [25], one of Barnsley’s Ph.D. students, however, removed much of the mystery behind IFS image coding. Indeed, Jacquin’s seminal paper [26] described in sufficient detail the method of block-based fractal image coding which is still the basis of most, if not all, fractal-based image coding methods. It is, of course, overly ambitious to expect that an image can be well approximated with smaller parts of itself. Therefore, in the spirit of standard block-based image coding methods (e.g., DCT, VQ), range or child subblocks of an image were approximated by affine greyscale transformations of decimated versions of larger domain or parent subblocks. Although Jacquin’s original formulation was in terms of measures, the method is easily interpreted in terms of image functions.

Jacquin’s original paper launched an intensive activity in fractal image compression. One of the main drawbacks of fractal coding is the time required to search for optimal domain blocks. As such, there was much investigation on how to achieve the best quality with as little searching as possible. A discussion of some of these methods can be found in [19, 20, 27].

2.2. A closer look at block-based fractal image coding

Here we outline the most important features of fractal image coding.
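Before doing so, the generator picture of Section 2.1 can be made concrete. The following minimal sketch (our own illustration, not code from the paper) iterates the middle-thirds Cantor generator on a list of intervals:

```python
import math

def cantor_generator(intervals):
    """One application of Mandelbrot's generator G for the ternary Cantor set:
    each interval [a, b] is replaced by two copies contracted by 1/3,
    keeping the left and right thirds and discarding the middle third."""
    out = []
    for a, b in intervals:
        third = (b - a) / 3.0
        out.append((a, a + third))   # left contracted copy
        out.append((b - third, b))   # right contracted copy
    return out

# Iterate Sn+1 = G(Sn) from the initiator S0 = [0, 1].
S = [(0.0, 1.0)]
for _ in range(5):
    S = cantor_generator(S)

# After n steps: 2^n intervals of length 3^(-n); the similarity dimension
# D = ln 2 / ln 3 records this doubling-versus-thirding trade-off.
assert len(S) == 2 ** 5
assert all(abs((b - a) - 3.0 ** (-5)) < 1e-12 for a, b in S)
print(math.log(2) / math.log(3))  # ≈ 0.6309
```

After n steps the total length (2/3)^n tends to zero while the number of pieces doubles, which is exactly what the dimension D = ln 2 / ln 3 quantifies.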
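The IFSP machinery of Section 2.1 can likewise be illustrated with the chaos game. The sketch below uses the commonly quoted coefficients of the four-map “Barnsley fern” IFSP (these numerical values are the standard published ones, supplied here for illustration, not taken from this paper); the random orbit it produces is distributed according to the invariant measure of Eq. (2):

```python
import random

# The four affine maps of the commonly quoted "Barnsley fern" IFSP:
# wi(x, y) = (a x + b y + e, c x + d y + f), chosen with probability pi.
FERN_MAPS = [
    # (a,     b,     c,     d,    e,   f,    p)
    ( 0.00,  0.00,  0.00, 0.16, 0.0, 0.00, 0.01),  # stem
    ( 0.85,  0.04, -0.04, 0.85, 0.0, 1.60, 0.85),  # successively smaller leaflets
    ( 0.20, -0.26,  0.23, 0.22, 0.0, 1.60, 0.07),  # largest left-hand leaflet
    (-0.15,  0.28,  0.26, 0.24, 0.0, 0.44, 0.07),  # largest right-hand leaflet
]

def chaos_game(n_points=50000, seed=0):
    """Chaos game: iterate a randomly chosen map wi (with probability pi)
    from an arbitrary starting point.  The orbit settles onto the attractor
    and visits it with frequencies given by the invariant measure, Eq. (2)."""
    rng = random.Random(seed)
    weights = [m[6] for m in FERN_MAPS]
    x, y = 0.0, 0.0
    pts = []
    for k in range(n_points):
        a, b, c, d, e, f, _ = rng.choices(FERN_MAPS, weights=weights)[0]
        x, y = a * x + b * y + e, c * x + d * y + f
        if k > 20:  # discard the initial transient before the orbit reaches the attractor
            pts.append((x, y))
    return pts
```

Plotting the returned points (e.g., binning them into a 2-D histogram) reproduces the shaded-image rendering of the invariant measure described above: the support of the measure is the fern-shaped attractor A, and the visit frequencies supply the grey levels.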
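The range–domain matching at the heart of Jacquin-style coding can also be sketched in a few lines. The following is a minimal illustration under assumed choices (8×8 range blocks, 16×16 parent blocks decimated by 2×2 pixel averaging, a least-squares greyscale fit αD + β, no block isometries, exhaustive search); it is a sketch of the general idea, not the exact algorithm of [26]:

```python
import numpy as np

def encode_block_fractal(img, r=8):
    """Minimal sketch of block-based fractal encoding: approximate each
    r x r range block R by an affine greyscale map alpha*D + beta of a
    decimated 2r x 2r domain (parent) block D, keeping the best match."""
    H, W = img.shape
    # Build the domain pool: 2r x 2r parent blocks, decimated to r x r
    # by averaging each 2x2 group of pixels.
    pool = []
    for i in range(0, H - 2 * r + 1, r):
        for j in range(0, W - 2 * r + 1, r):
            parent = img[i:i + 2 * r, j:j + 2 * r].astype(float)
            D = parent.reshape(r, 2, r, 2).mean(axis=(1, 3))
            pool.append(((i, j), D))
    code = []
    for i in range(0, H, r):
        for j in range(0, W, r):
            R = img[i:i + r, j:j + r].astype(float)
            best = None
            for (pi, pj), D in pool:
                # Least-squares affine greyscale fit: minimise ||alpha*D + beta - R||.
                d, rr = D.ravel(), R.ravel()
                var = d.var()
                alpha = ((d - d.mean()) @ (rr - rr.mean())) / (var * d.size) if var > 0 else 0.0
                beta = rr.mean() - alpha * d.mean()
                err = np.sum((alpha * D + beta - R) ** 2)
                if best is None or err < best[0]:
                    best = (err, (pi, pj), alpha, beta)
            # Store, per range block: its position, the chosen domain, alpha, beta.
            code.append(((i, j), best[1], best[2], best[3]))
    return code
```

A decoder would iterate the stored (domain, α, β) transforms from an arbitrary starting image; keeping |α| < 1, as fractal coders typically enforce, makes the decoding operator contractive, so by Banach’s theorem the iteration converges to an approximation of the original, exactly in the spirit of the Collage Theorem above.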