Fractal Dimension and Space Filling Curve

Total Pages: 16

File Type: PDF, Size: 1020 KB

Fractal Dimension and Space Filling Curve

Ana Karina Gomes, [email protected]
Instituto Superior Técnico, Lisboa, Portugal
May 2015

Abstract

Space-filling curves are computer-generated fractals that can be used to index low-dimensional spaces. Previous studies have used them as an access method in these scenarios, although very conservatively, applying them only up to four-dimensional spaces. Additionally, the alternative access methods tend to perform worse than a linear search once the space surpasses ten dimensions. Therefore, in the context of my thesis, I study space-filling curves and their properties, as well as the challenges presented by multidimensional data. I propose their use, specifically the Hilbert curve, as an access method for indexing multidimensional points up to thirty dimensions. I start by mapping the points to the Hilbert curve, generating their h-values. Then, I develop three heuristics to search for approximate nearest neighbors of a given query point, with the aim of testing the performance of the curve. Two of the heuristics use the curve as a direct access method, and the other uses the curve for secondary key retrieval combined with a B-tree variant. These result from an iterative process that consists of planning, conceiving and testing each heuristic; according to the test results, the heuristic is adjusted or a new one is created. Experimental results with the three heuristics show that the Hilbert curve can be used as an access method, and that it can operate at least in spaces of up to thirty-six dimensions.

Keywords: Fractals, Hilbert Space-Filling Curve, Low-Dimensional Indexing, Approximate Nearest Neighbor

1. Introduction

Data is becoming increasingly complex. It contains an extensive set of attributes and sometimes needs to be seen as part of a multidimensional space. Database Management Systems (DBMSs) have undergone several changes in their access methods in order to support this type of data. Throughout the years, several indexes have been proposed to handle multidimensional (or spatial) data, but the most popular is the R-tree. Despite all the effort dedicated to improving it, R-tree performance does not appear to keep up with the increase in dimensions. It is common for multidimensional indexes to suffer from the curse of dimensionality, which means they present a similarity search performance worse than a linear search in spaces that exceed ten dimensions [18, 24, 28]. Even with other indexes with better performance, Mamoulis considers that similarity search in these spaces is a difficult and unsolved problem in its generic form [18].

The approximate nearest neighbor is a similarity search technique that trades the accuracy of query results for reduced response time and memory usage. It is an invaluable search technique in high-dimensional spaces. Commonly, we use feature vectors to simplify and represent data in order to lower the complexity. The features are represented by points in the high-dimensional space, and the closer the points are in space, the more similar the objects are. Images, for instance, are usually represented by feature vectors, which are then used to find approximately similar objects; in this case, an approximate result is an acceptable solution. Multimedia data is only one example among several other applications. The approximate nearest neighbor, as its name suggests, does not guarantee the return of the exact neighbor, and its performance is highly related to the data distribution [27].

Ideally, in order to reduce the search time, all the multidimensional data that are related should be stored linearly on disk or in memory. However, this cannot be done, because it is impossible to guarantee that every pair of objects that are close in space will also be close in the linear order. It is impossible for all pairs, but not for some, and space-filling curves provide a total order of the points in space with a superior clustering property. Another problem related to multidimensional data concerns its indexing structures. Although there are several low-dimensional indexing structures, they present a problem that worsens as the number of dimensions increases: the volume of the space grows so fast that the data in the space become sparse. The curse of dimensionality appears when the spaces surpass roughly ten dimensions, making the performance of these structures comparable to, or worse than, a linear search [8]. Berchtold et al. note that they also suffer from the well-known drawbacks of multidimensional index structures, such as high costs for insert and delete operations and poor support for concurrency control and recovery. A possible solution to this problem is to use dimensionality reduction techniques such as space-filling curves [3, 8]. They allow mapping a d-dimensional space to a one-dimensional space in order to reduce the complexity. After this, it is possible to use a B-tree variant to store the data and take advantage of all the properties of these structures, such as fast insert, update and delete operations, good concurrency control and recovery, easy implementation and re-use of existing B-tree implementations. Additionally, the space-filling curve can easily be implemented on top of an existing DBMS [3].

For all the reasons mentioned above, the purpose of this paper is to study space-filling curves and their properties, as well as the challenges presented by multidimensional data. I explore the use of space-filling curves, specifically the Hilbert curve, as an access method for indexing multidimensional points. I also test three heuristics to search for an approximate nearest neighbor on the Hilbert curve. The main goal is to explore the potential of the Hilbert curve in low-dimensional spaces (up to thirty dimensions). These spaces may result from the application of indexing structures oriented to high-dimensional space, or from the application of dimensionality reduction techniques for subsequent indexing. In either situation, the resulting space, despite being of lower dimensionality, still requires a large computational effort to perform operations on the data.
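To make the general idea described above concrete, the following minimal sketch maps d-dimensional integer points to their h-values with a standard Hilbert-index algorithm (Skilling's transposition method) and answers approximate nearest-neighbor queries by scanning a small window of neighbors along the curve. It is only an illustration, not the thesis' actual heuristics; the window size, the grid resolution `bits`, and the sorted list standing in for a B-tree are assumptions made for this example.

```python
import bisect
from math import dist  # Python 3.8+: Euclidean distance

def hilbert_index(point, bits):
    """h-value of a point with integer coordinates in [0, 2**bits) per axis.

    Uses Skilling's axes-to-transpose algorithm, then interleaves the bits.
    """
    x = list(point)
    n = len(x)
    m = 1 << (bits - 1)
    # Inverse undo excess work
    q = m
    while q > 1:
        p = q - 1
        for i in range(n):
            if x[i] & q:
                x[0] ^= p                      # invert low bits of axis 0
            else:
                t = (x[0] ^ x[i]) & p          # exchange low bits of axes 0 and i
                x[0] ^= t
                x[i] ^= t
        q >>= 1
    # Gray encode
    for i in range(1, n):
        x[i] ^= x[i - 1]
    t = 0
    q = m
    while q > 1:
        if x[n - 1] & q:
            t ^= q - 1
        q >>= 1
    for i in range(n):
        x[i] ^= t
    # Interleave the transposed form into one integer (most significant bit first)
    h = 0
    for bit in range(bits - 1, -1, -1):
        for i in range(n):
            h = (h << 1) | ((x[i] >> bit) & 1)
    return h

class HilbertANN:
    """Toy index: points kept sorted by h-value (stand-in for a B-tree on h)."""

    def __init__(self, points, bits=10):
        self.bits = bits
        self.entries = sorted((hilbert_index(p, bits), p) for p in points)
        self.keys = [h for h, _ in self.entries]

    def approx_nearest(self, query, window=8):
        """Scan `window` curve neighbors on each side of the query's h-value."""
        pos = bisect.bisect_left(self.keys, hilbert_index(query, self.bits))
        lo, hi = max(0, pos - window), min(len(self.entries), pos + window)
        candidates = [p for _, p in self.entries[lo:hi]]
        return min(candidates, key=lambda p: dist(p, query))

if __name__ == "__main__":
    import random
    pts = [tuple(random.randrange(1024) for _ in range(6)) for _ in range(10_000)]
    index = HilbertANN(pts, bits=10)
    q = tuple(random.randrange(1024) for _ in range(6))
    print("approximate neighbor of", q, "->", index.approx_nearest(q))
```

Because close h-values tend to correspond to close points, a small window usually yields a good approximate neighbor, while the sorted keys (or a real B-tree) keep lookups logarithmic.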
2. Multidimensional Indexing

There are several structures oriented towards indexing multidimensional data. These vary according to their structure, whether or not they take the data distribution into account, whether they target points or regions, whether they are more suited to high- or low-dimensional spaces, etc. For example, structures such as Kd-trees [1, 2], hB-trees [17], Grid files [20] and Point Quad trees, among others, are oriented to indexing points. Structures such as the Region Quad trees [12], SKD-trees [21] or R-trees can handle both points and regions [22]. The Subspace-tree [25] and the LSD-tree are more suited to high dimensions, while the R-tree, the Kd-tree and space-filling curves are more suited to low-dimensional spaces. [...] Although there is no consensus on the best structure for multidimensional indexing, the R-tree is the most commonly used to benchmark the performance of new proposals. Furthermore, in commercial DBMSs, R-trees are preferred due to their suitability for points and regions and their orientation to low-dimensional space [28, 22]. The list of alternative access methods competing with fractals in low-dimensional space (two to thirty dimensions) is quite extensive. Therefore, I choose the R-tree and the Kd-tree as two possible alternatives to take a closer look at.

2.1. R-tree

Since multidimensional indexes are primarily conceived for secondary storage, the notion of a data page is a concern. These access methods perform a hierarchical clustering of the data space, and the clusters are usually covered by page regions. Each data page is assigned a page region, which is a subset of the data space. The page regions vary according to the index; in the R-tree case they are called minimum bounding rectangles (MBRs) [5]. At the highest-resolution level, each MBR represents a single object and is stored in the lower leaf nodes. These leaves also contain an identifier of the multidimensional point or region that represents some data.
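The page-region notion is easy to state in code. The fragment below is a minimal, illustrative MBR helper (mine, not taken from the paper): it computes the bounding rectangle of a set of points, its area, and the area enlargement needed to absorb a new entry, which is the quantity classic R-tree insertion tries to minimize when choosing a subtree.

```python
from typing import List, Sequence, Tuple

Point = Sequence[float]
MBR = Tuple[Tuple[float, ...], Tuple[float, ...]]  # (lower corner, upper corner)

def mbr_of(points: List[Point]) -> MBR:
    """Minimum bounding rectangle of a non-empty set of d-dimensional points."""
    d = len(points[0])
    lower = tuple(min(p[i] for p in points) for i in range(d))
    upper = tuple(max(p[i] for p in points) for i in range(d))
    return lower, upper

def area(box: MBR) -> float:
    lower, upper = box
    result = 1.0
    for lo, hi in zip(lower, upper):
        result *= (hi - lo)
    return result

def enlargement(box: MBR, point: Point) -> float:
    """Extra area needed for `box` to also cover `point` (0 if already inside)."""
    lower, upper = box
    new_lower = tuple(min(lo, x) for lo, x in zip(lower, point))
    new_upper = tuple(max(hi, x) for hi, x in zip(upper, point))
    return area((new_lower, new_upper)) - area(box)

if __name__ == "__main__":
    page = [(1.0, 2.0), (3.0, 5.0), (2.0, 4.0)]
    box = mbr_of(page)
    print("MBR:", box, "area:", area(box))
    print("enlargement for (6, 1):", enlargement(box, (6.0, 1.0)))
```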
The R-tree descends from the B+-tree and therefore preserves the height-balanced property. Thus, the higher levels contain clusters of objects, and since these are represented by rectangles, we also have clusters of rectangles [28, 22]. The R-tree search performance is related to the size of the MBRs. If there are too many overlapping regions, a search will intersect several subtrees of a node; all of them must be traversed, and this happens in a multidimensional space. Thus, the minimization of both coverage and overlap has a large impact on performance [28, 22]. Other proposals based on the R-tree have been presented over the years. The R*-tree tries to improve the R-tree by choosing page regions closer to a square shape rather than arbitrary rectangles. This shape allows a reduction in coverage area by reducing the overlap and hence improving the performance of the tree. Like the R*-tree, many other structures have been proposed based on the R-tree, such as the R+-tree, the X-tree and the Hilbert R-tree, among others. However, there is also another problem that haunts these hierarchical indexing structures, known as the curse of dimensionality. Its main underlying cause relates to the volume of the chosen form that covers the points: for a constant radius or edge size, this volume increases exponentially with the number of dimensions [4, 26].
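A small numeric illustration of that last point (my own example, not from the paper): the edge length a query hypercube needs in order to cover a fixed fraction of a unit data space grows towards 1 as the dimensionality d rises, so even "small" page regions end up spanning almost the full extent of every axis.

```python
# Edge length of a hypercube covering a fraction `selectivity` of the unit
# hypercube [0, 1]^d: edge = selectivity ** (1 / d).
for d in (2, 4, 10, 16, 30):
    edge = 0.01 ** (1.0 / d)        # region holding 1% of the space
    print(f"d = {d:2d}: required edge length = {edge:.3f}")
```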
Recommended publications
  • Harmonious Hilbert Curves and Other Extradimensional Space-Filling Curves
    Harmonious Hilbert curves and other extradimensional space-filling curves. Herman Haverkort. November 2, 2012. Abstract: This paper introduces a new way of generalizing Hilbert's two-dimensional space-filling curve to arbitrary dimensions. The new curves, called harmonious Hilbert curves, have the unique property that for any d′ < d, the d-dimensional curve is compatible with the d′-dimensional curve with respect to the order in which the curves visit the points of any d′-dimensional axis-parallel space that contains the origin. Similar generalizations to arbitrary dimensions are described for several variants of Peano's curve (the original Peano curve, the coil curve, the half-coil curve, and the Meurthe curve). The d-dimensional harmonious Hilbert curves and the Meurthe curves have neutral orientation: as compared to the curve as a whole, arbitrary pieces of the curve have each of d! possible rotations with equal probability. Thus one could say these curves are 'statistically invariant' under rotation, unlike the Peano curves, the coil curves, the half-coil curves, and the familiar generalization of Hilbert curves by Butz and Moore. In addition, prompted by an application in the construction of R-trees, this paper shows how to construct a 2d-dimensional generalized Hilbert or Peano curve that traverses the points of a certain d-dimensional diagonally placed subspace in the order of a given d-dimensional generalized Hilbert or Peano curve. Pseudocode is provided for comparison operators based on the curves presented in this paper. 1 Introduction. Space-filling curves: A space-filling curve in d dimensions is a continuous, surjective mapping from R to R^d.
  • Self-Similarity of Space Filling Curves
    Cardona, Luis F.; Múnera, Luis E. Self-Similarity of Space Filling Curves. Ingeniería y Competitividad, vol. 18, no. 2, pp. 113-124 (2016). Universidad del Valle, Cali, Colombia. Available at: http://www.redalyc.org/articulo.oa?id=291346311010. *Industrial Engineering, University of Louisville, KY, USA. **ICT Department, School of Engineering, Department of Information and Telecommunication Technologies, Faculty of Engineering, Universidad Icesi, Cali, Colombia. (Received: November 4, 2015; Accepted: April 5, 2016.) Abstract: We define exact self-similarity of Space Filling Curves on the plane. For that purpose, we adapt the general definition of exact self-similarity on sets, a typical property of fractals, to the specific characteristics of discrete approximations of Space Filling Curves. We also develop an algorithm to test exact self-similarity of discrete approximations of Space Filling Curves on the plane. In addition, we use our algorithm to determine exact self-similarity of discrete approximations of four of the most representative Space Filling Curves.
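As a concrete, simplified companion to the excerpt above (my own sketch, not the authors' algorithm), the snippet below checks one self-similarity relation of the discrete 2D Hilbert curve: the first quarter of the order-k approximation is, cell for cell, the order-(k-1) approximation reflected about the main diagonal.

```python
def d2xy(order, d):
    """Cell (x, y) visited at position d on the 2^order x 2^order Hilbert curve."""
    x = y = 0
    s, t = 1, d
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                     # rotate / reflect the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x, y = x + s * rx, y + s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_cells(order):
    return [d2xy(order, d) for d in range(4 ** order)]

def first_quadrant_is_reflected_previous_order(order):
    """Self-similarity check: quarter of order k == transpose of order k-1."""
    quarter = hilbert_cells(order)[: 4 ** (order - 1)]
    previous = [(y, x) for x, y in hilbert_cells(order - 1)]
    return quarter == previous

if __name__ == "__main__":
    for k in range(2, 7):
        print(f"order {k}: {first_quadrant_is_reflected_previous_order(k)}")
```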
  • Fractal Initialization for High-Quality Mapping with Self-Organizing Maps
    Neural Computing & Applications, DOI 10.1007/s00521-010-0413-5. ORIGINAL ARTICLE. Fractal initialization for high-quality mapping with self-organizing maps. Iren Valova, Derek Beaton, Alexandre Buer, Daniel MacLean. Received: 15 July 2008 / Accepted: 4 June 2010. © Springer-Verlag London Limited 2010. Abstract: Initialization of self-organizing maps is typically based on random vectors within the given input space. The implicit problem with random initialization is the overlap (entanglement) of connections between neurons. In this paper, we present a new method of initialization based on a set of self-similar curves known as Hilbert curves. Hilbert curves can be scaled in network size for the number of neurons based on a simple recursive (fractal) technique, implicit in the properties of Hilbert curves. We have shown that when using Hilbert curve vector (HCV) initialization in both the classical SOM algorithm and in a parallel-growing algorithm (ParaSOM), the neural network reaches better coverage and faster organization. 1.1 Biological foundations: Progress in neurophysiology and the understanding of brain mechanisms prompted an argument by Changeux [5] that man and his thought process can be reduced to the physics and chemistry of the brain. One logical consequence is that a replication of the functions of neurons in silicon would allow for a replication of man's intelligence. Artificial neural networks (ANN) form a class of computation systems that were inspired by early simplified models of neurons. Neurons are the basic biological cells that make up the brain. They form highly interconnected communication networks that are the seat of thought, memory, consciousness, and learning [4, 6, 15].
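The sketch below illustrates the general idea of curve-based initialization (a minimal illustration of mine, not the authors' HCV procedure): neuron weight vectors are spread evenly along a 2D Hilbert curve scaled to the input domain, so neighboring neurons start out untangled.

```python
def d2xy(order, d):
    """Cell (x, y) visited at position d on the 2^order x 2^order Hilbert curve."""
    x = y = 0
    s, t = 1, d
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x, y = x + s * rx, y + s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_initial_weights(n_neurons, order=5, domain=(0.0, 1.0)):
    """Place n_neurons weight vectors at evenly spaced positions along the curve."""
    side = 1 << order                    # grid resolution per axis
    lo, hi = domain
    scale = (hi - lo) / (side - 1)
    total = 4 ** order
    weights = []
    for k in range(n_neurons):
        d = k * (total - 1) // max(n_neurons - 1, 1)   # spread indices over the curve
        x, y = d2xy(order, d)
        weights.append((lo + x * scale, lo + y * scale))
    return weights

if __name__ == "__main__":
    for w in hilbert_initial_weights(8, order=3):
        print(w)
```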
  • Efficient Neighbor-Finding on Space-Filling Curves
    Universität Stuttgart. Efficient Neighbor-Finding on Space-Filling Curves. Bachelor Thesis. Author: David Holzmüller. Degree: B. Sc. Mathematik. Examiner: Prof. Dr. Dominik Göddeke, IANS. Supervisor: Prof. Dr. Miriam Mehl, IPVS. October 18, 2017. arXiv:1710.06384v3 [cs.CG] 2 Nov 2019. E-Mail: [email protected], where the ü in the last name has to be replaced by ue. Abstract: Space-filling curves (SFC, also known as FASS-curves) are a useful tool in scientific computing and other areas of computer science to sequentialize multidimensional grids in a cache-efficient and parallelization-friendly way for storage in an array. Many algorithms, for example grid-based numerical PDE solvers, have to access all neighbor cells of each grid cell during a grid traversal. While the array indices of neighbors can be stored in a cell, they still have to be computed for initialization or when the grid is adaptively refined. A fast neighbor-finding algorithm can thus significantly improve the runtime of computations on multidimensional grids. In this thesis, we show how neighbors on many regular grids ordered by space-filling curves can be found in an average-case time complexity of O(1). In general, this assumes that the local orientation (i.e. a variable of a describing grammar) of the SFC inside the grid cell is known in advance, which can be efficiently realized during traversals. Supported SFCs include Hilbert, Peano and Sierpinski curves in arbitrary dimensions. We assume that integer arithmetic operations can be performed in O(1), i.e. independent of the size of the integer.
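For context, here is the naive baseline such a thesis improves upon (a sketch of mine, not the thesis' O(1) method): to find a neighbor's position along the curve, simply re-encode the neighbor's coordinates, which costs time proportional to the curve's refinement level.

```python
def xy2d(order, x, y):
    """Position along the 2^order x 2^order Hilbert curve of grid cell (x, y)."""
    d = 0
    s = (1 << order) // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate / reflect quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        s //= 2
    return d

def neighbor_indices(order, x, y):
    """Curve positions of the face neighbors of cell (x, y); O(order) per neighbor."""
    side = 1 << order
    result = {}
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < side and 0 <= ny < side:
            result[(nx, ny)] = xy2d(order, nx, ny)
    return result

if __name__ == "__main__":
    print("cell (2, 3) is at", xy2d(4, 2, 3))
    print("its neighbors:", neighbor_indices(4, 2, 3))
```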
  • Math Morphing Proximate and Evolutionary Mechanisms
    Curriculum Units by Fellows of the Yale-New Haven Teachers Institute, 2009, Volume V: Evolutionary Medicine. Math Morphing Proximate and Evolutionary Mechanisms. Curriculum Unit 09.05.09 by Kenneth William Spinka. Contents: Introduction, Background, Essential Questions, Lesson Plans, Website, Student Resources, Glossary Of Terms, Bibliography, Appendix. Introduction: An important theoretical development was Nikolaas Tinbergen's distinction made originally in ethology between evolutionary and proximate mechanisms; Randolph M. Nesse and George C. Williams summarize its relevance to medicine: All biological traits need two kinds of explanation: proximate and evolutionary. The proximate explanation for a disease describes what is wrong in the bodily mechanism of individuals affected by it. An evolutionary explanation is completely different. Instead of explaining why people are different, it explains why we are all the same in ways that leave us vulnerable to disease. Why do we all have wisdom teeth, an appendix, and cells that if triggered can rampantly multiply out of control? [1] A fractal is generally "a rough or fragmented geometric shape that can be split into parts, each of which is (at least approximately) a reduced-size copy of the whole," a property called self-similarity. The term was coined by Benoît Mandelbrot in 1975 and was derived from the Latin fractus meaning "broken" or "fractured." A mathematical fractal is based on an equation that undergoes iteration, a form of feedback based on recursion. http://www.kwsi.com/ynhti2009/image01.html A fractal often has the following features: 1. It has a fine structure at arbitrarily small scales.
  • Algorithms for Scientific Computing
    Algorithms for Scientific Computing: Space-Filling Curves in 2D and 3D. Michael Bader, Technical University of Munich, Summer 2016. Classification of Space-filling Curves. Definition (recursive space-filling curve): A space-filling curve f : I → Q ⊂ R^n is called recursive, if both I and Q can be divided into m subintervals and subdomains, such that f*(I^(µ)) = Q^(µ) for all µ = 1, ..., m, and all Q^(µ) are geometrically similar to Q. Definition (connected space-filling curve): A recursive space-filling curve is called connected, if for any two neighbouring intervals I^(ν) and I^(µ) the corresponding subdomains Q^(ν) and Q^(µ) are also direct neighbours, i.e. share an (n − 1)-dimensional hyperplane. Connected, Recursive Space-filling Curves. Examples: all Hilbert curves (2D, 3D, ...), all Peano curves. Properties: connected, recursive SFC are continuous (more exactly: Hölder continuous with exponent 1/n), neighbourship-preserving, describable by a grammar, and describable in an arithmetic form (similar to that of the Hilbert curve). Related terms: face-connected, edge-connected, node-connected, ...; also used for the induced orders on grid cells, etc. Approximating Polygons of the Hilbert Curve. Idea: Connect start and end point of the iterate on each subcell. Definition: The straight connection of the 4^n + 1 points h(0), h(1 · 4^(−n)), h(2 · 4^(−n)), ..., h((4^n − 1)
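As a small companion sketch (the approximation of h(i · 4^(−n)) by grid-cell centers is my assumption, not the slides' exact definition), the following code lists the polyline vertices that trace the n-th Hilbert iterate in the unit square:

```python
def d2xy(order, d):
    """Cell (x, y) visited at position d on the 2^order x 2^order Hilbert curve."""
    x = y = 0
    s, t = 1, d
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x, y = x + s * rx, y + s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_iterate(order):
    """Vertices (cell centers in the unit square) of the order-n Hilbert iterate."""
    side = 1 << order
    return [((x + 0.5) / side, (y + 0.5) / side)
            for x, y in (d2xy(order, d) for d in range(4 ** order))]

if __name__ == "__main__":
    for p in hilbert_iterate(2):        # 16 vertices of the second iterate
        print(p)
```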
  • The Fractal Construction by Iterated Function Systems
    THE FRACTAL CONSTRUCTION BY ITERATED FUNCTION SYSTEMS. Elvira Goga, Polytechnic University of Tirana, Sheshi Nënë Tereza, Tirana. Abstract: This paper presents the theory and algorithms for computing fractals from Iterated Function Systems (IFS). The algorithms presented are the Deterministic Algorithm and the Random Iteration Algorithm in the space X = ℝ² with the Hausdorff metric. The Hausdorff metric is the most natural metric for comparing objects in an ideal geometric space, like the fractal objects illustrated in the paper. These algorithms are illustrated through visual means. The fundamental property of an Iterated Function System is that each IFS determines a unique attractor, which is typically a fractal. Any set of affine transformations and an associated set of probabilities determines an Iterated Function System. These probabilities play an important role in the computation of images of the attractor of an Iterated Function System using the Random Iteration Algorithm. They play no role in the Deterministic Algorithm. One of the advantages of using an Iterated Function System is that the dimension of the attractor is often relatively easy to estimate in terms of the defining contractions. A dimension contains much information about the geometrical properties of a set. This paper discusses the Hausdorff dimension and box dimension of the attractor of an Iterated Function System consisting of contractions that are similarities. Finally, this paper gives constructions of some fractals by Iterated Function Systems and finds the Hausdorff and box dimensions of some self-similar fractals. Keywords: fractal, iteration, deterministic algorithm, random algorithm, dimension. 1. Introduction: The fractal, as a mathematical phenomenon, dates back to the Weierstrass nowhere differentiable continuous function (Figure 7), to the classic Cantor set (Figure 8), to the Hilbert space filling curve (Figure 9), and even beyond.
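A minimal runnable version of the Random Iteration Algorithm for one familiar IFS, the three half-scale similarities whose attractor is the Sierpinski triangle (my choice of example; the paper treats IFS in general), together with its similarity dimension log 3 / log 2:

```python
import math
import random

# IFS: three similarities of ratio 1/2 pulling points towards the three
# vertices of a triangle; its attractor is the Sierpinski triangle.
VERTICES = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]

def random_iteration(n_points=50_000, seed=0):
    """Random Iteration Algorithm (chaos game) with equal probabilities 1/3 each."""
    rng = random.Random(seed)
    x, y = 0.5, 0.5
    points = []
    for i in range(n_points):
        vx, vy = rng.choice(VERTICES)
        x, y = (x + vx) / 2, (y + vy) / 2     # apply one of the three contractions
        if i > 20:                            # discard a short transient
            points.append((x, y))
    return points

if __name__ == "__main__":
    pts = random_iteration()
    print(len(pts), "attractor points, e.g.", pts[:3])
    # Similarity dimension for 3 maps of ratio 1/2: solve 3 * (1/2)**s = 1.
    print("similarity dimension:", math.log(3) / math.log(2))
```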
  • Fractint Formula for Overlaying Fractals
    Journal of Information Systems and Communication, ISSN: 0976-8742 & E-ISSN: 0976-8750, Volume 3, Issue 1, 2012, pp. 347-352. Available online at http://www.bioinfo.in/contents.php?id=45. FRACTINT FORMULA FOR OVERLAYING FRACTALS. Satyendra Kumar Pandey (1), Munshi Yadav (2) and Arunima (1). (1) Institute of Technology & Management, Gorakhpur, U.P., India. (2) Guru Tegh Bahadur Institute of Technology, New Delhi, India. Received: January 12, 2012; Accepted: February 15, 2012. Abstract: A fractal is generally a rough or fragmented geometric shape that can be split into parts, each of which is a reduced-size copy of the whole, a property called self-similarity. Because they appear similar at all levels of magnification, fractals are often considered to be infinitely complex. Approximate fractals are easily found in nature; these objects display self-similar structure over an extended, but finite, scale range. Fractal analysis is the modelling of data by fractals. In general, fractals can be any type of infinitely scaled and repeated pattern. It is possible to combine two fractals in one image. Keywords: Fractal, Mandelbrot Set, Julia Set, Newton Set, Self-similarity. Citation: Satyendra Kumar Pandey, Munshi Yadav and Arunima (2012) Fractint Formula for Overlaying Fractals. Journal of Information Systems and Communication, ISSN: 0976-8742 & E-ISSN: 0976-8750, Volume 3, Issue 1, pp. 347-352. Copyright: © 2012 Satyendra Kumar Pandey, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
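To illustrate what "combining two fractals in one image" can mean in practice (a generic escape-time sketch of mine, unrelated to the specific Fractint formulas in the paper), the snippet below computes Mandelbrot and Julia escape counts on the same grid and overlays them by taking, at each pixel, whichever escapes later:

```python
def escape_count(z, c, max_iter=60):
    """Iterations of z <- z^2 + c before |z| exceeds 2 (max_iter if it never does)."""
    for k in range(max_iter):
        if abs(z) > 2.0:
            return k
        z = z * z + c
    return max_iter

def overlay(width=60, height=30, julia_c=-0.8 + 0.156j):
    """Character rendering of max(Mandelbrot, Julia) escape counts per pixel."""
    rows = []
    for j in range(height):
        y = 1.2 - 2.4 * j / (height - 1)
        row = ""
        for i in range(width):
            x = -2.0 + 3.0 * i / (width - 1)
            p = complex(x, y)
            m = escape_count(0.0, p)          # Mandelbrot: z0 = 0, c = pixel
            jl = escape_count(p, julia_c)     # Julia: z0 = pixel, c fixed
            row += " .:-=+*#%@"[min(max(m, jl), 59) // 6]
        rows.append(row)
    return "\n".join(rows)

if __name__ == "__main__":
    print(overlay())
```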
  • Amplification Matrix Iteration Algorithm to Generate the Hilbert Peano Curve
    Transactions on Computer Science and Technology, June 2014, Volume 3, Issue 2, pp. 41-47. Amplification Matrix Iteration Algorithm to Generate the Hilbert-Peano Curve. Zhengwen Peng, Xin Lu. Department of Mathematics and Computer Science, Nanchang Normal University, 330032, China. Abstract: The methods used to construct the Hilbert-Peano curve are recursive and non-recursive, including methods based on L-grammar systems; different methods have different characteristics. In this document, we analyze the characteristics of the Hilbert-Peano curve from a non-recursive perspective, adopt an amplification characteristic-matrix iteration algorithm to construct the Hilbert-Peano curve, and compare this algorithm with the non-recursive algorithm in terms of time and space consumption. We show that this algorithm has more advantages at large scale than the non-recursive L-system algorithm. Keywords: Hilbert-Peano Curve; Fractal; Iteration Matrix. 1 Introduction: In Euclidean geometry, a one-dimensional object is a line or line segment, while two-dimensional objects belong to the plane. In 1890, however, the mathematician G. Peano, studying the G. Cantor set, attempted to use a curve to fill the two-dimensional plane; the mathematician Hilbert subsequently proved this feasible in 1891 and found the famous two-dimensional Hilbert-Peano plane-filling curve [1], as shown in Fig. 1. (Fig. 1: The Hilbert-Peano curve generated by 3 iterations.) At the time, mathematicians regarded the plane-filling curve as little more than a curiosity: a seemingly simple structure of line segments could form a complex curve that fills the whole plane without interruption or repetition, yet Euclidean geometry could not explain or measure it, and nobody could imagine that a century later it would be widely used in computer graphics and image processing.
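For reference, here is the standard L-grammar construction that the abstract mentions as a baseline (a generic sketch, not the paper's amplification-matrix algorithm): expand the rules A → +BF−AFA−FB+ and B → −AF+BFB+FA−, then walk the resulting string with a turtle.

```python
RULES = {"A": "+BF-AFA-FB+", "B": "-AF+BFB+FA-"}

def expand(axiom="A", iterations=3):
    s = axiom
    for _ in range(iterations):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

def turtle_walk(commands):
    """F = step forward, + = turn left 90 degrees, - = turn right 90 degrees."""
    x, y, dx, dy = 0, 0, 1, 0
    points = [(x, y)]
    for ch in commands:
        if ch == "F":
            x, y = x + dx, y + dy
            points.append((x, y))
        elif ch == "+":
            dx, dy = -dy, dx
        elif ch == "-":
            dx, dy = dy, -dx
    return points

if __name__ == "__main__":
    pts = turtle_walk(expand(iterations=3))
    print(len(pts), "vertices; first few:", pts[:5])   # 4**3 = 64 cells
```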
  • Interactive Evolutionary 3D Fractal Modeling
    Interactive Evolutionary 3D Fractal Modeling. PANG, Wenjun. A Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Philosophy in Mechanical and Automation Engineering, The Chinese University of Hong Kong, September 2009. Thesis / Assessment Committee: Professor Ronald Chung (Chair), Professor K.C. Hui (Thesis Supervisor), Professor Charlie C.L. Wang (Committee Member). ACKNOWLEDGEMENTS: My sincere appreciation goes to my supervisor, Professor Kinchuen Hui, who continuously orients my research towards the right direction, provides full academic and spiritual support and corrects my English in thesis writing. Without his constant guidance and encouragement, this work would not have been finished successfully. I would also express my deep gratitude to Professor Charlie C. L. Wang for his valuable advice and support to my research. All the fellow members in the Computer-Aided Design Lab and the staff in the Department General Office deserve my appreciation for making my life easier and happier during my study. I would like to express special thanks to Wang Chengdong, whose selfless help and encouragement give me so much confidence in my life and research. I am thankful to Yuki for helping me with her ideas that improved my understanding of the scope of my work. ABSTRACT: Research on fractal art has captured wide attention and gained considerable achievements in the past two decades. Most related works focus on developing two-dimensional fractal art, and fractal art tools usually just assist in the creation process, but cannot perform automatic generation associated with aesthetic evaluation. The percentage of visually attractive fractals generated by fractal art construction formulas, such as Iterated Function Systems, is not high.
  • Space-Filling Curves
    Chapter 2: Space-Filling Curves. Summary: We begin with an example of a space-filling curve and demonstrate how it can be used to find a short tour through a set of points. Next we give a general introduction to space-filling curves and discuss properties of them. We then consider the space-filling curve heuristic for the traveling salesperson problem and show how a corresponding order of the points can be computed fast, in particular in a probabilistic setting. Finally, we consider a discrete version of space-filling curves and present experimental results on discrete space-filling curves optimized for special tasks. 2.1 Introduction. 2.1.1 Example: Heuristic for Traveling Salesperson Problem. A space-filling curve maps a 1-dimensional space onto a higher-dimensional space, e.g., the unit interval onto the unit square. We will use space-filling curves in the form of the space-filling curve heuristic for the NP-hard Euclidean traveling salesperson problem [67]. The Euclidean traveling salesperson problem is the problem of finding the shortest closed tour through a set of points. Figure 2.1 shows a point set, tours through the point set constructed using the space-filling curve heuristic with two different space-filling curves, and the shortest tour through the point set. We demonstrate the space-filling curve heuristic for this task by the example of the two-dimensional Hilbert curve [78]. Consider the following construction. (Figure 2.1: Short tours through a point set: (a) point set, (b) heuristic using the Moore curve, (c) heuristic using the Sierpiński curve, (d) shortest tour.)
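The sketch below shows the heuristic in its simplest form (my own minimal version: coordinates are quantized to a 2^k grid and points are visited in Hilbert-index order; tie-breaking and the probabilistic analysis discussed in the chapter are ignored):

```python
import math
import random

def xy2d(order, x, y):
    """Position of integer grid cell (x, y) along the 2^order x 2^order Hilbert curve."""
    d = 0
    s = (1 << order) // 2
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        s //= 2
    return d

def sfc_tour(points, order=10):
    """Space-filling curve heuristic: visit points in the order the curve meets them."""
    side = 1 << order
    def key(p):
        x = min(int(p[0] * side), side - 1)   # quantize unit-square coordinates
        y = min(int(p[1] * side), side - 1)
        return xy2d(order, x, y)
    return sorted(points, key=key)

def tour_length(tour):
    return sum(math.dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

if __name__ == "__main__":
    rng = random.Random(1)
    pts = [(rng.random(), rng.random()) for _ in range(500)]
    print("random order length :", round(tour_length(pts), 2))
    print("Hilbert order length:", round(tour_length(sfc_tour(pts)), 2))
```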
  • Fractal Examples Handout
    (Adapted from “List of fractals by Hausdorff dimension”, Wikipedia)

    Approx. dimension | Exact dimension                             | Name
    1.12915           | 2 log 3 / log 7                             | Contour of the Gosper island
    1.2083            | 3 log φ / log((3 + √13) / 2)                | Fibonacci word fractal 60°
    1.2108            | -                                           | Boundary of the tame twindragon
    1.2619            | log 4 / log 3                               | Triflake
    1.2619            | log 4 / log 3                               | Koch curve
    1.2619            | log 4 / log 3                               | Boundary of the Terdragon curve
    1.2619            | log 4 / log 3                               | 2D Cantor dust
    1.2619            | log 4 / log 3                               | 2D L-system branch
    1.4649            | log 5 / log 3                               | Vicsek fractal
    1.4649            | log 5 / log 3                               | Quadratic von Koch curve (type 1)
    1.4961            | log(10/3) / log √5                          | Quadric cross
    1.5000            | log 8 / log 4 = 3/2                         | Quadratic von Koch curve (type 2)
    1.5849            | log 3 / log 2                               | 3-branches tree
    1.5849            | log 3 / log 2                               | Sierpinski triangle
    1.5849            | log 3 / log 2                               | Sierpiński arrowhead curve
    1.5849            | log 3 / log 2                               | Boundary of the T-square fractal
    1.61803           | φ (the golden ratio)                        | A golden dragon
    1.6309            | 1 + log 2 / log 3                           | Pascal triangle modulo 3
    1.6309            | 1 + log 2 / log 3                           | Sierpinski Hexagon
    1.6379            | 3 log φ / log(1 + √2)                       | Fibonacci word fractal
    1.6402            | solution of (1/3)^s + (1/2)^s + (2/3)^s = 1 | Attractor of IFS with 3 similarities of ratios 1/3, 1/2 and 2/3
    1.6667            | log 32 / log 8 = 5/3                        | 32-segment quadric fractal (1/8 scaling rule)
    1.6826            | 1 + log 3 / log 5                           | Pascal triangle modulo 5
    1.6990            | 1 + log 5 / log 10                          | 50-segment quadric fractal (1/10 scaling rule)
    1.7227            | 4 log 2 / log 5                             | Pinwheel fractal
    1.7712            | log 7 / log 3                               | Hexaflake
    1.7848            | log 4 / log(2(1 + cos 85°))                 | Von Koch curve 85°
    1.8617            | log 6 / log(1 + φ)                          | Pentaflake
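Many of the exact values above are similarity dimensions, i.e. solutions s of the Moran equation sum of r_i^s = 1 over the ratios r_i of the defining similarities. The short sketch below (mine, not part of the handout) solves that equation numerically by bisection and reproduces two rows of the table, the Koch curve and the three-ratio IFS attractor:

```python
import math

def similarity_dimension(ratios, tol=1e-12):
    """Solve sum(r**s for r in ratios) = 1 for s by bisection (all 0 < r < 1)."""
    def f(s):
        return sum(r ** s for r in ratios) - 1.0
    lo, hi = 0.0, 64.0                  # f(lo) > 0 and f(hi) < 0 for contractive ratios
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    print("Koch curve (4 maps, ratio 1/3):", similarity_dimension([1/3] * 4))
    print("            closed form       :", math.log(4) / math.log(3))
    print("IFS with ratios 1/3, 1/2, 2/3 :", similarity_dimension([1/3, 1/2, 2/3]))
```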