Exploration, Interpolation and Extrapolation - Each Approached in a Different Manner


I dedicate this thesis to my husband, Amiel, for his constant support and unconditional love.

Acknowledgements

I would like to express my special appreciation and thanks to my PhD advisors, Professors Shlomi Dolev and Zvi Lotker, for supporting me during these past five years. You have been tremendous mentors for me. I would like to thank you for encouraging my research and for allowing me to grow as a research scientist. Your scientific advice and knowledge, and many insightful discussions and suggestions, have been priceless. I would also like to thank my committee members, Professor Eitan Bachmat, Professor Chen Avin, and Professor Amnon Ta-Shma, for their helpful comments and suggestions. A heartfelt thanks to the really supportive and active BGU community here in Beer-Sheva and to all my friends who made the research experience something special, in particular Ariel, Dan, Nisha, Shantanu, Martin, Nova, Eyal, Guy and Marina. Special thanks to Daniel and Elisa for proofreading my final draft.

A special thanks to my family. Words cannot express how grateful I am to my mother-in-law, father-in-law, mother, and father for all of the sacrifices that you have made on my behalf. Finally, I would like to acknowledge the most important person in my life, my husband Amiel. He has been a constant source of strength and inspiration. There were times during the past five years when everything seemed hopeless, and I can honestly say that it was only his determination and constant encouragement (and sometimes a kick on my backside when I needed one) that ultimately made it possible for me to see this project through to the end.
Abstract

The abundance of data is forcing us to redefine many scientific and technological fields, with the affirmation of any Big Data environment as a potential source of data. The advent of Big Data introduces important innovations: the availability of additional external data sources, previously unknown dimensions, and questionable consistency pose new challenges to computer scientists, demanding a general reconsideration that involves tools, software, methodologies and organizations.

This thesis investigates the problem of big data abstraction in the scope of exploration, interpolation and extrapolation. The driving vision of data abstraction is to turn the information overload into an opportunity: the goal of the abstraction is to make our way of processing data and information transparent for an analytic discourse, as well as a tool to complete missing information, predict unknown features, and filter noise and outliers. We confront three aspects of the abstraction problem with gradual levels of generalization. First, we focus on a specific data type and propose a novel solution for exploring the connectivity threshold of wireless data when the number of sensors approaches infinity. Second, we consider how to use polynomials to effectively and succinctly interpolate general data functions while tolerating noise as well as a bounded number of maliciously corrupted outliers. Third, we show how to represent a high-dimensional data set with incomplete information in a way that fulfills the demands of predictive modeling. Our main contribution lies in rethinking these problems in the context of massive amounts of data, which dictate large volumes and high dimensionality. Information extraction, exploration and extrapolation have a major impact on our society, and we believe that the topics investigated in this thesis can have a great practical influence.

Table of contents

List of figures
1 Introduction
  1.1 The Information Age
  1.2 Big Data Abstraction
    1.2.1 Big Data Exploration
    1.2.2 Big Data Interpolation
    1.2.3 Big Data Extrapolation
2 Probabilistic Connectivity Threshold for Directional Antenna Widths
  2.1 Introduction
  2.2 Preliminaries
    2.2.1 Notations
    2.2.2 Probability and the relation between Uniform and Poisson distributions
    2.2.3 Covering and Connectivity
  2.3 Centered Angles
    2.3.1 Finding the Connectivity Threshold
  2.4 Random Angle Direction
  2.5 Discussion
  2.6 Appendix
3 Big Data Interpolation using Functional Representation
  3.1 Introduction
  3.2 Discrete Finite Noise
    3.2.1 Handling the discrete noise
    3.2.2 Multidimensional Data
  3.3 Random Sample with Unrestricted Noise
    3.3.1 Polynomial fitting to noisy data
    3.3.2 Byzantine Elimination
  3.4 Discussion
4 Mending Missing Information in Big-Data
  4.1 Introduction
  4.2 Preliminaries
  4.3 k-flats Clustering
  4.4 Algorithm
  4.5 Experimental Studies of k-Flat Clustering
  4.6 Clustering with different group sizes
  4.7 Sublinear and distributed algorithms
  4.8 Discussion
  4.9 Conclusion
  4.10 Appendix: The probability of flats intersection
5 Conclusions
Nomenclature
References

List of figures

2.1 Directional antenna model
2.2 The communication graph over the disk and the disk's boundary
2.3 Covering vs. connectivity problems
2.4 Projecting nodes from the disk onto an antipodal pair on the boundary
2.5 Transforming the antipodal pair to a node on the boundary
2.6 Projecting a node from the boundary to a node on the disk
2.7 A node and its intercepted arc
2.8 The disk's cover expansion
2.9 Representing the three-dimensional variable of the graph using a torus
2.10 Representing the minimal coverage area by an annulus
2.11 The possible directions that induce adjacency
2.12 Generalization to convex fat objects with curvature > 0
4.1 Two-dimensional pair of flats intersecting a disk
4.2 The distance between the midpoint and the ball's center
4.3 Eliminating the irrelevant midpoints
4.4 Almost orthogonal flats pairwise intersection

Chapter 1

Introduction

1.1 The Information Age

Almost 35 years ago, Alvin Toffler [54] published his book "The Third Wave", in which he described three phases of human society's development based on the concept of 'waves', with each wave pushing the older societies and cultures aside. According to Toffler, civilization can be divided into three major phases. The First Wave is the settled agricultural society, which replaced the first hunter-gatherer cultures. The symbol of this age is the hoe, the profile of the wealthy person is the land owner, and battles were typically carried out with swords. The Second Wave is the industrial-age society, symbolized by the machine and beginning with the industrial revolution. At this time, the wealthy were the factory owners, and machines - tanks, aircraft, etc. - were used during times of war. The Third Wave is the post-industrial society. Toffler says that since the late 1950s, most countries have been transitioning into the Information Age. The symbol now is obviously the computer. The wealthy and powerful people are those that develop or collect the data and sell others the privilege to use it, and one of the main threats in this modern age is the cyber attack.

At the beginning of the 80's, no one could have imagined the significance and power that data would come to play in everyday life. Today, big data - a large pool of data that can be captured, communicated, aggregated, stored and analyzed - is part of every sector and function of the global economy [36].
The use of big data can create significant value for the world economy, enhancing the productivity and competitiveness of companies and the public sector, and creating a substantial economic surplus for consumers.

Big data challenges. There are many different definitions of the term 'Big Data'. Generally, the term refers to a massive amount of data, the size of which is beyond the ability of typical database software tools to capture, store, manage, and analyze. A popular characterization is the three V's: Volume, Variety and Velocity. By Volume, we usually mean the sheer size of the data, which is of course the major challenge and the most easily recognized. By Variety, we mean heterogeneity of data types, representations, and semantic interpretations. Velocity denotes both the rate at which data arrives and the speed at which it needs to be processed, for example, to perform fraud detection at a point of sale. Another important feature of big data is not only the huge number of items but also their 'wideness', i.e., each item maintains many fields. Hence, it is common to describe these items as objects in a high-dimensional space. High-dimensional objects have a number of unintuitive properties that are sometimes referred to as the 'curse of dimensionality' [6]. Multiple dimensions are hard to reason about and impossible to visualize, and due to the exponential growth of the number of possible values with each dimension, complete enumeration of all subspaces becomes intractable with increasing dimensionality. One manifestation of the 'curse' is that in high dimensions, almost all pairs of points are equally far away from one another and almost any two vectors are nearly orthogonal. Another manifestation is that high-dimensional functions tend to have more complex features than low-dimensional functions, and are hence harder to estimate.
Moreover, in order to obtain a statistically sound and reliable result, e.g., to estimate multivariate functions with the same accuracy as functions in low dimensions, the sample size must grow exponentially with the dimension. The emerging field of data science addresses these aspects of big data and provides solutions that are fundamentally multidisciplinary.
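Both manifestations of the 'curse' mentioned above are easy to observe numerically. The following sketch (an illustration, not part of the thesis; the dimensions and sample sizes are arbitrary choices) compares points drawn in 2 and in 1000 dimensions: pairwise distances concentrate around their mean, and random unit vectors become nearly orthogonal.

```python
import numpy as np

rng = np.random.default_rng(0)

def concentration_ratio(dim, n=100):
    """Spread of pairwise Euclidean distances relative to their mean,
    for n points drawn uniformly from the unit cube [0, 1]^dim."""
    pts = rng.random((n, dim))
    diffs = pts[:, None, :] - pts[None, :, :]          # all pairwise differences
    d = np.sqrt((diffs ** 2).sum(axis=-1))
    d = d[np.triu_indices(n, k=1)]                     # each unordered pair once
    return (d.max() - d.min()) / d.mean()

def mean_abs_cosine(dim, n=100):
    """Average |cos(angle)| between n independent pairs of random unit vectors."""
    u = rng.standard_normal((n, dim))
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    v = rng.standard_normal((n, dim))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    return np.abs((u * v).sum(axis=1)).mean()

for dim in (2, 1000):
    print(f"dim={dim:5d}  distance spread ratio={concentration_ratio(dim):.3f}"
          f"  mean |cos|={mean_abs_cosine(dim):.3f}")
```

In low dimension the spread of distances is comparable to the mean distance and random directions are often far from orthogonal; at dim=1000 the spread ratio collapses and the mean absolute cosine approaches zero.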