POD: A Smartphone That Flies

Guojun Chen, Noah Weiner, Lin Zhong
Yale University

arXiv:2105.12610v1 [cs.HC] 26 May 2021

ABSTRACT

We present POD, a smartphone that flies, as a new way to achieve hands-free, eyes-up mobile computing. Unlike existing drone-carried user interfaces, POD features a smartphone-sized display and the computing and sensing power of a modern smartphone. We share our experience in prototyping POD, discuss the technical challenges facing it, and describe early results toward addressing them.

Figure 1: POD (top left corner) interacting with our test mannequin mounted on a Wi-Fi-controlled robotic chassis.

1. INTRODUCTION

Our driving vision is hands-free, eyes-up mobile computing. That is, mobile users are able to interact with a computer without holding a device in hand and looking down at it. The key to this vision is a user interface (both input and output) that does not require a human hand. Wearable devices, such as Google Glass, are perhaps the most studied implementation of this type of mobile, hands-free user interface.

This paper explores an alternative idea for these new user interfaces: one carried by a small drone. Compared to wearable user interfaces, drone-carried UIs do not impose gadgets on the user's head or eyes. More importantly, they can go beyond the physical reach of the human body, empowering interesting use cases.

While there is a growing literature about human-drone interaction [16, 31, 13], this work envisions a drone-carried UI as rich as that of modern smartphones, called POD. Unlike previously studied drone-carried UIs, POD features a small (potentially foldable) display and allows bidirectional interactions with mobile users. We imagine that POD would rest on the user's shoulder most of the time and fly out occasionally to interact with the user only on demand.

Compared to wearable glasses and other drone-carried UIs, POD faces its own unique challenges that have not been addressed by the human-drone interaction community. First, like all flying objects, drones are not steady. This challenges the usability of an onboard display, especially if the drone needs to maintain a safe distance of one to two feet from the user's eyes. Second, drones are noisy. Even the quietest drone operating at one or two feet away from the user produces noise at 70 dBA, which is equivalent to being on a busy street or near a vacuum cleaner [8]. Finally, because we intend POD to serve the user on the go, it must decide between following the user and hovering in place when the user moves around. This challenge represents a very specialized case of human-drone interaction and requires new investigation.

To address these challenges and experiment with POD, we have implemented a prototype. As shown in Figure 1, it is based on a mechanical, electrical and computational partnership between a micro-controller-based drone and a stripped-down smartphone (Google Pixel 4). It has a flight time of four minutes, uses computer vision to track and follow its user, and harnesses the phone's sensors to improve the usability of its display.
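The paper does not spell out the follow-me pipeline, but the sketch below illustrates the kind of vision-based control loop that "track and follow its user" implies. It is a minimal sketch under assumptions: the pedestrian detector, the gains, and the send_command callback are illustrative placeholders, not the prototype's actual implementation.

```python
# Minimal sketch of a vision-based "follow the user" step (illustrative only;
# the detector, gains, and send_command callback are assumptions, not POD's code).
import cv2

TARGET_BOX_HEIGHT = 0.45    # desired apparent height of the user in the frame
KP_YAW, KP_DIST = 0.8, 0.6  # proportional gains; real hardware would need tuning

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def follow_step(frame, send_command):
    """Detect the user in one camera frame and emit small flight corrections."""
    h, w = frame.shape[:2]
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    if len(boxes) == 0:
        send_command(yaw=0.0, pitch=0.0)                   # nobody visible: hold position
        return
    x, y, bw, bh = max(boxes, key=lambda b: b[2] * b[3])   # largest detection
    x_err = (x + bw / 2) / w - 0.5                         # horizontal offset from center
    dist_err = TARGET_BOX_HEIGHT - bh / h                  # box too small: user is far away
    send_command(yaw=KP_YAW * x_err, pitch=KP_DIST * dist_err)
```

A real controller would also smooth the detections and arbitrate between following and hovering, which is exactly the open question raised above.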
This paper describes the POD prototype and shares our experience in building it (§4). It presents our early results (§5) in addressing the challenges facing POD (§3) and discusses interesting, novel use cases for it (§2).

2. THE CASE FOR POD

We first describe several compelling applications of POD, a smartphone that flies. Because POD carries cameras and microphones, it can fill roles intended for existing cinematographic drones. We ignore these use cases and instead focus on those uniquely enabled by POD's display. We envision that POD will share much of the hardware and software of smartphones. Our prototype uses a Google Pixel 4, allowing it to tap into the mature ecosystem of Android, which facilitates app development and enables "retrofitting" existing smartphone apps for POD. This flexibility contrasts with other drone platforms that develop their own ecosystems, e.g., DJI and Skydio.

Hands-free "Smartphone": POD's hands-free experience opens up a host of new multitasking opportunities for mobile users in situations when holding a device in hand is either inconvenient or impossible. Users could make video calls or consume media while cooking, walking around the house, or jogging.

Personable Interaction: There is a growing literature about human-drone interaction with a focus on how users command a drone [13], e.g., using gestures [12, 23, 25] and touch controls [9]. Few works, however, examine how a drone could communicate with users directly, without using an intermediary device like a handheld terminal. For example, Szafir, Mutlu and Fong [30] studied how a drone could use an LED ring to communicate its flight intention.

POD, with its display, provides a much richer platform for this type of direct communication. For example, POD can use smiley faces to comfort the user or indicate its intention, which is important because many users are uncomfortable with a drone flying around at such close proximity. POD can also use arrows or other symbols to direct the user's attention.

First Response and Public Safety: We also envision POD being used in the context of first response and public safety. Its display can present instructions to first responders as they navigate difficult scenarios, like fires and disasters. It could present interactive maps or other informative graphics to those trapped in hard-to-reach areas. At the scenes of car accidents, police could send POD to facilitate a remote video call with the driver and inspect their license, or to review damages and interview witnesses on the scene.

3. OPEN CHALLENGES

While drones for cinematography have been extensively studied and widely used, drones as a personal device have not. Our experience with the POD prototype has revealed technical challenges unique to personal drones, especially those related to interacting with the user at such close proximity.

3.1 Display Usability

Drones are dynamic systems. Keeping them stationary in the air is a long-studied and difficult problem. In order to capture high-quality videos and photos, commercial cinematographic drones have gone a long way toward solving this problem. Unlike POD, they benefit from two factors: they usually hover far away from the target scene; and the impact of instability on the footage they take can be removed by post-processing.

POD, on the other hand, must present a stationary, readable display to its user, and at a much closer distance, with the screen content often being generated in real time. Figure 2 (red, dotted line) presents how the screen of our prototype POD moves along the vertical axis during one of its flights, despite its best effort to hover, as captured by a camera. Such motion poses significant usability challenges for the display, especially for reading small text. Using a drone-carried iPad, the authors of [29] found, not surprisingly, that a moving display carried by the drone requires a larger text font for readability, although the study did not isolate the impact of the drone's inability to stay stationary in the air, as is important to POD.

Figure 2: Y-axis (vertical) movement measured by a camera located 20 cm from our POD prototype while the latter was hovering. (The original plot shows y displacement in pixels over time in seconds, for both static and stabilized screen content.)
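The "stabilized content" trace in Figure 2 suggests counter-shifting the rendered content against the measured motion. The following is a minimal sketch of that general idea, assuming the displacement is supplied in screen pixels by the phone's sensors or an external camera; the filter constant, clamping, and rendering hook are illustrative choices, not the prototype's implementation.

```python
# Sketch: stabilize screen content by drawing it offset against measured drone
# motion. The displacement source and the rendering hook are hypothetical.

class ContentStabilizer:
    def __init__(self, alpha=0.2, max_shift_px=120):
        self.alpha = alpha                # low-pass factor for the motion estimate
        self.max_shift_px = max_shift_px  # keep content from drifting off-screen
        self.filtered_dy = 0.0

    def update(self, measured_dy_px):
        """measured_dy_px: vertical drone displacement mapped to screen pixels."""
        # Exponential moving average: follow slow drift, ignore per-frame jitter.
        self.filtered_dy += self.alpha * (measured_dy_px - self.filtered_dy)
        shift = -self.filtered_dy         # draw content opposite to the motion
        return max(-self.max_shift_px, min(self.max_shift_px, shift))

# Usage: offset the UI layer by the returned shift on each rendering frame.
stabilizer = ContentStabilizer()
for dy in [0.0, 8.5, 12.0, -3.0, -10.5]:  # made-up displacement samples
    print(round(stabilizer.update(dy), 2))
```

Whether such counter-shifting is enough for comfortable reading at one to two feet is precisely the usability question this section raises.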
3.2 Noise Suppression

Drones are noisy, and drone noise suppression has been studied extensively [21]. The noise worsens disproportionately as the user gets closer to the drone, and POD intends to serve its user at the intimate smartphone distance, much closer than cinematographic drones do, which exacerbates the noise problem. Our POD prototype, though quite small, can produce noise of up to 75 dBA when measured from two feet away. Commercial cinematographic drones are about as noisy, or noisier, due to their larger size [3]. Noise this loud is known to be harmful to humans over long periods of exposure [8]. Cinematographic drones usually operate far from users, and noise captured by their microphones can largely be removed through post-processing. In contrast, POD operates at the smartphone distance from the user, and its noise must be suppressed in real time.

Figure 3: Noise spectrum of our POD prototype. It has multiple strong tonal components (peaks) and a relatively wide band. (The original plot is annotated with an environment noise level of -20 dB.)

Many have explored using passive methods to reduce drone noise: covering the propellers with sound-absorbing materials and metal surfaces [22]; using ducted (or shielded) propellers [19]; using specially shaped propellers, e.g., DJI low-noise propellers [2]; and using propellers with more blades. These methods have had only limited success. For example, ducted propellers [19] reduce the noise by at most 7 dBA.

The synchrophaser system [17] is an active drone noise cancellation method. It requires that all propellers have exactly the same rotational speed and a fixed phase difference, so it only works for fixed-wing aircraft with symmetric sets of engines. Small rotorcraft like drones do not meet the synchrophaser requirements: their rotors constantly change speed to keep the craft stable. Noise cancellation techniques widely used in headphones can reduce noise at the ear. These techniques work well with low-frequency (< 2 kHz) and tonal noise. Unfortunately, as shown in Figure 3, drone noise consists of tonal noise at both high and low frequencies, as well as random wide-band noise above 6 kHz.
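The tonal-plus-broadband structure shown in Figure 3 can be made concrete with a short spectrum analysis. The sketch below uses synthetic data; the rotor harmonics, sample rate, and peak threshold are made-up values for illustration, not measurements from the prototype.

```python
# Sketch: separating tonal peaks from broadband noise in a drone-like signal.
# All frequencies and thresholds below are illustrative, not POD measurements.
import numpy as np

FS = 16000                                   # sample rate (Hz)
t = np.arange(0, 1.0, 1 / FS)

tones_hz = [190, 380, 760, 3040, 6400]       # stand-ins for rotor harmonics
signal = sum(np.sin(2 * np.pi * f * t) for f in tones_hz)
signal += 0.8 * np.random.randn(len(t))      # wide-band component

spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
freqs = np.fft.rfftfreq(len(signal), 1 / FS)

# Bins far above the median spectral level are the tonal components.
threshold = 20 * np.median(spectrum)
peak_freqs = freqs[spectrum > threshold]
print("tonal components near (Hz):", np.round(peak_freqs, 1))
```

On a real recording the peaks also shift with rotor speed, which is part of why fixed-phase schemes such as the synchrophaser do not transfer to small rotorcraft.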
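As a complement, a textbook LMS adaptive filter shows why the headphone-style techniques mentioned above handle a single low-frequency tone but leave the broadband part untouched. The reference tone, filter length, and step size below are assumed values for illustration only, not part of the POD prototype.

```python
# Sketch: LMS cancellation of one tonal component; the broadband noise remains.
# Parameters are illustrative assumptions, not part of the POD prototype.
import numpy as np

FS = 16000
t = np.arange(0, 1.0, 1 / FS)
tone = np.sin(2 * np.pi * 190 * t)        # one low-frequency rotor tone
noise = 0.5 * np.random.randn(len(t))     # broadband component
mic = tone + noise                        # signal reaching the user's ear

ref = np.sin(2 * np.pi * 190 * t + 0.7)   # reference correlated with the tone
L, mu = 32, 0.01                          # filter length and step size
w = np.zeros(L)
err = np.zeros(len(t))
for n in range(L, len(t)):
    x = ref[n - L:n][::-1]                # most recent reference samples
    y = w @ x                             # current estimate of the tone
    err[n] = mic[n] - y                   # residual after cancellation
    w += 2 * mu * err[n] * x              # LMS weight update

print("power before:", round(float(np.mean(mic ** 2)), 3))
print("power after :", round(float(np.mean(err[L:] ** 2)), 3))
```

The tone is removed while the random wide-band residual stays, matching the limitation noted above for applying headphone-style cancellation to drone noise.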