Detection Thresholds for Vertical Gains in VR and Drone-Based Telepresence Systems

2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)

Keigo Matsumoto* (Cyber Interface Lab, The University of Tokyo)
Eike Langbehn† (Human-Computer Interaction, University of Hamburg)
Takuji Narumi‡ (Cyber Interface Lab, The University of Tokyo)
Frank Steinicke§ (Human-Computer Interaction, University of Hamburg)

*e-mail: [email protected]
†e-mail: [email protected]
‡e-mail: [email protected]
§e-mail: [email protected]

2642-5254/20/$31.00 ©2020 IEEE, DOI 10.1109/VR46266.2020.00-76

Figure 1: Illustration of the concept of a drone-based telepresence system using a vertical gain: (left) the user wears a virtual reality head-mounted display (HMD) and moves in the local environment (LE), (center) the drone-based telepresence system in the remote environment (RE), and (right) the user's view of the RE on the HMD.

ABSTRACT

Several redirected walking techniques have been introduced and analyzed in recent years, with the main focus on manipulations in horizontal directions, in particular by means of curvature, rotation, and translation gains. However, less research has been conducted on the manipulation of vertical movements and its possible use as a redirection technique. Yet vertical movements are fundamentally important, e.g., for remotely steering a drone using a virtual reality headset.

In this paper, we explore vertical gains, a novel redirection technique that enables us to purposefully manipulate the mapping of the user's physical vertical movements to movements in the virtual space and the remote space. This approach allows natural and more active physical control of a real drone. To demonstrate the usability of vertical gains, we implemented a telepresence drone and vertical redirection techniques for stretching and crouching actions using common VR devices. We conducted two user studies to investigate the effective manipulation ranges and usability: one study using a virtual environment (VE), and one using a camera stream from a telepresence drone. The results revealed that our technique can manipulate a user's vertical movement without her/him noticing.

Keywords: Drone, Vertical movement, Redirection, Telepresence

Index Terms: H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities

1 INTRODUCTION

The field and market of drones have progressed greatly over the past years. In particular, commercial drones have evolved rapidly; a drone equipped with advanced attitude control and a camera can be purchased from $100. These commercial drones open up many possibilities, one of which is their use in telepresence systems. Unlike other telepresence robots, telepresence drones can move freely in 3D space; in particular, they can move vertically, and can therefore execute a wide range of tasks, such as telecommunications [3, 31], sports [10], and remote operations [30]. For instance, drones are already used to inspect bridges and buildings, and in such situations, dynamic and precise vertical movement is important [12].

However, due to their ability to move in 3D space, drones are typically harder to operate than their 2D counterparts such as mobile robots. Currently, many drones are controlled by handheld devices, especially joysticks, while the user wears a virtual reality (VR) head-mounted display (HMD) in the local environment (LE) to see the remote environment (RE). Such devices are not easy to operate, and longer training is required before users can precisely control and steer a drone.

In previous studies, methods of manipulating a drone using the user's body movement were introduced and demonstrated to be effective [7, 9]. Although real body movements provide users with a high subjective sense of presence in a remote place, ascent and descent in flight are constrained by the range of human vertical movement; therefore, one of our goals was to control the drone with only small body movements while maintaining a natural sense of movement. On the other hand, high-precision movements may be required on-site, for example during safety inspections. Hence, another of our goals was to control the movement of the drone more finely than our actual body movement. It is also important to avoid VR sickness when performing those operations.

As a technique to manipulate the movement of a telepresence robot while maintaining natural user movements, redirected walking (RDW) has been proposed [24, 33, 34]. In the field of RDW, manipulations in horizontal directions have been the focus, e.g., translation, rotation, curvature, and bending gains [16, 23, 27, 28]. However, for operating drones, vertical movement is fundamentally important, and few research projects in the RDW field have addressed vertical movements [6, 21, 22].

In this paper, we explore vertical gains, a novel redirection technique that enables us to purposefully manipulate the mapping of the user's physical vertical movements to movements in the virtual space. This is intended to allow natural and more active physical control of a drone. With this technique, the user's physical movement distances while stretching and crouching in VR are intentionally expanded or contracted by the system using certain gains that determine the mapping between physical and virtual/remote movements. To demonstrate the possibilities afforded by vertical gains, we implemented a telepresence drone and vertical redirection techniques for stretching and crouching actions using standard VR devices. We conducted two user studies to investigate the manipulation ranges within which users do not notice the manipulation. Although we could also use noticeable manipulations, these might have side effects such as VR sickness. Hence, there is a need for detection thresholds of vertical movement manipulation. The results revealed that our methods can manipulate a user's vertical movement, without her/him noticing, with a gain larger than the conventional translation gain in the fore-aft direction. This means that the drone can perform more intense vertical movement than the user in the real world without the user noticing the manipulation.

In this paper, we focus on two basic vertical movements, i.e., (i) stretching and (ii) crouching. We investigate how much manipulation is noticeable in (1) a virtual environment (VE) and (2) the RE of a telepresence drone system. The reason for investigating the VE in addition to the RE is that depth perception in VEs and real space is known to differ significantly [11]. Further differences can be found in viewing angles and latency. The results of these experiments provide the basis for future immersive telepresence drone systems in which users can naturally elevate to explore remote places using small movements while maintaining a natural sense of movement.

Figure 2: Overview of experiment 1: (left) a user wears a virtual reality head-mounted display (HMD) and crouches in the local environment (LE), (center) another user wears a virtual reality HMD and stretches in the LE, and (right) the user's view of the VE on the HMD.

The contributions of this work are summarized in the following three points:

• We proposed a vertical gain for various vertical movements such as stretching and crouching.
• We measured detection thresholds for vertical gains in a VE and RE.
• We investigated the effects of latency and viewing angle on vertical gains by comparing the results obtained in the VE and RE.

2 RELATED WORK

In considering the introduction of redirection to telepresence drone systems, in this section we describe drones, telepresence systems, and redirected walking.

2.1 Drones

Unmanned aerial vehicles (UAVs), or drones, were originally developed for military use, but in recent years various commercial drones have become popular. When a drone performs yaw rotation and translational movement at the same time, its posture may become uneasy; in this way, the freedom of drones is limited to some extent. Commercial drones are often used for photography and entertainment, but are sometimes also used for education [15], haptic and tactile presentation [1, 2], and telepresence systems [7]. The most popular way to control a drone is by using a controller with joysticks, but there are other methods for manipulating the drone by gaze [5] or head movement [7, 9].

2.2 Telepresence drone systems

A telepresence drone can conduct a wide range of tasks including telecommunications [3, 31], remote operations [30], entertainment [8], sports [10], and safety inspections [12]. In such a system, not only the presentation of the RE but also natural and intuitive operation is important. Several studies have examined user movement as a potential input for interaction with telepresence drones. Higuchi and Rekimoto presented Flying Head, which synchronizes the user's head motions with the movements of a drone and can be easily manipulated with motions such as walking, looking around, and crouching [7, 9]. They tried one-to-one mapping, in which the amount of drone movement equals the amount of user movement, and one-to-two mapping, in which the drone moves 2 m if the user moves 1 m; however, their work did not consider whether the user is aware of these manipulations.

2.3 Redirected Walking

Redirected walking (RDW) is a technique that manipulates a user's spatial perception in the horizontal direction by viewpoint manipulation [24]. By using this technique, it is possible to compress vast VEs into a smaller real space. One of the most fundamental RDW techniques is the translation gain. The translation gain gT is defined as follows:

    gT = dvirtual / dreal    (1)

where dreal is the physical translation movement distance and dvirtual the simulated translation movement distance. For example, when a user walks forward 1 m with gain gT = 2, the user experiences 2 m of forward movement in the VE.

In the RDW field, many research projects have addressed horizontal movement [16, 17, 20, 29]; however, there are few
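The gain mapping in Eq. (1), and its vertical-gain analogue for stretching and crouching, can be sketched in a few lines of code. The following Python sketch is illustrative only: the function and parameter names are assumptions, not the authors' implementation, and a real system would feed the output to a virtual camera or a drone flight controller.

```python
def apply_gains(prev_real, curr_real, g_t=1.0, g_v=1.0):
    """Map a tracked real HMD movement to a virtual/remote camera movement.

    prev_real, curr_real: (x, y, z) HMD positions in metres, y pointing up.
    g_t: translation gain scaling horizontal motion (d_virtual = g_t * d_real).
    g_v: vertical gain scaling up/down motion (stretching and crouching).
    Returns the displacement to apply to the virtual/remote viewpoint.
    """
    dx = curr_real[0] - prev_real[0]
    dy = curr_real[1] - prev_real[1]
    dz = curr_real[2] - prev_real[2]
    # Horizontal components are scaled by the translation gain,
    # the vertical component by the vertical gain.
    return (g_t * dx, g_v * dy, g_t * dz)


# Eq. (1) example: walking 1 m forward with g_t = 2 yields 2 m virtually.
forward = apply_gains((0.0, 1.7, 0.0), (1.0, 1.7, 0.0), g_t=2.0)
print(forward)  # (2.0, 0.0, 0.0)

# Vertical-gain example: a 0.2 m crouch with g_v = 2 lowers the
# drone/viewpoint by roughly 0.4 m.
crouch = apply_gains((0.0, 1.7, 0.0), (0.0, 1.5, 0.0), g_v=2.0)
print(crouch)
```

A gain gV > 1 lets small physical crouch and stretch motions drive larger altitude changes of the drone, matching the paper's goal of controlling ascent and descent beyond the limited range of human vertical movement; a gain gV < 1 gives finer-than-body control for precision tasks such as inspections.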