Towards Autonomous Navigation of Miniature UAV

Total Pages: 16

File Type: PDF, Size: 1020 KB

Towards Autonomous Navigation of Miniature UAV

Roland Brockers (Jet Propulsion Laboratory, [email protected]), Martin Humenberger (Austrian Institute of Technology, [email protected]), Stephan Weiss (Jet Propulsion Laboratory, [email protected]), Larry Matthies (Jet Propulsion Laboratory, [email protected])

Abstract

Micro air vehicles such as miniature rotorcrafts require high-precision and fast localization updates for their control, but cannot carry large payloads. Therefore, only small and light-weight sensors and processing units can be deployed on such platforms, favoring vision-based solutions that use light-weight cameras and run on small embedded computing platforms. In this paper, we propose a navigation framework to provide a small quadrotor UAV with accurate state estimation for high-speed control, including 6DoF pose and sensor self-calibration. Our method allows very fast deployment without prior calibration procedures, literally rendering the vehicle a throw-and-go platform. Additionally, we demonstrate hazard-avoiding autonomous landing to showcase a high-level navigation capability that relies on the low-level pose estimation results and is executed on the same embedded platform. We explain our hardware-specific implementation on a 12 g processing unit and show real-world end-to-end results.

Figure 1. Asctec Hummingbird with Odroid-U2 flight computer mounted on top.

1. Introduction

Miniature rotorcraft platforms have several advantages in exploration and reconnaissance missions since they can operate in highly cluttered environments (forests, close to the ground) or confined spaces (indoors, collapsed buildings, caves) and offer stealth due to their small size. But in order to operate these platforms safely, fast and accurate pose estimation that is independent from external sensors (e.g. GPS) is needed for control.

Literature has shown that a viable solution for GPS-independent pose estimation is to use visual and inertial sensors [6, 11]. However, a major algorithmic challenge is to process sensor information at a high rate to provide vehicle control and higher-level tasks with real-time position information and vehicle states. Since micro rotorcrafts can only carry a few grams of payload including batteries, this has to be accomplished with a very small weight and power budget. Consequently, only light-weight sensors and processing units can be used on such platforms, favoring vision-based pose estimation solutions that use small light-weight cameras and MEMS (microelectromechanical systems) inertial sensors. As recent developments in multi-core smartphone processors are driven by the same size, weight, and power (SWaP) constraints, micro aerial vehicles (MAVs) can directly benefit from new products in this area that provide more computational resources at lower power budgets and low weight, enabling miniaturization of aerial platforms that are able to perform navigation tasks fully autonomously. Additionally, novel algorithmic implementations with minimal computational complexity, such as presented in this paper, are required.

Once pose estimation is available, higher-level autonomous navigation tasks which leverage and require this information can be executed. Examples of such tasks are autonomous landing, ingress, surveillance, and exploration. Autonomous landing is especially important not only for safety reasons, but also for mission endurance. Small rotorcrafts inherently suffer from overall short mission endurance, since payload restrictions do not allow carrying large batteries.
For surveillance or exploration tasks, endurance can be greatly improved by not requiring the platform to be airborne at all times. Instead, such tasks may even favor a steady, quiet observer at a strategic location (e.g. high vantage points like rooftops or the tops of telephone poles) - still with the ability to move if required - which could also include re-charging while in sleep mode (e.g. from solar cells). Such a capability allows long-term missions, but introduces additional constraints on the algorithm side: First, to eliminate memory and processing power issues, all algorithms need to run at constant complexity with a constant maximal memory footprint, independently of mission length and area, which is particularly true for the pose estimation module. Second, time and impacts with the environment may change sensor extrinsics. To ensure long-term operation, the pose estimator not only has to provide accurate pose information but also needs to continuously self-calibrate the system. Third, since a base station may not be within communication reach, all algorithms have to be self-contained and executed on-board the vehicle.

In this paper, we introduce a novel implementation of a fully self-contained, self-calibrating navigation framework on an embedded, very small form-factor flight computer (12 grams) that allows a small rotorcraft UAV (Figure 1) to be controlled autonomously in GPS-denied environments. Our framework enables quick deployment (throw-and-go) and performs autonomous landing on elevated surfaces as a high-level navigation task example, which we chose for three reasons: 1) Landing requires accurate and continuous pose estimation over large scale changes (i.e. from altitude to touch down); thus, it implicitly shows the capabilities of our underlying estimator. 2) It demonstrates how the information of our robust low-level pose estimator can be further leveraged and integrated to improve subsequent high-level tasks. 3) The computational complexity of 3D reconstruction and landing site detection needs to be tractable on such low-SWaP devices.

The remainder of the paper is organized as follows: Section 2 gives an overview of our vision-based pose estimation framework, including related work and a short introduction of its self-calibration and the underlying filter approach. Section 3 describes our autonomous landing algorithm and details our approach for 3D reconstruction and landing site detection. Section 4 introduces the on-board implementation, provides a performance evaluation, and presents experimental results, whereas Section 5 concludes the paper.

2. GPS Independent Pose Estimation

Autonomous flight in unknown environments excludes the use of motion capture systems, and using GPS is not always an option since it may be unavailable due to effects such as shadowing or multipath propagation in city-like environments. Therefore, commonly used sensors for pose estimation are stereo [8] and monocular cameras [20] as well as laser scanners [16]. Since heavy sensors cannot be used on low-SWaP platforms, monocular visual-inertial pose estimators might be the most viable choice for MAVs.

In our previous work, we demonstrated that pose estimation based on the fusion of visual information from a single camera and inertial measurements from an IMU can be used for simultaneously estimating pose information and system calibration parameters (such as IMU biases and sensor extrinsics). This self-calibrating aspect eliminates the need for pre-launch calibration procedures and renders the system power-on-and-go.

In our current framework, we propose a hierarchical architecture for our vision front-end with a downward-looking camera. Using the camera as a velocity sensor based on inertial-optical flow (IOF) allows us to estimate full attitude and drift-free metric distance to the overflown terrain [22], which we use for rapid deployment and emergency maneuvers. For accurate position estimation, we use a keyframe-based visual self-localization and mapping (VSLAM) strategy to render the camera into a 6DoF position sensor. The VSLAM approach is based on [7] but tailored to run on our embedded architecture with constant computational complexity by deploying a sliding-window, local mapping approach [20]. Whereas our velocity-based IOF approach drifts parallel to the terrain, our VSLAM approach is locally drift-free and better suited for long-term MAV navigation in large outdoor environments - only limited by the battery life-time and not by processing power or memory. But it requires specific motion for initialization. Since the IOF-based approach does not need any particular initialization, it can be used for very quick deployment of the MAV - even by simply throwing it in the air. Thus, for rapid deployment, the thrown MAV will quickly stabilize itself with our IOF approach and then autonomously initialize our VSLAM approach as shown in [21] - rendering it a throw-and-go GPS-independent system.

Both the pose from the VSLAM algorithm and the output of the IOF are fused with the measurements of an IMU using an Extended Kalman Filter (EKF). The EKF prediction and update steps are distributed among different processing units of the MAV because of their computational requirements, as described in [19]. The state of the filter is composed of the position p_w^i, the attitude quaternion q_w^i, and the velocity v_w^i of the IMU in the world frame, the gyroscope and accelerometer biases b_ω and b_a, and the metric scale factor λ. Additionally, the extrinsic calibration parameters describing the relative rotation q_i^s and position p_i^s between the IMU and the camera frames were also added. This yields a 24-element state vector X = {p_w^i, v_w^i, q_w^i, b_ω, b_a, λ, p_i^s, q_i^s}. Figure 2 depicts this setup.

Figure 2. Setup illustrating the robot body with its sensors with respect to a world reference frame. The system state vector is X = {p_w^i, v_w^i, q_w^i, b_ω, b_a, λ, p_i^s, q_i^s}; p_w^i and q_w^i denote the robot's position and attitude in the world frame.

[Figure: on-board landing pipeline - the latest input image passes through distortion removal and rectification into a frame list; feature extraction and matching plus stereo processing over recent frames (s0/p0, s-1/p-1, s-2/p-2, triggered when the baseline |T| > b) produce a landing map, which is evaluated by spatial analysis (does the vehicle fit?) and surface analysis (is the surface flat enough and free of obstacles?) before a landing waypoint is calculated.]
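To make the structure of this estimator concrete, the following minimal Python sketch assembles the 24-element state listed above and separates the high-rate IMU prediction from the lower-rate VSLAM position update. It is an illustrative sketch only, not the authors' on-board implementation: the class and variable names, the fixed update gain used in place of the full covariance-based correction, and the convention that the VSLAM position equals the metric IMU position multiplied by the scale factor λ are all assumptions.

import numpy as np

def quat_mult(a, b):
    # Hamilton product of two quaternions given as [w, x, y, z]
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([aw*bw - ax*bx - ay*by - az*bz,
                     aw*bx + ax*bw + ay*bz - az*by,
                     aw*by - ax*bz + ay*bw + az*bx,
                     aw*bz + ax*by - ay*bx + az*bw])

def quat_to_rot(q):
    # Rotation matrix (body to world) of a unit quaternion [w, x, y, z]
    w, x, y, z = q
    return np.array([[1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                     [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                     [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

class VisualInertialEkfSketch:
    GRAVITY = np.array([0.0, 0.0, -9.81])

    def __init__(self):
        self.p = np.zeros(3)                   # IMU position in the world frame
        self.v = np.zeros(3)                   # IMU velocity in the world frame
        self.q = np.array([1.0, 0.0, 0.0, 0.0])  # IMU attitude quaternion (body to world)
        self.b_w = np.zeros(3)                 # gyroscope bias
        self.b_a = np.zeros(3)                 # accelerometer bias
        self.lam = 1.0                         # metric scale of the visual map
        self.p_ic = np.zeros(3)                # camera-IMU translation (also estimated in the paper)
        self.q_ic = np.array([1.0, 0.0, 0.0, 0.0])  # camera-IMU rotation (also estimated in the paper)

    def propagate(self, gyro, accel, dt):
        # EKF prediction at IMU rate: integrate bias-corrected rates and accelerations
        w = gyro - self.b_w
        a_world = quat_to_rot(self.q) @ (accel - self.b_a) + self.GRAVITY
        angle = np.linalg.norm(w) * dt
        axis = w / np.linalg.norm(w) if np.linalg.norm(w) > 1e-9 else np.zeros(3)
        dq = np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))
        self.q = quat_mult(self.q, dq)
        self.q /= np.linalg.norm(self.q)
        self.p = self.p + self.v * dt + 0.5 * a_world * dt * dt
        self.v = self.v + a_world * dt

    def update_vslam_position(self, p_vision, gain=0.3):
        # Simplified correction at camera rate: the VSLAM position is assumed to be
        # the metric IMU position scaled by lambda; a real EKF would use the
        # measurement Jacobian and covariance instead of a fixed gain.
        residual = p_vision - self.lam * self.p
        self.p = self.p + gain * residual / self.lam

In the framework described above, the prediction step runs at IMU rate while the VSLAM and IOF corrections arrive at camera rate, which matches the distribution of prediction and update steps across processing units mentioned in connection with [19].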
Recommended publications
  • Mini Unmanned Aerial Systems (UAV) - A Review of the Parameters for Classification of a Mini UAV
    International Journal of Aviation, Aeronautics, and Aerospace Volume 7 Issue 3 Article 5 2020 Mini Unmanned Aerial Systems (UAV) - A Review of the Parameters for Classification of a Mini UAV. Ramesh PS Lovely Professional University, [email protected] Muruga Lal Jeyan Lovely professional university, [email protected] Follow this and additional works at: https://commons.erau.edu/ijaaa Part of the Aeronautical Vehicles Commons Scholarly Commons Citation PS, R., & Jeyan, M. L. (2020). Mini Unmanned Aerial Systems (UAV) - A Review of the Parameters for Classification of a Mini UAV. International Journal of Aviation, Aeronautics, and Aerospace, 7(3). https://doi.org/10.15394/ijaaa.2020.1503 This Literature Review is brought to you for free and open access by the Journals at Scholarly Commons. It has been accepted for inclusion in International Journal of Aviation, Aeronautics, and Aerospace by an authorized administrator of Scholarly Commons. For more information, please contact [email protected]. PS and Jeyan: Parameters for Classification of a Mini UAV. The advent of the Unmanned Aerial Vehicle (UAV) has redefined the battle space due to the ability to perform tasks which are categorised as dull, dirty, and dangerous. UAVs, re-designated as Unmanned Aerial Systems (UAS), are now being developed to provide cost-effective, efficient solutions for specific applications, both in the spectrum of military and civilian usage. The US Office of the Secretary of Defense (2013) describes a UAS as a "system whose components include the necessary equipment, network, and personnel to control an unmanned aircraft." In an earlier paper, the US Office of the Secretary of Defense (2005) specifies the UAV as the airborne element of the UAS and defines a UAV as "A powered, aerial vehicle that does not carry a human operator, uses aerodynamic forces to provide vehicle lift, can fly autonomously or be piloted remotely, can be expendable or recoverable, and can carry a lethal or non-lethal payload." John (2010) provided an excellent historical perspective on the evolution of UAVs.
  • Design and Autonomous Stabilization of a Ballistically-Launched Multirotor
    Design and Autonomous Stabilization of a Ballistically-Launched Multirotor Amanda Bouman1, Paul Nadan2, Matthew Anderson3, Daniel Pastor1, Jacob Izraelevitz3, Joel Burdick1, and Brett Kennedy3 Abstract—Aircraft that can launch ballistically and convert to autonomous, free-flying drones have applications in many areas such as emergency response, defense, and space exploration, where they can gather critical situational data using onboard sensors. This paper presents a ballistically-launched, autonomously-stabilizing multirotor prototype (SQUID - Streamlined Quick Unfolding Investigation Drone) with an onboard sensor suite, autonomy pipeline, and passive aerodynamic stability. We demonstrate autonomous transition from passive to vision-based, active stabilization, confirming the multirotor's ability to autonomously stabilize after a ballistic launch in a GPS-denied environment. I. INTRODUCTION Unmanned fixed-wing and multirotor aircraft are usually launched manually by an attentive human operator. Aerial systems that can instead be launched ballistically without operator intervention will play an important role in emergency response, defense, and space exploration where situational awareness is often required, but the ability to conventionally launch aircraft to gather this information is not available. Firefighters responding to massive and fast-moving fires could benefit from the ability to quickly launch drones through the forest canopy from a moving vehicle. This eye-in-the-sky could provide valuable information on the status of burning structures, fire fronts, and safe paths for rapid retreat. Fig. 1: Launching SQUID: inside the launcher tube (left), deploying the arms and fins (center), and fully-deployed configuration (right). Note the slack in the development safety tether and how the carriage assembly remains in the tube.
  • Towards an Autonomous Vision-Based Unmanned Aerial System Against Wildlife Poachers
    OLIVARES-MENDEZ, M.A., FU, C., LUDIVIG, P., BISSYANDÉ, T.F., KANNAN, S., ZURAD, M., ANNAIYAN, A., VOOS, H. and CAMPOY, P. 2015. Towards an autonomous vision-based unmanned aerial system against wildlife poachers. Sensors [online], 15(12), pages 31362-31391. Available from: https://doi.org/10.3390/s151229861 Towards an autonomous vision-based unmanned aerial system against wildlife poachers. OLIVARES-MENDEZ, M.A., FU, C., LUDIVIG, P., BISSYANDÉ, T.F., KANNAN, S., ZURAD, M., ANNAIYAN, A., VOOS, H. and CAMPOY, P. 2015 This document was downloaded from https://openair.rgu.ac.uk Article Towards an Autonomous Vision-Based Unmanned Aerial System against Wildlife Poachers Miguel A. Olivares-Mendez 1,∗, Changhong Fu 2, Philippe Ludivig 1, Tegawendé F. Bissyandé 1, Somasundar Kannan 1, Maciej Zurad 1, Arun Annaiyan 1, Holger Voos 1 and Pascual Campoy 2 Received: 28 September 2015; Accepted: 2 December 2015; Published: 12 December 2015 Academic Editor: Felipe Gonzalez Toro 1 Interdisciplinary Centre for Security, Reliability and Trust, SnT - University of Luxembourg, 4 Rue Alphonse Weicker, L-2721 Luxembourg, Luxembourg; [email protected] (P.L.); [email protected] (T.F.B.); [email protected] (S.K.); [email protected] (M.Z.); [email protected] (A.A.); [email protected] (H.V.) 2 Centre for Automation and Robotics (CAR), Universidad Politécnica de Madrid (UPM-CSIC), Calle de José Gutiérrez Abascal 2, 28006 Madrid, Spain; [email protected] (C.F.); [email protected] (P.C.) * Correspondence: [email protected]; Tel.: +352-46-66-44-5478; Fax: +352-46-66-44-35478 Abstract: Poaching is an illegal activity that remains out of control in many countries.
  • Design, Construction and Control of a Large Quadrotor Micro Air Vehicle
    Design, Construction and Control of a Large Quadrotor Micro Air Vehicle Paul Edward Ian Pounds September 2007 A thesis submitted for the degree of Doctor of Philosophy of the Australian National University Declaration The work in this thesis is my own except where otherwise stated. Paul Pounds Dedication For the past four years, I have been dogged by a single question. It was never far away - always on the lips of my friends and family, forever in my mind. This question haunted my days, kept me up at night and shadowed my every move. "Does it fly?" Without a doubt, the tenacity of this question played a pivotal role in my thesis. Its persistence and ubiquity beguiled me to go on and on, driving me to finish when I would have liked to have given up. I owe those people who voiced it gratitude for their support and motivation. So, I would like to thank you all by way of answering your question: Yes. Yes, it does. Acknowledgements This thesis simply could not have happened without the help of my colleagues, many of whom I have come to regard as not just co-workers, but as friends. Foremost thanks to my supervisor, Robert Mahony, for his constant support, friendly ear, helpful advice, paper editing skills and frequent trips to France. He's been invaluable in keeping me on the straight and true and preventing me from running off on pointless (yet fascinating) tangents. Special thanks to the members of my supervisory panel and research partners — Peter Corke, Tarek Hamel and Jongyuk Kim — for fielding questions, explaining maths in detail, reviewing papers and keeping me motivated.
  • International Journal for Scientific Research & Development
    IJSRD - International Journal for Scientific Research & Development| Vol. 7, Issue 10, 2019 | ISSN (online): 2321-0613 Analysis of Artificial Intelligence in Future Warfare G. Yashwanth1 S. Jaisharan2 1,2B.Sc Student 1,2Department of Computer Technology 1,2Sri Krishna Adithya College of Arts and Science, Coimbatore, Tamil Nadu-641042, India Abstract— Artificial Intelligence is the machine or software displayed intelligence. Artificial Intelligence (AI) is the simulation by computers, particularly computer systems, of human intelligence procedures. These procedures include learning, reasoning and self-correction. Artificial intelligence involves reasoning, natural language processing, and even different algorithms are used to bring the intelligence into the scheme. As artificial intelligence advances into sectors such as healthcare, finance, education and social media, countries around the globe are increasingly investing in other fields like automated weapons systems. Artificial Intelligence will be an important part of modern warfare. Military systems equipped with AI can handle bigger quantities of information more effectively compared to standard systems. ... well as in collecting and summarizing information supersets from a variety of sources, and this detailed analysis allows soldiers to spot patterns and draw connections. AI, if correctly coded, would have a small error rate compared to humans. They'd have amazing precision and velocity. They're not going to be influenced by hostile settings, so they're going to be able to finish hazardous assignments, discover a room, and endure issues that would hurt or kill us. It can replace people in repetitive, tedious duties and many toiling workplaces. AI can enhance an organization's logistics; it can enhance the pace at which choices are made and implemented.
  • Bio-Inspired Autonomous Visual Vertical Control of a Quadrotor UAV
    Bio-inspired Autonomous Visual Vertical Control of a Quadrotor UAV Mohamad T. Alkowatly, Victor M. Becerra and William Holderbaum, University of Reading, Reading, RG6 6AX, UK Abstract Near ground maneuvers, such as hover, approach and landing, are key elements of autonomy in unmanned aerial vehicles. Such maneuvers have been tackled conventionally by measuring or estimating the velocity and the height above the ground, often using ultrasonic or laser range finders. Near ground maneuvers are naturally mastered by flying birds and insects, as objects below may be of interest for food or shelter. These animals perform such maneuvers efficiently using only the available vision and vestibular sensory information. In this paper, the time-to-contact (Tau) theory, which conceptualizes the visual strategy with which many species are believed to approach objects, is presented as a solution for Unmanned Aerial Vehicle (UAV) relative ground distance control. The paper shows how such an approach can be visually guided without knowledge of height and velocity relative to the ground. A control scheme that implements the Tau strategy is developed employing only visual information from a monocular camera and an inertial measurement unit. To achieve reliable visual information at a high rate, a novel filtering system is proposed to complement the control system. The proposed system is implemented on-board an experimental quadrotor UAV and shown not only to successfully land and approach ground, but also to enable the user to choose the dynamic characteristics
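    The time-to-contact idea summarized in this abstract can be illustrated with a short sketch. Tau is the ratio of distance to closure rate; for a downward-looking camera descending over roughly flat ground it can be read off the divergence of the optical flow field, so neither height nor velocity needs to be known individually, and a simple regulator can then hold the rate of change of tau near a constant reference (in the idealized constant-tau-dot analysis, reference magnitudes between 0 and 0.5 give a touchdown with vanishing velocity and deceleration). The Python code below is a generic illustration of such a constant-tau-dot strategy with assumed names, gains, and sign conventions; it is not the controller or the filtering system proposed in the paper.

class TauDotLandingSketch:
    # Generic constant-tau-dot descent sketch (illustrative only).
    def __init__(self, k_ref=0.4, kp=2.0, dt=0.02):
        self.k_ref = k_ref      # desired |tau_dot|; 0 < k_ref < 0.5 for a soft touchdown (idealized)
        self.kp = kp            # proportional gain on the tau_dot error (assumed value)
        self.dt = dt            # control period in seconds
        self.prev_tau = None

    @staticmethod
    def tau_from_flow_divergence(divergence):
        # For pure descent over a flat surface the flow-field divergence is roughly
        # 2 * (descent rate / height), so tau = height / descent rate can be
        # estimated as 2 / divergence without knowing either quantity separately.
        return 2.0 / max(divergence, 1e-6)

    def step(self, divergence):
        # One control step; returns a vertical acceleration command (positive = upward).
        tau = self.tau_from_flow_divergence(divergence)
        if self.prev_tau is None:
            self.prev_tau = tau
            return 0.0
        tau_dot = (tau - self.prev_tau) / self.dt   # would be low-pass filtered in practice
        self.prev_tau = tau
        # An unregulated constant-speed descent gives tau_dot = -1; pushing tau_dot
        # toward -k_ref commands a deceleration that grows as the ground approaches.
        return -self.kp * (tau_dot + self.k_ref)

    In a full system the acceleration command would feed the vehicle's attitude and thrust loops, and the divergence estimate would come from the monocular camera and inertial measurement unit mentioned in the abstract.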
  • UAV Flight Plan 2008
    THE JOINT AIR POWER COMPETENCE CENTRE (JAPCC) FLIGHT PLAN FOR UNMANNED AIRCRAFT SYSTEMS (UAS) IN NATO 10 March 2008 Non Sensitive Information – Releasable to the Public TABLE of CONTENTS Executive Summary 3 1. Introduction 5 2. Current and Projected Capabilities 7 3. What is needed to Fill the Gaps 16 4. Problems and Recommendations 24 Annex A: References A-1 Annex B: NATO Unmanned Aircraft Systems - Operational B-1 B.1: High Altitude Long Endurance NATO UAS B-5 B.2: Medium Altitude Long Endurance NATO UAS B-9 B.3: Tactical NATO UAS B-23 B.4: Sensors for NATO UAS B-67 Annex C: Airspace Management and Command and Control C-1 C.1: European Airspace C-1 C.1.1: EUROCONTROL Specifications for the Use of Military Aerial Vehicles as Operational Traffic outside Segregated Airspace C-1 C.1.2: EASA Airworthiness Certification C-4 C.1.3: Safety Rules suggestions for Small UAS C-5 C.2: ICAO vs. FAA on Airspace Classification C-7 C.3: NATO Air Command and Control Systems in European Airspace C-11 C.4: List of National Laws pertaining to UAS C-12 Annex D: Unmanned Aircraft Systems Missions D-1 Annex E: Acronyms E-1 Annex F: Considerations regarding NATO procurement of its own UAS versus Individual Nations contributing UAS as they are willing and able F-1 EXECUTIVE SUMMARY NATO is in the process of improving their structures and working procedures to better fulfill the requirements of the new international security environment.
  • Write the Title of the Thesis Here
    DESIGN AND ANALYSIS OF AN UNMANNED SMALL-SCALE AIR-LAND-WATER VEHICLE ARHAMI A thesis submitted in fulfillment of the requirements for the award of the degree of Doctor of Philosophy Faculty of Mechanical and Manufacturing Engineering Universiti Tun Hussein Onn Malaysia APRIL 2013 ABSTRACT The designs of small-scale unmanned aerial, land and water vehicles are currently being researched in both civilian and military applications. However, these unmanned vehicles are limited to only one or two modes of operation. The problem arises when a mission requires two or three modes of operation: it takes two or three vehicles to support the mission. This thesis presents the design and analysis of an Unmanned Small-scale Air-Land-Water Vehicle (USALWaV). The vehicle, named a "three-mode vehicle", is aimed to be an innovative vehicle that could fly, move on land and traverse the water surface for search and rescue, and surveillance missions. The design process starts with an analysis of requirements to translate the customer needs using a Quality Function Deployment (QFD) matrix. A design concept was generated by developing an optimal configuration for the platform using QFD with a non-exhaustive comparison method. A detailed analysis of the concept was followed by designing, analyzing and developing the main components of the drive train, main frame, main rotor, rotor blades, control system, transmission system, pontoon mechanism and propeller system. Further, analysis was done to determine the vehicle weight, aerodynamics, and stability using several software packages: SolidWorks, XFLR5, and Matlab. Then, the model of the USALWaV was fabricated, tested and successfully operated using a remote control system.
  • Unmanned Ambitions
    Unmanned Ambitions Security implications of growing proliferation in emerging military drone markets www.paxforpeace.nl Colophon July 2018 PAX means peace. Together with people in conflict areas and concerned citizens worldwide, PAX works to build just and peaceful societies across the globe. PAX brings together people who have the courage to stand for peace. Everyone who believes in peace can contribute. We believe that all these steps, whether small or large, inevitably lead to the greater sum of peace. If you have questions, remarks or comments on this report you can send them to [email protected]. See also www.paxforpeace.nl. Authors Wim Zwijnenburg and Foeke Postma Editor Elke Schwarz Cover photo: 13 Turkish-made Bayraktar TB2 UAVs lined up in formation on a runway in 2017, © Bayhaluk / Wikimedia Commons / CC BY-SA 4.0 Graphic design Frans van der Vleuten Contact [email protected] We are grateful for the help and support of Dan Gettinger, Arthur Michel Holland, Alies Jansen, Frank Slijper, Elke Schwarz, and Rachel Stohl. Armament Research Services (ARES) was commissioned to provide technical content for this report. ARES is an apolitical research organisation supporting a range of governmental, inter-governmental, and non-governmental entities (www.armamentresearch.com). This report was made with the financial support of the Open Society Foundations. Contents 1. Executive Summary 4 2. Introduction 6 2.1 Dangerous Developments 6 2.2 Structure 7 3. Drone Capabilities and Markets 8 3.1 Expanding markets 9 3.2 Military market 10 4. Military Drone Developments 13 4.1 Drones on the battlefield 15 4.2 Loitering munitions 16 4.3 Other uses 16 5.
  • Does Unmanned Make Unacceptable? Exploring The
    Does Unmanned Make Unacceptable? Exploring the Debate on using Drones and Robots in Warfare Cor Oudes, Wim Zwijnenburg May 2011 ISBN 9789070443672 © IKV Pax Christi March 2011 IKV Pax Christi works for peace, reconciliation and justice in the world. We work together with people in war zones to build a peaceful and democratic society. We enlist the aid of people in the Netherlands who, like IKV Pax Christi, want to work for political solutions to crises and armed conflicts. IKV Pax Christi combines knowledge, energy and people to attain one single objective: peace now! If you have questions, remarks or comments on this report you can send them to [email protected]. See also: www.ikvpaxchristi.nl. We would like to thank Frank Slijper for his valuable comments and expertise. Furthermore, we would like to thank Merijn de Jong, David Nauta, Lambèr Royakkers and Miriam Struyk. IKV Pax Christi has done its utmost to retrieve the copyright holders of the images used in this report. Table of Contents: Introduction 2; 1 Unmanned systems: definitions and developments 4; 1.1 Unmanned Aerial Vehicles 4; 1.2 Unmanned Ground Vehicles 7; 1.3 Unmanned Underwater Vehicles and Unmanned Surface Vehicles 8; 1.4 Autonomous versus remote controlled 10; 2 Unmanned systems: deployment and use 11; 2.1 International use of unmanned systems 11; 2.2 Dutch use of unmanned systems at home and abroad 15; 3 Effects and dangers of unmanned systems on a battleground 18; 3.1 Pros and cons of unmanned systems 18; 3.2 Dehumanising warfare 21; 4 Ethical and legal issues and reflections 24; 4.1 Cultural context: risk-free warfare 24
  • Unmanned Aircraft System (UAS) Service Demand 2015 - 2035 Literature Review & Projections of Future Usage
    Unmanned Aircraft System (UAS) Service Demand 2015 - 2035 Literature Review & Projections of Future Usage Technical Report, Version 0.1 — September 2013 DOT-VNTSC-DoD-13-01 Prepared for: United States Air Force Aerospace Management Systems Division, Air Traffic Systems Branch (AFLCMC/HBAG) Hanscom AFB, Bedford, MA Notice This document is disseminated under the sponsorship of the Department of Defense in the interest of information exchange. The United States Government assumes no liability for the contents or use thereof. The United States Government does not endorse products or manufacturers. Trade or manufacturers’ names appear herein solely because they are considered essential to the objective of this report. Cover Page photo source: United States Air Force (www.af.mil) REPORT DOCUMENTATION PAGE Form Approved OMB No. 0704-0188 Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington Headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302, and to the Office of Management and Budget, Paperwork Reduction Project (0704-0188), Washington, DC 20503. 1. AGENCY USE ONLY (Leave blank) 2. REPORT DATE 3. REPORT TYPE AND DATES COVERED September 2013 Initial Technical Report 4. TITLE AND SUBTITLE 5a. FUNDING NUMBERS Unmanned Aircraft System (UAS) Service Demand 2015-2035: Literature Review and Projections of VHD2 Future Usage, Version 0.1 6.
  • Modeling the Human Visuo-Motor System for Remote-Control Operation
    Modeling the Human Visuo-Motor System for Remote-Control Operation A DISSERTATION SUBMITTED TO THE FACULTY OF THE GRADUATE SCHOOL OF THE UNIVERSITY OF MINNESOTA BY Jonathan Andersh IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF Doctor of Philosophy Advisors: Bérénice Mettler and Nikolaos Papanikolopoulos May, 2018 © Jonathan Andersh 2018 ALL RIGHTS RESERVED Acknowledgements I am grateful to all the people who have provided invaluable support, guidance, and assistance during my time at the University of Minnesota. First, I would like to thank my thesis advisor, Prof. Bérénice Mettler. She provided guidance, encouragement, and support over the course of my thesis work. The frequent discussions we had on new ideas and different ways to approach research problems helped keep the work moving forward. I would also like to thank my co-advisor Nikolaos Papanikolopoulos for sharing his experience and providing support. I want to express my appreciation to my other dissertation committee members, Prof. Volkan Isler and Dr. Vassilios Morellas, for their availability and their time. I would like to thank my colleagues and friends Navid Dadkhah, Zhaodan Kong, and Bin Li, at the Interactive Guidance and Control Lab, and Anoop Cherian from the Center for Distributed Robotics. They made working in the lab enjoyable. I could not ask for a better set of colleagues. Special thanks to the administrative staff of the Department of Computer Science and Engineering, and the Graduate School. Last, I would like to thank my mother, to whom I am eternally grateful. She has been a constant source of inspiration, stability, and encouragement.