A Navigation System for Indoor/Outdoor Environments

Total Pages: 16

File Type: PDF, Size: 1020 KB

A NAVIGATION SYSTEM FOR INDOOR/OUTDOOR ENVIRONMENTS WITH AN UNMANNED GROUND VEHICLE (UGV)

by Milad Roigari

A Thesis Presented to the Faculty of the American University of Sharjah, College of Engineering, in Partial Fulfillment of the Requirements for the Degree of Master of Science in Mechatronics Engineering. Sharjah, United Arab Emirates, May 2015.

© 2015 Milad Roigari. All rights reserved.

Approval Signatures

We, the undersigned, approve the Master's Thesis of Milad Roigari.
Thesis Title: A Navigation System for Indoor/Outdoor Environments with an Unmanned Ground Vehicle (UGV)

  • Dr. Mohammad Abdel Kareem Rasheed Jaradat, Associate Professor, Department of Mechanical Engineering (Thesis Advisor)
  • Dr. Mamoun Abdel-Hafez, Associate Professor, Department of Mechanical Engineering (Thesis Co-Advisor)
  • Dr. Hasan Saeed Mir, Associate Professor, Department of Electrical Engineering (Thesis Committee Member)
  • Dr. Shayok Mukhopadhyay, Assistant Professor, Department of Electrical Engineering (Thesis Committee Member)
  • Dr. Mamoun Abdel-Hafez, Director, Mechatronics Engineering Graduate Program
  • Dr. Mohamed El Tarhuni, Associate Dean, College of Engineering
  • Dr. Leland Blank, Dean, College of Engineering
  • Dr. Khaled Assaleh, Director of Graduate Studies

Acknowledgements

I would like to express my gratitude to my supervisors, Dr. Mohammad Abdel Kareem Rasheed Jaradat and Dr. Mamoun Fahed Saleh Abdel-Hafez, Associate Professors in the Department of Mechanical Engineering, American University of Sharjah, for their support and guidance, and for providing me with the opportunity to work on a thesis related to my own area of interest. I would also like to thank Dr. Ali Jhemi, my previous supervisor, for his patience and support. My deepest thanks go to my parents and my sister for their kind and continuous support throughout my life. I would also like to thank my fellow students Alexander Adveev, Hesam Afzali, Muhannad Al-Omari, Rabiya Ahmed, Bara Emran, Syed Ali Rizvi and Amina Amoor for their friendship and for the good time that we spent together as Mechatronics students at AUS. I am also grateful to Ehab Al-Khatib and Wasim Al-Masri for their immense assistance during the testing phase of the thesis. Furthermore, I would like to thank Mr. Kent Bernales Roferos and Mr. John Mempin for their help and valuable suggestions in the Mechatronics lab of AUS. Finally, I gratefully acknowledge the graduate assistantship provided to me by AUS, without which the present study could not have been accomplished.

Abstract

This thesis presents an approach for solving the global navigation problem of wheeled mobile robots. The proposed solution for outdoor navigation uses an Extended Kalman Filter (EKF) to estimate the robot's location from the measurements of a Global Positioning System (GPS) receiver, an inertial measurement unit (IMU) and wheel encoders. For indoor navigation, where GPS signals are blocked, another probabilistic approach, based on Monte Carlo Localization (MCL), is used for localization.
This algorithm utilizes a map of the environment to estimate the posterior of the robot's pose using the depth measurements from a Kinect sensor. The output of the Kinect sensor is processed to imitate the output of a 2D laser scanner by projecting the points from a thin horizontal strip of pixels in the image plane to the corresponding real-world 3D coordinates using the pin-hole camera model (a short illustrative sketch of this projection is given after the list of figures below). Two different controllers, based on Dynamic Feedback Linearization (DFL) and Input-Output State Feedback Linearization (I-O SFL), were analyzed, simulated and compared. Based on the thesis objective and the simulation results, the I-O SFL method was chosen for solving the trajectory-tracking problem. A set of test experiments was conducted to evaluate the performance of the proposed system in outdoor, indoor and combined environments. The results show that the robot can successfully navigate through the way-points with high accuracy in indoor environments, while the accuracy in outdoor environments is within the 3 m position accuracy of the GPS.

Search Terms: Navigation, Extended Kalman Filter, Monte Carlo Localization, Input-Output State Feedback Linearization, Dynamic Feedback Linearization, Kinect, Depth Camera

Table of Contents

Abstract
List of Figures
List of Tables
List of Abbreviations
1. Introduction
  1.1 Background
  1.2 The Navigation Problem
  1.3 Applications
  1.4 Literature Review
  1.5 Software
    1.5.1 Programming Language
    1.5.2 Used Libraries
  1.6 Hardware
  1.7 Thesis Overview
2. System Setup
  2.1 Communication
3. Robot Motion
  3.1 Introduction
  3.2 Motion Model
    3.2.1 Unicycle
    3.2.2 Exact Motion
4. Trajectory Following
  4.1 Motion Control
  4.2 Dynamic Feedback Linearization
  4.3 Input-Output State Feedback Linearization
5. Outdoor Navigation
  5.1 Extended Kalman Filter
    5.1.1 Filter Design
    5.1.2 Simulation
  5.2 Navigation Software
    5.2.1 GUI
    5.2.2 Geodetic Transformation
  5.3 Practical Results
6. Indoor Navigation
  6.1 Monte Carlo Localization (MCL)
  6.2 Kinect
    6.2.1 Calibration
    6.2.2 Measurement
  6.3 Simulation
  6.4 Practical Results
7. Hybrid System (Indoor/Outdoor)
  7.1 Practical Result
8. Conclusion and Future Work
  8.1 Summary
  8.2 Future Work
References
Vita

List of Figures

Figure 1: Why multi-sensor data fusion?
Figure 2: A comparison of the posture stabilization controllers implemented on SUPERMARIO [35]
Figure 3: Particle filters have been used successfully for on-board localization of soccer-playing Aibo robots with as few as 50 particles [47]
Figure 4: General Hardware Setup
Figure 5: Block Diagram of the system
Figure 6: mBin Packet Frame
Figure 7: Fletcher Algorithm
Figure 8: First Loop
Figure 9: Second Loop: Initialize
Figure 10: Second Loop: Get byte
Figure 11: Second Loop: Check Sync
Figure 12: Second Loop: Check Checksum
Figure 13: Second Loop: Calculate Checksum
Figure 14: Second Loop: Packet Received
Figure 15: Third Loop
Figure 16: Robot pose in global configuration space
Figure 17: Motion carried out by a noise-free robot moving with constant velocities v and ω, starting at (x, y, θ)^T
Figure 18: Trajectory Following
Figure 19: Steering Block (LabVIEW Robotics)
Figure 20: Reference Trajectory of the Robot
Figure 21: DFL: Reference and Robot Trajectories
Figure 22: DFL: Robot motion: x (m), y (m), θ (rad)
Figure 23: DFL: Linear velocity
Figure 24: DFL: Angular velocity
Figure 25: DFL: Cartesian error (x, y)
Figure 26: DFL: Norm of the error
Figure 27: DFL: Robot following a square-shaped trajectory (the brighter the color, the closer the robot is to the final position)
Figure 28: DFL: Angular velocity of the robot following a square-shaped trajectory
Figure 29: Point B outside the wheel axle with distance b ≠ 0
Figure 30: Pulling a toy car with a piece of rope
Figure 31: I-O SFL: Reference and Robot Trajectories
Figure 32: I-O SFL: Robot motion: x (m), y (m), θ (rad)
Figure 33: I-O SFL: Linear velocity
Figure 34: I-O SFL: Angular velocity
Figure 35: I-O SFL: Cartesian error (x, y)
Figure 36: I-O SFL: Norm of the error
Figure 37: I-O SFL: Point B (yellow) is used by the robot to track a square-shaped path
Figure 39: The Extended Kalman Filter algorithm (for the full mathematical derivation of the EKF refer to [20])
Figure 40: Flowchart: The basic outdoor navigation process
Figure 41: EKF with encoder and heading measurement
Figure 42: EKF with encoder, heading and noisy GPS measurements
Figure 43: EKF used with the I-O SFL algorithm to follow the square-shaped trajectory
Figure 44: Overall robot pose along the path (the robot's color becomes brighter as it gets closer to the end of the path)
Figure 45: Main features of the GUI
Figure 46: Map features include.
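As a concrete illustration of the depth-to-scan conversion described in the abstract, the snippet below is a minimal sketch (not the author's code) of how one horizontal strip of Kinect depth pixels can be projected through the pin-hole camera model into planar range/bearing pairs that imitate a 2D laser scan. The intrinsics fx and cx are assumed, Kinect-like placeholder values; in the thesis the real values would come from the calibration step of Chapter 6, and the strip width and height may differ.

    import numpy as np

    def depth_strip_to_scan(depth_row_m, fx=525.0, cx=319.5):
        """Project one horizontal strip of Kinect depth values (metres) onto the
        sensor plane, returning (range, bearing) pairs like a 2D laser scan.

        fx (focal length in pixels) and cx (principal point) are assumed
        intrinsics; actual values come from camera calibration.
        """
        u = np.arange(depth_row_m.shape[0])        # pixel column indices
        z = depth_row_m                            # depth along the optical axis
        x = (u - cx) * z / fx                      # lateral offset from the pin-hole model
        valid = z > 0                              # Kinect reports 0 for unknown depth
        ranges = np.hypot(x[valid], z[valid])      # distance of each point in the plane
        bearings = np.arctan2(x[valid], z[valid])  # angle of each synthetic "beam"
        return ranges, bearings

    if __name__ == "__main__":
        # Synthetic example: a flat wall 2 m in front of a 640-pixel-wide sensor.
        wall = np.full(640, 2.0)
        r, b = depth_strip_to_scan(wall)
        print(r.min(), r.max(), np.degrees(b.min()), np.degrees(b.max()))

Ranges and bearings in this form can then be passed to the MCL measurement model exactly as a 2D laser scan would be.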
Recommended publications
  • Kinematic Modeling of a RHex-Type Robot Using a Neural Network
    Kinematic Modeling of a RHex-type Robot Using a Neural Network. Mario Harper (Department of Scientific Computing, Florida State University, Tallahassee, FL, USA), James Pace, Nikhil Gupta, Camilo Ordonez, and Emmanuel G. Collins, Jr. (Department of Mechanical Engineering, Florida A&M - Florida State University, College of Engineering, Tallahassee, FL, USA).
    ABSTRACT: Motion planning for legged machines such as RHex-type robots is far less developed than motion planning for wheeled vehicles. One of the main reasons for this is the lack of kinematic and dynamic models for such platforms. Physics-based models are difficult to develop for legged robots due to the difficulty of modeling the robot-terrain interaction and their overall complexity. This paper presents a data-driven approach to developing a kinematic model for the X-RHex Lite (XRL) platform. The methodology utilizes a feed-forward neural network to relate gait parameters to vehicle velocities (a toy sketch of such a mapping follows this entry).
    Keywords: Legged Robots, Kinematic Modeling, Neural Network
    1. INTRODUCTION: The motion of a legged vehicle is governed by the gait it uses to move. Stable gaits can provide significantly different speeds and types of motion. The goal of this paper is to use a neural network to relate the parameters that define the gait an X-RHex Lite (XRL) robot uses to move to its angular, forward and lateral velocities. This is a critical step in developing a motion planner for a legged robot. The approach taken in this paper is very similar to approaches taken to learn forward predictive models for skid-steered robots; for example, this approach was used to learn a forward predictive model for the Crusher UGV (Unmanned Ground Vehicle) [1]. RHex-type robots may be viewed as a special case of skid-steered vehicles, as their rotating C-legs are always pointed in the same direction.
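To illustrate the kind of gait-parameter-to-velocity mapping that abstract describes, here is a toy, untrained sketch of a one-hidden-layer feed-forward network. The choice of three gait parameters, the layer sizes, and all variable names are assumptions made for illustration only; they are not taken from the paper.

    import numpy as np

    def gait_to_velocity(gait_params, weights):
        """One-hidden-layer feed-forward pass mapping gait parameters to body
        velocities (forward, lateral, angular). Purely illustrative."""
        W1, b1, W2, b2 = weights
        hidden = np.tanh(gait_params @ W1 + b1)   # nonlinear hidden layer
        return hidden @ W2 + b2                   # linear output layer

    # Hypothetical inputs: [stride frequency (Hz), sweep angle (rad), phase offset (rad)].
    rng = np.random.default_rng(0)
    weights = (rng.normal(size=(3, 16)), np.zeros(16),
               rng.normal(size=(16, 3)), np.zeros(3))

    # With random weights the output is meaningless; in practice the weights would
    # be fit to logged (gait parameters, measured velocity) pairs from the robot.
    print(gait_to_velocity(np.array([2.0, 0.8, 0.1]), weights))

Training such a network amounts to regressing measured body velocities on commanded gait parameters, which is the data-driven step the paper describes.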
  • AI, Robots, and Swarms: Issues, Questions, and Recommended Studies
    AI, Robots, and Swarms: Issues, Questions, and Recommended Studies. Andrew Ilachinski, January 2017.
    Approved for Public Release; Distribution Unlimited. This document contains the best opinion of CNA at the time of issue. It does not necessarily represent the opinion of the sponsor. Specific authority: N00014-11-D-0323. Copies of this document can be obtained through the Defense Technical Information Center at www.dtic.mil or contact CNA Document Control and Distribution Section at 703-824-2123.
    Photography Credits: http://www.darpa.mil/DDM_Gallery/Small_Gremlins_Web.jpg; http://4810-presscdn-0-38.pagely.netdna-cdn.com/wp-content/uploads/2015/01/Robotics.jpg; http://i.kinja-img.com/gawker-edia/image/upload/18kxb5jw3e01ujpg.jpg
    Approved by: Dr. David A. Broyles, Special Activities and Innovation, Operations Evaluation Group, January 2017. Copyright © 2017 CNA.
    Abstract: The military is on the cusp of a major technological revolution, in which warfare is conducted by unmanned and increasingly autonomous weapon systems. However, unlike the last "sea change," during the Cold War, when advanced technologies were developed primarily by the Department of Defense (DoD), the key technology enablers today are being developed mostly in the commercial world. This study looks at the state-of-the-art of AI, machine-learning, and robot technologies, and their potential future military implications for autonomous (and semi-autonomous) weapon systems. While no one can predict how AI will evolve or predict its impact on the development of military autonomous systems, it is possible to anticipate many of the conceptual, technical, and operational challenges that DoD will face as it increasingly turns to AI-based technologies.
  • Assessment of Couple Unmanned Aerial Vehicle-Portable Magnetic Resonance Imaging Sensor for Precision Agriculture of Crops
    International Journal of Scientific & Engineering Research, Volume 11, Issue 1, January 2020, ISSN 2229-5518.
    Assessment of Couple Unmanned Aerial Vehicle-Portable Magnetic Resonance Imaging Sensor for Precision Agriculture of Crops. Balogun Wasiu A., Keshinro K. K., Momoh-Jimoh E. Salami, Adegoke A. S., Adesanya G. E., Obadare Bolatito J.
    Abstract: This paper proposes the improved benefits of integrating an Unmanned Aerial Vehicle (UAV) and a Portable Magnetic Resonance Imaging (PMRI) system for Precision Agriculture (PA) capable of carrying out a dynamic internal quality assessment of crop developmental stages during the pre-harvest period. A comprehensive review was carried out on the application of UAV technology to Precision Agriculture and on evolving research work on developing a portable MRI system. The disadvantages of the traditional way of obtaining Precision Agriculture data are shown. A proposed model architecture of the coupled UAV and PMRI system was developed and the advantages of the proposed system are presented. It is predicted that the proposed method can be useful for Precision Agriculture and will reduce the cost of pre-harvest processing of farm produce based on a non-destructive technique.
    Index Terms: Unmanned Aerial Vehicle, Portable Magnetic Resonance Imaging, Precision Agriculture, Non-destructive, Pre-harvest, Crops, Imaging, Permanent magnet
    1 INTRODUCTION: During agricultural processing, quality control ensures that food products meet certain quality and safety standards in a developed country [FAO]. Low profits in fruit farming are subjected by pre- and post-harvest factors. Inspection of fresh fruit bunches (FFB) for harvesting is a rigorous assignment which can be overlooked or applied physical counting that are usually not accurate or precise (Sham-
  • Algorithmic Fog of War: When Lack of Transparency Violates the Law of Armed Conflict
    Journal of Future Robot Life -1 (2021) 1–24, DOI 10.3233/FRL-200019, IOS Press.
    Algorithmic fog of war: When lack of transparency violates the law of armed conflict. Jonathan Kwik and Tom Van Engers, Faculty of Law, University of Amsterdam, Amsterdam 1001NA, Netherlands.
    Abstract: Under international law, weapon capabilities and their use are regulated by legal requirements set by International Humanitarian Law (IHL). Currently, there are strong military incentives to equip capabilities with increasingly advanced artificial intelligence (AI), which include opaque (less transparent) models. As opaque models sacrifice transparency for performance, it is necessary to examine whether their use remains in conformity with IHL obligations. First, we demonstrate that the incentives for automation drive AI toward complex task areas and dynamic and unstructured environments, which in turn necessitates resort to more opaque solutions. We subsequently discuss the ramifications of opaque models for foreseeability and explainability. Then, we analyse their impact on IHL requirements from a development, pre-deployment and post-deployment perspective. We find that while IHL does not regulate opaque AI directly, the lack of foreseeability and explainability frustrates the fulfilment of key IHL requirements to the extent that the use of fully opaque AI could violate international law. States are urged to implement interpretability during development and seriously consider the challenging complication of determining the appropriate balance between transparency and performance in their capabilities.
    Keywords: Transparency, interpretability, foreseeability, weapon,
  • Variety of Research Work Has Already Been Done to Develop Effective Navigation Systems for Unmanned Ground Vehicles
    Design of a Smart Unmanned Ground Vehicle for Hazardous Environments. Saurav Chakraborty (Tyfone Communications Development (I) Pvt. Ltd., ITPL, White Field, Bangalore 560092, INDIA) and Subhadip Basu (Computer Science & Engineering Dept., Jadavpur University, Kolkata – 700032, INDIA).
    Abstract: A smart Unmanned Ground Vehicle (UGV) is designed and developed for some application-specific missions to operate predominantly in hazardous environments. In our work, we have developed a small and lightweight vehicle to operate in general cross-country terrains in or without daylight. The UGV can send visual feedback to the operator at a remote location. Onboard infrared sensors can detect the obstacles around the UGV and send signals to the operator.
    Key Words: Unmanned Ground Vehicle, Navigation Control, Onboard Sensor.
    1. Introduction: Robotics is an important field of interest in the modern age of automation. Unlike a human being, a computer-controlled robot can work with speed and accuracy without feeling exhausted. A robot can also perform preassigned tasks in a hazardous environment, reducing the risks and ...
    ... needs some organizing principle, based on the characteristics of each system, such as: the purpose of the development effort (often the performance of some application-specific mission); the specific reasons for choosing a UGV solution for the application (e.g., hazardous environment, strength or endurance requirements, size limitation, etc.); the technological challenges, in terms of functionality, performance, or cost, posed by the application; the system's intended operating area (e.g., indoor environments, anywhere indoors, outdoors on roads, general cross-country terrain, the deep seafloor, etc.); the vehicle's mode of locomotion (e.g., wheels, tracks, or legs); and how the vehicle's path is determined (i.e., the control and navigation techniques employed).
  • Increasing the Trafficability of Unmanned Ground Vehicles Through Intelligent Morphing
    Increasing the Trafficability of Unmanned Ground Vehicles through Intelligent Morphing. Siddharth Odedra (1), Dr Stephen Prior (1), Dr Mehmet Karamanoglu (1), Dr Siu-Tsen Shen (2). (1) Product Design and Engineering, Middlesex University, Trent Park Campus, Bramley Road, London N14 4YZ, U.K., [email protected], [email protected], [email protected]. (2) Department of Multimedia Design, National Formosa University, 64 Wen-Hua Road, Hu-Wei 63208, YunLin County, Taiwan R.O.C., [email protected].
    Abstract: Unmanned systems are used where humans are either unable or unwilling to operate, but only if they can perform as good as, if not better than, us. Systems must become more autonomous so that they can operate without assistance, relieving the burden of controlling and monitoring them, and to do that they need to be more intelligent and highly capable. In terms of vehicles, their primary objective is to be able to travel from A to B, where the system's success or failure is determined by its mobility, for which terrain is the key element. This paper explores the concept of creating a more autonomous system by making it more perceptive about the terrain and, with reconfigurable elements, making it more capable of traversing it.
    2. UNMANNED GROUND VEHICLES: Unmanned Ground Vehicles can be defined as mechanised systems that operate on ground surfaces and serve as an extension of human capabilities in unreachable ground or unsafe areas. They are used for many things such as cleaning, transportation, security, exploration, rescue and bomb disposal. UGVs come in many different configurations, usually defined by the task at hand and the environment they must operate in, and are either remotely controlled by the user, pre-programmed to carry out specific ...
  • Sensors and Measurements for Unmanned Systems: An Overview
    Sensors (Review). Sensors and Measurements for Unmanned Systems: An Overview. Eulalia Balestrieri (1,*), Pasquale Daponte (1), Luca De Vito (1) and Francesco Lamonaca (2). (1) Department of Engineering, University of Sannio, 82100 Benevento, Italy; [email protected] (P.D.); [email protected] (L.D.V.). (2) Department of Computer Science, Modeling, Electronics and Systems (DIMES), University of Calabria, 87036 Rende, CS, Italy; [email protected]. * Correspondence: [email protected]
    Abstract: The advance of technology has enabled the development of unmanned systems/vehicles used in the air, on the ground or on/in the water. The application range for these systems is continuously increasing, and unmanned platforms continue to be the subject of numerous studies and research contributions. This paper deals with the role of sensors and measurements in ensuring that unmanned systems work properly, meet the requirements of the target application, provide and increase their navigation capabilities, and suitably monitor and gain information on several physical quantities in the environment around them. Unmanned system types and the critical environmental factors affecting their performance are discussed. The measurements that these kinds of vehicles can carry out are presented and discussed, while also describing the most frequently used on-board sensor technologies, as well as their advantages and limitations. The paper provides some examples of sensor specifications related to some current applications, as well as describing the recent research contributions in the field.
    Keywords: unmanned systems; UAV; UGV; USV; UUV; sensors; payload; challenges
    Citation: Balestrieri, E.; Daponte, P.; De Vito, L.; Lamonaca, F. Sensors and Measurements for Unmanned Systems: An Overview.
  • Design of an Open Source-Based Control Platform for an Underwater Remotely Operated Vehicle
    Design of an open source-based control platform for an underwater remotely operated vehicle. Luis M. Aristizábal, Santiago Rúa, Carlos E. Gaviria, Sandra P. Osorio, Carlos A. Zuluaga, Norha L. Posada & Rafael E. Vásquez, Escuela de Ingenierías, Universidad Pontificia Bolivariana, Medellín, Colombia.
    Received: March 25th, 2015. Received in revised form: August 31st, 2015. Accepted: September 9th, 2015.
    Abstract: This paper reports on the design of an open source-based control platform for the underwater remotely operated vehicle (ROV) Visor3. The vehicle's original closed source-based control platform is first described. Due to the limitations of the previous infrastructure, modularity and flexibility are identified as the main guidelines for the proposed design. This new design includes hardware, firmware, software, and control architectures. Open-source hardware and software platforms are used for the development of the new system's architecture, with support from the literature and the extensive experience acquired with the development of robotic exploration systems. This modular approach results in several frameworks that facilitate the functional expansion of the whole solution, the simplification of fault diagnosis and repair processes, and the reduction of development time, to mention a few.
    Keywords: open-source hardware; ROV control platforms; underwater exploration.
    Spanish title and abstract (translated): Design of an open source-based control platform for a remotely operated underwater vehicle. This article presents the design of an open source-based control platform for the underwater remotely operated vehicle (ROV) Visor3.
  • Design of a Reconnaissance and Surveillance Robot
    DESIGN OF A RECONNAISSANCE AND SURVEILLANCE ROBOT. A thesis submitted to the Graduate School of Natural and Applied Sciences of Middle East Technical University by Erman Çağan Özdemir, in partial fulfillment of the requirements for the degree of Master of Science in Mechanical Engineering, August 2013.
    Approval of the thesis, submitted by ERMAN ÇAĞAN ÖZDEMİR in partial fulfillment of the requirements for the degree of Master of Science in Mechanical Engineering Department, Middle East Technical University:
    • Prof. Dr. Canan Özgen, Dean, Graduate School of Natural and Applied Sciences
    • Prof. Dr. Süha Oral, Head of Department, Mechanical Engineering
    • Prof. Dr. Eres Söylemez, Supervisor, Mechanical Engineering Dept., METU
    Examining Committee Members:
    • Prof. Dr. Kemal İder, Mechanical Engineering Dept., METU
    • Prof. Dr. Eres Söylemez, Mechanical Engineering Dept., METU
    • Prof. Dr. Kemal Özgören, Mechanical Engineering Dept., METU
    • Asst. Prof. Dr. Buğra Koku, Mechanical Engineering Dept., METU
    • Alper Erdener, M.Sc., Project Manager, Unmanned Systems, ASELSAN
    Date: 29/08/2013
    I hereby declare that all information in this document has been obtained and presented in accordance with academic rules and ethical conduct. I also declare that, as required by these rules and conduct, I have fully cited and referenced all material and results that are not original to this work. Name, Last name: Erman Çağan Özdemir.
    ABSTRACT: Özdemir, Erman Çağan, M.Sc., Department of Mechanical Engineering. Supervisor: Prof. Dr. Eres Söylemez. August 2013, 72 pages. The scope of this thesis is to design a man-portable robot which is capable of carrying out reconnaissance and surveillance missions.
  • United States Patent No. US 9,493,235 B2 (Zhou et al.): Amphibious Vertical Takeoff and Landing Unmanned Device
    (12) United States Patent, (10) Patent No.: US 9,493,235 B2, Zhou et al. (45) Date of Patent: Nov. 15, 2016.
    (54) Title: AMPHIBIOUS VERTICAL TAKEOFF AND LANDING UNMANNED DEVICE
    (71) Applicants / (72) Inventors: Dylan T X Zhou, Tiburon, CA (US); Andrew H B Zhou, Tiburon, CA (US); Tiger T G Zhou, Tiburon, CA (US).
    (*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.
    (21) Appl. No.: 14/940,379. (22) Filed: Nov. 13, 2015. (63) Continuation-in-part of application No. 13/875,311, filed on May 2, 2013, now abandoned, and a (Continued).
    (58) Field of Classification Search: CPC B64C 29/0033; B64C 35/00; B64C 35/008; B64C 2201/088; B60F 3/0061; B60F 5/02. See application file for complete search history.
    (56) References Cited, U.S. Patent Documents: 2,989,269 A (6/1961, Le Bel); 3,029,042 A (4/1962, Martin); (Continued).
    Primary Examiner: Joseph W. Sanderson. (74) Attorney, Agent, or Firm: Georgiy L. Khayet.
    Abstract: An amphibious vertical takeoff and landing (VTOL) unmanned device includes a modular and expandable waterproof body. An outer body shell, at least one wing, and a door are connected to the modular and expandable waterproof body. A propulsion system of the amphibious VTOL ...
  • U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress
    U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress. Updated November 20, 2018. Congressional Research Service, https://crsreports.congress.gov, R45392.
    Summary: The nexus of robotics and autonomous systems (RAS) and artificial intelligence (AI) has the potential to change the nature of warfare. RAS offers the possibility of a wide range of platforms, not just weapon systems, that can perform "dull, dangerous, and dirty" tasks, potentially reducing the risks to soldiers and Marines and possibly resulting in a generation of less expensive ground systems. Other nations, notably peer competitors Russia and China, are aggressively pursuing RAS and AI for a variety of military uses, raising considerations about the U.S. military's response, to include lethal autonomous weapons systems (LAWS), that could be used against U.S. forces.
    The adoption of RAS and AI by U.S. ground forces carries with it a number of possible implications, including potentially improved performance and reduced risk to soldiers and Marines; potential new force designs; better institutional support to combat forces; potential new operational concepts; and possible new models for recruiting and retaining soldiers and Marines. The Army and Marines have developed and are executing RAS and AI strategies that articulate near-, mid-, and long-term priorities. Both services have a number of RAS and AI efforts underway and are cooperating in a number of areas.
    A fully manned, capable, and well-trained workforce is a key component of military readiness. The integration of RAS and AI into military units raises a number of personnel-related issues that may be of interest to Congress, including unit manning changes, recruiting and retention of those with advanced technical skills, training, and career paths.
  • A Biologically Inspired Robot for Assistance in Urban Search and Rescue
    A Biologically Inspired Robot for Assistance in Urban Search and Rescue, by Alexander Hunt. Submitted in partial fulfillment of the requirements for the degree of Master of Science in Mechanical Engineering. Advisor: Dr. Roger Quinn, Department of Mechanical and Aerospace Engineering, Case Western Reserve University, May 2010.
    Table of Contents: List of Tables; List of Figures; Acknowledgements; Abstract; Chapter 1: Introduction; 1.1 Search and Rescue; 1.2 Robots in Search and Rescue ...