Autonomous Unmanned Ground Vehicle and Indirect Driving

Total Pages: 16

File Type: PDF, Size: 1020 KB

ABSTRACT

This paper describes the design and challenges faced in the development of an Autonomous Unmanned Ground Vehicle (AUGV) demonstrator and an Indirect Driving Testbed (IDT). The AUGV has been modified from an M113 Armoured Personnel Carrier and has capabilities such as remote control, autonomous waypoint seeking, obstacle avoidance, road following and vehicle following. The IDT enables the driver of an armoured vehicle to drive by looking at live video feed from cameras mounted on the exterior of the vehicle. These two technologies are complementary, and integration of the two technology enablers could pave the way for the deployment of semi-autonomous unmanned (and even manned) vehicles in battles.

Ng Keok Boon, Tey Hwee Choo, Chan Chun Wah

1. INTRODUCTION

In the last decade, advances in computing and networking technologies have brought battlefield robotics another step closer to reality. Robotics will play an increasingly important role in the future battlefield as the technology matures and allows force projection and multiplication, with minimal exposure of our soldiers to hazardous situations and environments. Besides the maturation of robotics technologies, the battlefield will also evolve into one that will see more unmanned operations, arising from the proliferation of wireless ad hoc networks and non-line-of-sight weapons. With such technologies as robotics and data networks comes a rapid expansion of the boundaries of situational awareness beyond what conventional sensors can give. The future fighting vehicle will have to be re-invented to be more effective in this new environment. The crew will no longer need to be exposed to the dangers of driving "hatch-up", yet will still have the same or better situational awareness compared to operating hatch-up. Such a concept, where the crew can drive and operate the fighting vehicle under the protection of the hull, is termed Indirect Driving. By integrating Indirect Driving technology with robotic technologies, we can create a new concept of manned and unmanned platforms operating in synergy with one another.

One of the biggest challenges for autonomous capability is machine perception. Machine perception entails the sensing and understanding (sensing + understanding = perception) of the environment so that the robotic vehicle can make informed and intelligent decisions. Machine perception is difficult because we live in a 3D world, and the visual sensors need to collect and process a vast amount of information in real time (or near real time). Modern computer technology has only just begun to make such sensors portable and cheap enough for field application.

This paper is organised into seven sections. Section 2 describes the design of the AUGV. Section 3 focuses on the modes under which the AUGV has been tested. Section 4 describes the Indirect Vision Operations (IVO) and Indirect Driving Demonstrator design. Section 5 lists the lessons learnt from both developments and discusses the synergies and possible future concepts that may arise from the integration of the two technologies. Section 6 then reports the synergies between the two technologies and how they can be exploited for the concept of Teaming. Section 7 concludes this paper.

2. AUGV DESCRIPTION

The AUGV was designed to develop autonomous technologies using the M113 as a prototype platform. From this project, we have developed significant local know-how in technologies such as machine perception, autonomous navigation and supervisory control, vehicle control and actuation, and teleoperation. Under the project, various enabling technologies were explored and local capabilities in these technologies were built up. Such local capabilities would benefit us in the future fielding of unmanned ground vehicles.

Vehicle Platform

The AUGV uses an M113A2 tracked Armoured Personnel Carrier (APC) as the base platform. Besides various sensors, it had to be retrofitted with actuators to control its steering levers, gear change lever and accelerator pedal, along with other actuators for safety. Due to the tracked locomotion and large mass, the control algorithm development was challenging and required many trials in the field. Figure 1 shows the M113 integrated with the sensors and Figure 2 is a CAD drawing of the platform.

[Figure 1. M113 integrated with sensors]
[Figure 2. CAD drawing of robotic M113]

System Architecture

The overall system architecture of the AUGV is shown in Figure 3. The overall system consists of five subsystems: Visual Guidance System (VGS), Vehicle Piloting System (VPS), Vehicle Control System (VCS), Teleoperation Control System (TCS) and Robust Communications System (RCS). As seen, major subsystems have names ending in "system". At the next level, the functions that make up each subsystem have names ending in "module".

[Figure 3. System Architecture of AUGV]

Visual Guidance System

We integrated Commercial-off-the-Shelf (COTS) hardware with custom algorithms to gain machine perception. The Unstructured Terrain Vision Module (UTVM) uses a COTS stereo vision system (a passive sensor) to provide 3D range data, whilst the Laser Scanner Module (LSM) provides "2 1/2 D" range data (the laser provides 3D range but only in a plane). These two modules are mainly "vision" sensors, as they provide terrain elevation only. The Road Segmentation Module (RSM) and Night Driving Module (NDM) give the vehicle some perception capabilities; through these two modules, the vehicle is able to distinguish a road in day and night. The RSM makes use of colour information to differentiate between roads and vegetation, and passes the road centre information to the Vehicle Navigation Module (VNM), while the NDM uses an infrared camera to differentiate between the warmer roads and the cooler vegetation. The Vehicle Following Module (VFM) makes use of the laser scanners to follow a known vehicle: it processes the laser scans of the target vehicle and allows the AUGV to follow it. The information from these sensors is fused by the Sensor Fusion Module (SFM) in different ways, depending on the operating mode, to provide a combined map for the VNM. Figure 4 shows the sensors that make up the VGS: a vision sensor suite consisting of stereo vision, laser scanner, IR camera and CCD camera, together with a reverse camera, GPS receiver, Inertial Measurement Unit, safety sensors and the vehicle actuation.

[Figure 4. VGS sensors]
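The paper names the modules but gives no implementation detail. As a rough illustration of the kind of processing the Visual Guidance System describes, the sketch below fuses stereo-derived 3D points (as from the UTVM) and a planar laser scan (as from the LSM) into a single obstacle grid of the sort a sensor fusion step could pass to a navigation module. All function names, grid parameters and thresholds are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only: a minimal sensor-fusion step in the spirit of the
# SFM described above, combining stereo 3D points (UTVM) and a planar laser
# scan (LSM) into one 2D obstacle grid for a navigation module (VNM).
# All names, parameters and thresholds are assumptions, not from the paper.
import numpy as np

GRID_SIZE = 100        # 100 x 100 cells ahead of the vehicle (assumed)
CELL_M = 0.5           # each cell covers 0.5 m x 0.5 m (assumed)
OBSTACLE_STEP_M = 0.4  # elevation step treated as an obstacle (assumed)

def stereo_to_elevation(points_xyz: np.ndarray) -> np.ndarray:
    """Drop 3D stereo points (N x 3, vehicle frame, metres) into a 2D
    elevation grid, keeping the highest return seen in each cell."""
    elev = np.full((GRID_SIZE, GRID_SIZE), np.nan)
    for x, y, z in points_xyz:
        i, j = int(x / CELL_M), int(y / CELL_M) + GRID_SIZE // 2
        if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE:
            elev[i, j] = z if np.isnan(elev[i, j]) else max(elev[i, j], z)
    return elev

def laser_to_hits(ranges: np.ndarray, angles: np.ndarray) -> np.ndarray:
    """Convert a planar ('2 1/2 D') laser scan into occupied-cell flags."""
    hits = np.zeros((GRID_SIZE, GRID_SIZE), dtype=bool)
    for r, a in zip(ranges, angles):
        x, y = r * np.cos(a), r * np.sin(a)
        i, j = int(x / CELL_M), int(y / CELL_M) + GRID_SIZE // 2
        if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE:
            hits[i, j] = True
    return hits

def fuse_to_grid(elev: np.ndarray, laser_hits: np.ndarray) -> np.ndarray:
    """A cell is an obstacle if the laser saw a return there or the stereo
    elevation rises sharply above the estimated ground level."""
    ground = np.nanmedian(elev)
    tall = np.nan_to_num(elev - ground, nan=0.0) > OBSTACLE_STEP_M
    return laser_hits | tall

if __name__ == "__main__":
    # Toy data standing in for one stereo frame and one laser sweep.
    pts = np.array([[5.0, 0.0, 0.1], [10.0, 2.0, 1.2], [12.0, -3.0, 0.0]])
    scan_r = np.array([10.2, 15.0])
    scan_a = np.radians([11.0, -14.0])
    grid = fuse_to_grid(stereo_to_elevation(pts), laser_to_hits(scan_r, scan_a))
    print("obstacle cells:", int(grid.sum()))
```

In the actual AUGV the fusion is mode-dependent and would also incorporate the road-centre estimate from the RSM/NDM and the target track from the VFM; this sketch only covers the elevation-plus-laser case.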
Recommended publications
  • AI, Robots, and Swarms: Issues, Questions, and Recommended Studies
    AI, Robots, and Swarms Issues, Questions, and Recommended Studies Andrew Ilachinski January 2017 Approved for Public Release; Distribution Unlimited. This document contains the best opinion of CNA at the time of issue. It does not necessarily represent the opinion of the sponsor. Distribution Approved for Public Release; Distribution Unlimited. Specific authority: N00014-11-D-0323. Copies of this document can be obtained through the Defense Technical Information Center at www.dtic.mil or contact CNA Document Control and Distribution Section at 703-824-2123. Photography Credits: http://www.darpa.mil/DDM_Gallery/Small_Gremlins_Web.jpg; http://4810-presscdn-0-38.pagely.netdna-cdn.com/wp-content/uploads/2015/01/ Robotics.jpg; http://i.kinja-img.com/gawker-edia/image/upload/18kxb5jw3e01ujpg.jpg Approved by: January 2017 Dr. David A. Broyles Special Activities and Innovation Operations Evaluation Group Copyright © 2017 CNA Abstract The military is on the cusp of a major technological revolution, in which warfare is conducted by unmanned and increasingly autonomous weapon systems. However, unlike the last “sea change,” during the Cold War, when advanced technologies were developed primarily by the Department of Defense (DoD), the key technology enablers today are being developed mostly in the commercial world. This study looks at the state-of-the-art of AI, machine-learning, and robot technologies, and their potential future military implications for autonomous (and semi-autonomous) weapon systems. While no one can predict how AI will evolve or predict its impact on the development of military autonomous systems, it is possible to anticipate many of the conceptual, technical, and operational challenges that DoD will face as it increasingly turns to AI-based technologies.
  • Review of Advanced Medical Telerobots
    applied sciences Review Review of Advanced Medical Telerobots Sarmad Mehrdad 1,†, Fei Liu 2,† , Minh Tu Pham 3 , Arnaud Lelevé 3,* and S. Farokh Atashzar 1,4,5 1 Department of Electrical and Computer Engineering, New York University (NYU), Brooklyn, NY 11201, USA; [email protected] (S.M.); [email protected] (S.F.A.) 2 Advanced Robotics and Controls Lab, University of San Diego, San Diego, CA 92110, USA; [email protected] 3 Ampère, INSA Lyon, CNRS (UMR5005), F69621 Villeurbanne, France; [email protected] 4 Department of Mechanical and Aerospace Engineering, New York University (NYU), Brooklyn, NY 11201, USA 5 NYU WIRELESS, Brooklyn, NY 11201, USA * Correspondence: [email protected]; Tel.: +33-0472-436035 † Mehrdad and Liu contributed equally to this work and share the first authorship. Abstract: The advent of telerobotic systems has revolutionized various aspects of the industry and human life. This technology is designed to augment human sensorimotor capabilities to extend them beyond natural competence. Classic examples are space and underwater applications when distance and access are the two major physical barriers to be combated with this technology. In modern examples, telerobotic systems have been used in several clinical applications, including teleoperated surgery and telerehabilitation. In this regard, there has been a significant amount of research and development due to the major benefits in terms of medical outcomes. Recently telerobotic systems are combined with advanced artificial intelligence modules to better share the agency with the operator and open new doors of medical automation. In this review paper, we have provided a comprehensive analysis of the literature considering various topologies of telerobotic systems in the medical domain while shedding light on different levels of autonomy for this technology, starting from direct control, going up to command-tracking autonomous telerobots.
  • Panospheric Video for Robotic Telexploration
    Panospheric Video for Robotic Telexploration. John R. Murphy, CMU-RI-TR-98-10. Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Robotics, The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213, May 1998. © 1998 by John R. Murphy. All rights reserved. This research was supported in part by NASA grant NAGW-117.5. The views and conclusions contained in this document are those of the author and should not be interpreted as representing the official policies, either expressed or implied, of NASA or the U.S. Government. Abstract: Teleoperation using cameras and monitors fails to achieve the rich and natural perceptions available during direct hands-on operation. Optimally, a remote telepresence reproduces visual sensations that enable equivalent understanding and interaction as direct operation. Through immersive video acquisition and display, the situational awareness of a remote operator is brought closer to that of direct operation. Early teleoperation research considered the value of wide fields of view. However, utilization of wide fields of view meant sacrificing resolution and presenting distorted imagery to the viewer. With recent advances in technology, these problems are no longer barriers to hyper-wide field of view video. Acquisition and display of video conveying the entire surroundings of a teleoperated mobile robot requires innovation of hardware and software.
  • Algorithmic Fog of War: When Lack of Transparency Violates the Law of Armed Conflict
    Journal of Future Robot Life -1 (2021) 1–24, DOI 10.3233/FRL-200019, IOS Press. Algorithmic fog of war: When lack of transparency violates the law of armed conflict. Jonathan Kwik and Tom Van Engers, Faculty of Law, University of Amsterdam, Amsterdam 1001NA, Netherlands. Abstract: Under international law, weapon capabilities and their use are regulated by legal requirements set by International Humanitarian Law (IHL). Currently, there are strong military incentives to equip capabilities with increasingly advanced artificial intelligence (AI), which include opaque (less transparent) models. As opaque models sacrifice transparency for performance, it is necessary to examine whether their use remains in conformity with IHL obligations. First, we demonstrate that the incentives for automation drive AI toward complex task areas and dynamic and unstructured environments, which in turn necessitates resort to more opaque solutions. We subsequently discuss the ramifications of opaque models for foreseeability and explainability. Then, we analyse their impact on IHL requirements from a development, pre-deployment and post-deployment perspective. We find that while IHL does not regulate opaque AI directly, the lack of foreseeability and explainability frustrates the fulfilment of key IHL requirements to the extent that the use of fully opaque AI could violate international law. States are urged to implement interpretability during development and seriously consider the challenging complication of determining the appropriate balance between transparency and performance in their capabilities. Keywords: Transparency, interpretability, foreseeability, weapon,
  • Design of a Smart Unmanned Ground Vehicle for Hazardous Environments
    Design of a Smart Unmanned Ground Vehicle for Hazardous Environments. Saurav Chakraborty, Tyfone Communications Development (I) Pvt. Ltd., ITPL, White Field, Bangalore 560092, INDIA; Subhadip Basu, Computer Science & Engineering Dept., Jadavpur University, Kolkata – 700032, INDIA. Abstract. A smart Unmanned Ground Vehicle (UGV) is designed and developed for some application specific missions to operate predominantly in hazardous environments. In our work, we have developed a small and lightweight vehicle to operate in general cross-country terrains in or without daylight. The UGV can send visual feedbacks to the operator at a remote location. Onboard infrared sensors can detect the obstacles around the UGV and sends signals to the operator. Key Words. Unmanned Ground Vehicle, Navigation Control, Onboard Sensor. 1. Introduction. Robotics is an important field of interest in modern age of automation. Unlike human being a computer controlled robot can work with speed and accuracy without feeling exhausted. A robot can also perform preassigned tasks in a hazardous environment, reducing the risks and […] needs some organizing principle, based on the characteristics of each system such as: the purpose of the development effort (often the performance of some application-specific mission); the specific reasons for choosing a UGV solution for the application (e.g., hazardous environment, strength or endurance requirements, size limitation etc.); the technological challenges, in terms of functionality, performance, or cost, posed by the application; the system's intended operating area (e.g., indoor environments, anywhere indoors, outdoors on roads, general cross-country terrain, the deep seafloor, etc.); the vehicle's mode of locomotion (e.g., wheels, tracks, or legs); and how the vehicle's path is determined (i.e., control and navigation techniques employed).
  • Increasing the Trafficability of Unmanned Ground Vehicles Through Intelligent Morphing
    Increasing the Trafficability of Unmanned Ground Vehicles through Intelligent Morphing. Siddharth Odedra 1, Dr Stephen Prior 1, Dr Mehmet Karamanoglu 1, Dr Siu-Tsen Shen 2. 1 Product Design and Engineering, Middlesex University, Trent Park Campus, Bramley Road, London N14 4YZ, U.K. [email protected] [email protected] [email protected] 2 Department of Multimedia Design, National Formosa University, 64 Wen-Hua Road, Hu-Wei 63208, YunLin County, Taiwan R.O.C. [email protected] Abstract: Unmanned systems are used where humans are either unable or unwilling to operate, but only if they can perform as good as, if not better than us. Systems must become more autonomous so that they can operate without assistance, relieving the burden of controlling and monitoring them, and to do that they need to be more intelligent and highly capable. In terms of ground vehicles, their primary objective is to be able to travel from A to B where the systems success or failure is determined by its mobility, for which terrain is the key element. This paper explores the concept of creating a more autonomous system by making it more perceptive about the terrain, and with reconfigurable elements, making it more capable of traversing it. 2. UNMANNED GROUND VEHICLES. Unmanned Ground Vehicles can be defined as mechanised systems that operate on ground surfaces and serve as an extension of human capabilities in unreachable or unsafe areas. They are used for many things such as cleaning, transportation, security, exploration, rescue and bomb disposal. UGV’s come in many different configurations usually defined by the task at hand and the environment they must operate in, and are either remotely controlled by the user, pre-programmed to carry out specific …
  • Sensors and Measurements for Unmanned Systems: an Overview
    sensors Review: Sensors and Measurements for Unmanned Systems: An Overview. Eulalia Balestrieri 1,*, Pasquale Daponte 1, Luca De Vito 1 and Francesco Lamonaca 2. 1 Department of Engineering, University of Sannio, 82100 Benevento, Italy; [email protected] (P.D.); [email protected] (L.D.V.) 2 Department of Computer Science, Modeling, Electronics and Systems (DIMES), University of Calabria, 87036 Rende, CS, Italy; [email protected] * Correspondence: [email protected] Abstract: The advance of technology has enabled the development of unmanned systems/vehicles used in the air, on the ground or on/in the water. The application range for these systems is continuously increasing, and unmanned platforms continue to be the subject of numerous studies and research contributions. This paper deals with the role of sensors and measurements in ensuring that unmanned systems work properly, meet the requirements of the target application, provide and increase their navigation capabilities, and suitably monitor and gain information on several physical quantities in the environment around them. Unmanned system types and the critical environmental factors affecting their performance are discussed. The measurements that these kinds of vehicles can carry out are presented and discussed, while also describing the most frequently used on-board sensor technologies, as well as their advantages and limitations. The paper provides some examples of sensor specifications related to some current applications, as well as describing the recent research contributions in the field. Keywords: unmanned systems; UAV; UGV; USV; UUV; sensors; payload; challenges. Citation: Balestrieri, E.; Daponte, P.; De Vito, L.; Lamonaca, F. Sensors and Measurements for Unmanned Systems: An Overview.
  • Teleoperation of a 7 DOF Humanoid Robot Arm Using Human Arm Accelerations and EMG Signals
    Teleoperation of a 7 DOF Humanoid Robot Arm Using Human Arm Accelerations and EMG Signals. J. Leitner, M. Luciw, A. Förster, J. Schmidhuber. Dalle Molle Institute for Artificial Intelligence (IDSIA) & Scuola universitaria professionale della Svizzera Italiana (SUPSI) & Università della Svizzera italiana (USI), Switzerland. e-mail: [email protected] Abstract: We present our work on tele-operating a complex humanoid robot with the help of bio-signals collected from the operator. The frameworks (for robot vision, collision avoidance and machine learning), developed in our lab, allow for a safe interaction with the environment, when combined. This even works with noisy control signals, such as, the operator’s hand acceleration and their electromyography (EMG) signals. These bio-signals are used to execute equivalent actions (such as, reaching and grasping of objects) on the 7 DOF arm. 1 Introduction: More and more robots are sent to distant or dangerous environments to perform a variety of tasks. 2 Background and Related Work: The presented research combines a variety of subsystems together to allow the safe and robust teleoperation of a complex humanoid robot. For the experiments herein, we are using computer vision, together with accelerometers and EMG signals collected on the operator’s arm to remotely control the 7 degree-of-freedom (DOF) robot arm – and the 5 fingered end-effector – during reaching and grasping tasks. 2.1 Robotic Teleoperation: Telerobotics usually refers to the remote control of robotic systems over a (wireless) link. First use of telerobotics dates back to the first robotic space probes used during the 20th century. Remote manipulators are nowadays …
  • Design of a Reconnaissance and Surveillance Robot
    DESIGN OF A RECONNAISSANCE AND SURVEILLANCE ROBOT A THESIS SUBMITTED TO THE GRADUATE SCHOOL OF NATURAL AND APPLIED SCIENCES OF MIDDLE EAST TECHNICAL UNIVERSITY BY ERMAN ÇAĞAN ÖZDEMİR IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE IN MECHANICAL ENGINEERING AUGUST 2013 Approval of the thesis: DESIGN OF A RECONNAISSANCE AND SURVEILLANCE ROBOT submitted by ERMAN ÇAĞAN ÖZDEMİR in partial fulfillment of the requirements for the degree of Master of Science in Mechanical Engineering Department, Middle East Technical University by, Prof. Dr. Canan Özgen _____________________ Dean, Graduate School of Natural and Applied Sciences Prof. Dr. Süha Oral _____________________ Head of Department, Mechanical Engineering Prof. Dr. Eres Söylemez _____________________ Supervisor, Mechanical Engineering Dept., METU Examining Committee Members: Prof. Dr. Kemal İder _____________________ Mechanical Engineering Dept., METU Prof. Dr. Eres Söylemez _____________________ Mechanical Engineering Dept., METU Prof. Dr. Kemal Özgören _____________________ Mechanical Engineering Dept., METU Ass. Prof. Dr. Buğra Koku _____________________ Mechanical Engineering Dept., METU Alper Erdener, M.Sc. _____________________ Project Manager, Unmanned Systems, ASELSAN Date: 29/08/2013 I hereby declare that all information in this document has been obtained and presented in accordance with academic rules and ethical conduct. I also declare that, as required by these rules and conduct, I have fully cited and referenced all material and results that are not original to this work. Name, Last name: Erman Çağan ÖZDEMİR Signature: iii ABSTRACT DESIGN OF A RECONNAISSANCE AND SURVEILLANCE ROBOT Özdemir, Erman Çağan M.Sc., Department of Mechanical Engineering Supervisor: Prof. Dr. Eres Söylemez August 2013, 72 pages Scope of this thesis is to design a man portable robot which is capable of carrying out reconnaissance and surveillance missions.
  • US Ground Forces Robotics and Autonomous Systems (RAS)
    U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress Updated November 20, 2018 Congressional Research Service https://crsreports.congress.gov R45392 U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI) Summary The nexus of robotics and autonomous systems (RAS) and artificial intelligence (AI) has the potential to change the nature of warfare. RAS offers the possibility of a wide range of platforms—not just weapon systems—that can perform “dull, dangerous, and dirty” tasks— potentially reducing the risks to soldiers and Marines and possibly resulting in a generation of less expensive ground systems. Other nations, notably peer competitors Russia and China, are aggressively pursuing RAS and AI for a variety of military uses, raising considerations about the U.S. military’s response—to include lethal autonomous weapons systems (LAWS)—that could be used against U.S. forces. The adoption of RAS and AI by U.S. ground forces carries with it a number of possible implications, including potentially improved performance and reduced risk to soldiers and Marines; potential new force designs; better institutional support to combat forces; potential new operational concepts; and possible new models for recruiting and retaining soldiers and Marines. The Army and Marines have developed and are executing RAS and AI strategies that articulate near-, mid-, and long-term priorities. Both services have a number of RAS and AI efforts underway and are cooperating in a number of areas. A fully manned, capable, and well-trained workforce is a key component of military readiness. The integration of RAS and AI into military units raises a number of personnel-related issues that may be of interest to Congress, including unit manning changes, recruiting and retention of those with advanced technical skills, training, and career paths.
  • Motion Control Design for Unmanned Ground Vehicle in Dynamic Environment Using Intelligent Controller
    Motion Control Design for Unmanned Ground Vehicle in Dynamic Environment Using Intelligent Controller Auday Al-Mayyahi, Weiji Wang, Alaa Hussien and Philip Birch Department of Engineering and Design, University of Sussex, Brighton, United Kingdom Abstract Purpose - The motion control of unmanned ground vehicles is a challenge in the industry of automation. In this paper, a fuzzy inference system based on sensory information is proposed for the purpose of solving the navigation challenge of unmanned ground vehicles in cluttered and dynamic environments. Design/methodology/approach - The representation of the dynamic environment is a key element for the operational field and for the testing of the robotic navigation system. If dynamic obstacles move randomly in the operation field, the navigation problem becomes more complicated due to the coordination of the elements for accurate navigation and collision-free path within the environmental representations. This paper considers the construction of the fuzzy inference system which consists of two controllers. The first controller uses three sensors based on the obstacles distances from the front, right and left. The second controller employs the angle difference between the heading of the vehicle and the targeted angle to obtain the optimal route based on the environment and reach the desired destination with minimal running power and delay. The proposed design shows an efficient navigation strategy that overcomes the current navigation challenges in dynamic environments. Findings - Experimental analyses conducted for three different scenarios to investigate the validation and effectiveness of the introduced controllers based on the fuzzy inference system.
  • Robotics in Scansorial Environments
    University of Pennsylvania ScholarlyCommons Departmental Papers (ESE) Department of Electrical & Systems Engineering 2005 Robotics in Scansorial Environments Kellar Autumn Lewis & Clark College Martin Buehler Mark Cutkosky Stanford University Ronald Fearing University of California - Berkeley Robert J. Full University of California - Berkeley See next page for additional authors Follow this and additional works at: https://repository.upenn.edu/ese_papers Part of the Electrical and Computer Engineering Commons, and the Systems Engineering Commons Recommended Citation Kellar Autumn, Martin Buehler, Mark Cutkosky, Ronald Fearing, Robert J. Full, Daniel Goldman, Richard Groff, William Provancher, Alfred A. Rizzi, Uluc Saranli, Aaron Saunders, and Daniel Koditschek, "Robotics in Scansorial Environments", Proceedings of SPIE 5804, 291-302. January 2005. http://dx.doi.org/10.1117/12.606157 This paper is posted at ScholarlyCommons. https://repository.upenn.edu/ese_papers/877 For more information, please contact [email protected]. Robotics in Scansorial Environments Abstract We review a large multidisciplinary effort to develop a family of autonomous robots capable of rapid, agile maneuvers in and around natural and artificial vertical terrains such as walls, cliffs, caves, trees and rubble. Our robot designs are inspired by (but not direct copies of) biological climbers such as cockroaches, geckos, and squirrels. We are incorporating advanced materials (e.g., synthetic gecko hairs) into these designs and fabricating them using state of the art rapid prototyping techniques (e.g., shape deposition manufacturing) that permit multiple iterations of design and testing with an effective integration path for the novel materials and components. We are developing novel motion control techniques to support dexterous climbing behaviors that are inspired by neuroethological studies of animals and descended from earlier frameworks that have proven analytically tractable and empirically sound.