Hybrid Sensor Fusion for Unmanned Ground Vehicle

Total Pages: 16

File Type: PDF, Size: 1020 KB

This document is downloaded from DR-NTU (https://dr.ntu.edu.sg), Nanyang Technological University, Singapore.

Guan, M. (2020). Hybrid sensor fusion for unmanned ground vehicle. Doctoral thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/144485 https://doi.org/10.32657/10356/144485

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). Downloaded on 09 Oct 2021 04:08:30 SGT.

HYBRID SENSOR FUSION FOR UNMANNED GROUND VEHICLE

MINGYANG GUAN

School of Electrical & Electronic Engineering

A thesis submitted to the Nanyang Technological University in partial fulfillment of the requirements for the degree of Doctor of Philosophy

2020

Statement of Originality

I hereby certify that the work embodied in this thesis is the result of original research, is free of plagiarised materials, and has not been submitted for a higher degree to any other University or Institution.

Date: 25-03-2020
Mingyang Guan

Supervisor Declaration Statement

I have reviewed the content and presentation style of this thesis and declare it is free of plagiarism and of sufficient grammatical clarity to be examined. To the best of my knowledge, the research and writing are those of the candidate except as acknowledged in the Author Attribution Statement. I confirm that the investigations were conducted in accord with the ethics policies and integrity standards of Nanyang Technological University and that the research data are presented honestly and without prejudice.

Date: 25-03-2020
Changyun Wen

Authorship Attribution Statement

This thesis contains material from 2 papers published in the following peer-reviewed journal and conference, and 3 papers under review, in which I was the first or joint first author.

Chapter 3 is published as Guan, M., Wen, C., Shan, M., Ng, C. L., and Zou, Y. Real-time event-triggered object tracking in the presence of model drift and occlusion. IEEE Transactions on Industrial Electronics, 66(3), 2054-2065 (2018). DOI: 10.1109/TIE.2018.2835390. The contributions of the co-authors are as follows: Prof. Wen provided the initial idea. I designed the event-triggered tracking framework and the algorithm for online object relocation. I co-designed the study with Prof. Wen and performed all the experimental work at the ST-NTU Corplab. I also analyzed the data. Mr. Ng and Ms. Zou assisted in collecting the experimental results. I prepared the manuscript drafts. The manuscript was revised together with Prof. Wen and Dr. Shan.

Chapter 4 is accepted as Guan, M., and Wen, C. Adaptive Multi-feature Reliability Re-determination Correlation Filter for Visual Tracking. IEEE Transactions on Multimedia. The contributions of the co-authors are as follows: Prof. Wen provided the initial idea. I co-designed the study with Prof. Wen and performed all the experimental work at the ST-NTU Corplab. I proposed the tracking framework and two solutions for finding the reliability score of each feature. I prepared the manuscript drafts, which were revised by Prof. Wen.

Chapter 5 is published as Song, Y.*, Guan, M.*, Tay, W. P., Law, C. L., and Wen, C. UWB/LiDAR Fusion For Cooperative Range-Only SLAM. IEEE International Conference on Robotics and Automation (ICRA), pp. 6568-6574, 2019 May. DOI: 10.1109/ICRA.2019.8794222. The authors marked with * are joint first authors of this publication.
The contributions of the co-authors are as follows: Prof. Wen suggested the idea of fusing UWB and LiDAR. I wrote the drafts related to LiDAR SLAM, and Dr. Song prepared the drafts related to UWB localization. I co-designed the fusion framework with Dr. Song. I designed and implemented all the experiments for the proposed method. The manuscript was revised together with Prof. Wen and Dr. Song. Dr. Song implemented the experiments related to UWB localization. Prof. Tay and Prof. Law provided advice on UWB sensors.

Chapter 6 is under review as Guan, M., Wen, C., Song, Y., and Tay, W. P. Autonomous Exploration Using UWB and LiDAR. IEEE Transactions on Industrial Electronics. The contributions of the co-authors are as follows: Prof. Wen suggested the idea of fusing UWB/LiDAR for autonomous exploration. I proposed a particle filter based step-by-step optimization framework to refine the states of the robot and UWB beacons. I prepared the manuscript drafts. The manuscript was revised together with Prof. Wen and Dr. Song. Prof. Tay provided advice on UWB sensors. I co-designed the study with Prof. Wen and performed all the experimental work at the ST-NTU Corplab.

Chapter 7 is under review as Guan, M., Wen, C., and Song, Y. Autonomous Exploration via Region-aware Least-explored Guided Rapidly-exploring Random Trees. Journal of Field Robotics. The contributions of the co-authors are as follows: Prof. Wen suggested the idea of fusing UWB/LiDAR for autonomous exploration. I proposed a dual-UWB robot system and least-explored guided RRTs for autonomous exploration. I prepared the manuscript drafts. The manuscript was revised together with Prof. Wen and Dr. Song. I co-designed the study with Prof. Wen and performed all the experimental work at the ST-NTU Corplab.

Date: 25-03-2020
Mingyang Guan

Acknowledgements

First of all, I wish to express my greatest gratitude and deepest appreciation to my advisor, Prof. Changyun Wen, for his continuous support, professional guidance and sincere encouragement throughout my PhD study. Prof. Wen's serious scientific attitude, rigorous scholarship and optimistic outlook on life will always inspire me to work harder and live happier in the future. This thesis would not have been possible without his brilliant ideas and extraordinary drive for research.

Secondly, I would like to express my special thanks to Dr. Mao Shan, Dr. Zhe Wei, Dr. Zhengguo Li and Dr. Yang Song for their instruction, encouragement and assistance in my Ph.D research. When I began my Ph.D study, Mao helped me to quickly get familiar with the important technologies involved in robotics. He always discussed problems with me patiently and helped me find solutions when I encountered difficulties. After Mao left NTU, Zhe helped me to conquer some hard issues of the "Smart Wheelchair" project and to analyze the experimental results. Later, Zhengguo guided me a lot on problem formulation and solving, and he also shared with me valuable knowledge and research directions. During the last years of my Ph.D study, Yang helped me in both theoretical and experimental studies. We have had pleasant and encouraging discussions about both the project and my Ph.D study. Overall, their kind support helped me overcome many difficulties during my PhD study.

Thirdly, I want to thank my colleagues and friends, Dr. Xiucai Huang, Dr. Renjie He, Dr. Fanghong Guo, Dr. Fei Kou, Ms. Ying Zou, Dr. Lantao Xing, Mr. Ruibin Jing, Dr. Jie Ding, Dr. Jingjing Huang and Dr. Hui Gao, who are in Prof. Wen's group, and Dr.
Yuanzhe Wang, Mr. Chongxiao Wang, Mr. Yijie Zeng, Mr. Mok Bo Chuan, Dr. Yunyun Huang, Mr. Yan Xu, Mr. Kok Ming Lee, Mr. Paul Tan, Mr. Pek Kian Huat Alex and Mr. Song Guang Ho, who are in the ST Engineering-NTU Corporate Laboratory, for the experience of studying and working with them, and also for their generous help in countless experiments.

Last but not least, I would like to express my deepest thanks to my parents, my wife as well as other family members, for their endless love and unswerving support.

Abstract

Unmanned ground vehicles (UGVs) have been applied to execute many important tasks in real-world scenarios such as surveillance, exploration of hazardous environments and autonomous transportation. A UGV is a complex system, as it integrates several challenging technologies such as simultaneous localization and mapping (SLAM), collision-free navigation, and robotic perception. Generally, the navigation and control of UGVs in Global Positioning System (GPS) denied environments (e.g., indoor scenarios) depend critically on the SLAM system, which provides the localization service for UGVs, while robotic perception endows UGVs with the ability to understand their surrounding environments, for example by continuously tracking moving obstacles and then filtering them out in the localization process. In this thesis, we concentrate on two topics involving autonomous robotic systems, namely SLAM and visual object tracking.

The first part of this thesis focuses on visual object tracking, which estimates the motion state of a given target based on its appearance information. Though many promising tracking models have been proposed in the past decade, several challenges remain to be addressed, such as computational efficiency and tracking-model drift due to illumination variation, motion blur, occlusion and deformation. We address these issues by proposing two trackers: 1) an event-triggered tracking (ETT) framework, which lets an efficient short-term tracker (i.e., a correlation filter based tracker) carry out the tracking task most of the time and triggers a restoration of the short-term tracker once it fails to track the target, thus achieving a balance between tracking accuracy and efficiency; 2) a reliability re-determinative correlation filter (RRCF), which exploits multiple feature representations to robustify the tracking model. In addition, we propose two different weight solvers to adaptively adjust the importance of each feature. Extensive experiments on several large datasets validate that: 1) the proposed tracking framework enhances the robustness of the tracking model, and 2) the two proposed weight solvers can effectively find the optimal weight for each feature. As expected, the two proposed trackers indeed improve accuracy and robustness compared to state-of-the-art trackers.
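To make the event-triggered idea above concrete, the following sketch shows one way such a loop could be organized. It is illustrative only: the class names, the confidence threshold and the restore step are assumptions, not the implementation described in the thesis.

```python
# Minimal sketch of an event-triggered tracking loop (illustrative only).
# CorrelationFilterTracker, Redetector and the 0.35 threshold are
# hypothetical placeholders, not the thesis implementation.

class CorrelationFilterTracker:
    def init(self, frame, box):
        self.box = box                      # (x, y, w, h)

    def update(self, frame):
        # A real tracker returns the new box and a peak-response score.
        return self.box, 1.0

class Redetector:
    def relocate(self, frame, last_box):
        # A real re-detector searches the whole frame for the target.
        return last_box

def track(frames, init_box, conf_thresh=0.35):
    """Run the fast tracker every frame; trigger re-detection on low confidence."""
    tracker, redetector = CorrelationFilterTracker(), Redetector()
    tracker.init(frames[0], init_box)
    boxes = [init_box]
    for frame in frames[1:]:
        box, conf = tracker.update(frame)
        if conf < conf_thresh:              # "event": likely drift or occlusion
            box = redetector.relocate(frame, box)
            tracker.init(frame, box)        # restore the short-term tracker
        boxes.append(box)
    return boxes
```

The point of the event-triggered structure is that the expensive re-detection step runs only when the cheap tracker's confidence signals drift or occlusion, so the average per-frame cost stays close to that of the correlation filter alone.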
Recommended publications
  • AI, Robots, and Swarms: Issues, Questions, and Recommended Studies
    AI, Robots, and Swarms Issues, Questions, and Recommended Studies Andrew Ilachinski January 2017 Approved for Public Release; Distribution Unlimited. This document contains the best opinion of CNA at the time of issue. It does not necessarily represent the opinion of the sponsor. Distribution Approved for Public Release; Distribution Unlimited. Specific authority: N00014-11-D-0323. Copies of this document can be obtained through the Defense Technical Information Center at www.dtic.mil or contact CNA Document Control and Distribution Section at 703-824-2123. Photography Credits: http://www.darpa.mil/DDM_Gallery/Small_Gremlins_Web.jpg; http://4810-presscdn-0-38.pagely.netdna-cdn.com/wp-content/uploads/2015/01/ Robotics.jpg; http://i.kinja-img.com/gawker-edia/image/upload/18kxb5jw3e01ujpg.jpg Approved by: January 2017 Dr. David A. Broyles Special Activities and Innovation Operations Evaluation Group Copyright © 2017 CNA Abstract The military is on the cusp of a major technological revolution, in which warfare is conducted by unmanned and increasingly autonomous weapon systems. However, unlike the last “sea change,” during the Cold War, when advanced technologies were developed primarily by the Department of Defense (DoD), the key technology enablers today are being developed mostly in the commercial world. This study looks at the state-of-the-art of AI, machine-learning, and robot technologies, and their potential future military implications for autonomous (and semi-autonomous) weapon systems. While no one can predict how AI will evolve or predict its impact on the development of military autonomous systems, it is possible to anticipate many of the conceptual, technical, and operational challenges that DoD will face as it increasingly turns to AI-based technologies.
  • Algorithmic Fog of War: When Lack of Transparency Violates the Law of Armed Conflict
    Journal of Future Robot Life -1 (2021) 1–24, DOI 10.3233/FRL-200019, IOS Press. Algorithmic fog of war: When lack of transparency violates the law of armed conflict. Jonathan Kwik and Tom Van Engers, Faculty of Law, University of Amsterdam, Amsterdam 1001NA, Netherlands. Abstract. Under international law, weapon capabilities and their use are regulated by legal requirements set by International Humanitarian Law (IHL). Currently, there are strong military incentives to equip capabilities with increasingly advanced artificial intelligence (AI), which include opaque (less transparent) models. As opaque models sacrifice transparency for performance, it is necessary to examine whether their use remains in conformity with IHL obligations. First, we demonstrate that the incentives for automation drive AI toward complex task areas and dynamic and unstructured environments, which in turn necessitates resort to more opaque solutions. We subsequently discuss the ramifications of opaque models for foreseeability and explainability. Then, we analyse their impact on IHL requirements from a development, pre-deployment and post-deployment perspective. We find that while IHL does not regulate opaque AI directly, the lack of foreseeability and explainability frustrates the fulfilment of key IHL requirements to the extent that the use of fully opaque AI could violate international law. States are urged to implement interpretability during development and seriously consider the challenging complication of determining the appropriate balance between transparency and performance in their capabilities. Keywords: Transparency, interpretability, foreseeability, weapon,
  • Design of a Smart Unmanned Ground Vehicle for Hazardous Environments
    Design of a Smart Unmanned Ground Vehicle for Hazardous Environments. Saurav Chakraborty, Tyfone Communications Development (I) Pvt. Ltd., ITPL, White Field, Bangalore 560092, INDIA; Subhadip Basu, Computer Science & Engineering Dept., Jadavpur University, Kolkata – 700032, INDIA.
    Abstract. A smart Unmanned Ground Vehicle (UGV) is designed and developed for some application specific missions to operate predominantly in hazardous environments. In our work, we have developed a small and lightweight vehicle to operate in general cross-country terrains in or without daylight. The UGV can send visual feedback to the operator at a remote location. Onboard infrared sensors can detect the obstacles around the UGV and send signals to the operator.
    Key Words. Unmanned Ground Vehicle, Navigation Control, Onboard Sensor.
    1. Introduction. Robotics is an important field of interest in the modern age of automation. Unlike a human being, a computer controlled robot can work with speed and accuracy without feeling exhausted. A robot can also perform preassigned tasks in a hazardous environment, reducing the risks. Some organizing principle is needed, based on the characteristics of each system, such as: the purpose of the development effort (often the performance of some application-specific mission); the specific reasons for choosing a UGV solution for the application (e.g., hazardous environment, strength or endurance requirements, size limitation etc.); the technological challenges, in terms of functionality, performance, or cost, posed by the application; the system's intended operating area (e.g., indoor environments, anywhere indoors, outdoors on roads, general cross-country terrain, the deep seafloor, etc.); the vehicle's mode of locomotion (e.g., wheels, tracks, or legs); and how the vehicle's path is determined (i.e., control and navigation techniques employed).
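As a rough illustration of the obstacle-alert behaviour described in the excerpt above, an onboard loop could poll the infrared range sensors and notify the remote operator when something comes within range. The sensor-reading function, the UDP address and the 0.5 m threshold are assumptions for this sketch, not details from the paper.

```python
# Illustrative onboard loop: poll IR range sensors and alert the operator.
# read_ir_distance() and OPERATOR_ADDR are hypothetical stand-ins.
import json
import socket
import time

OPERATOR_ADDR = ("192.168.1.10", 5005)     # assumed operator-station address
ALERT_RANGE_M = 0.5                        # assumed obstacle alert distance

def read_ir_distance(sensor_id):
    """Placeholder for reading one onboard IR range sensor, in metres."""
    return 2.0

def obstacle_alert_loop(sensor_ids=("front", "left", "right"), period_s=0.1):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        readings = {s: read_ir_distance(s) for s in sensor_ids}
        near = {s: d for s, d in readings.items() if d < ALERT_RANGE_M}
        if near:                            # send an alert only when needed
            sock.sendto(json.dumps(near).encode(), OPERATOR_ADDR)
        time.sleep(period_s)
```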
  • Increasing the Trafficability of Unmanned Ground Vehicles Through Intelligent Morphing
    Increasing the Trafficability of Unmanned Ground Vehicles through Intelligent Morphing. Siddharth Odedra 1, Dr Stephen Prior 1, Dr Mehmet Karamanoglu 1, Dr Siu-Tsen Shen 2. 1 Product Design and Engineering, Middlesex University, Trent Park Campus, Bramley Road, London N14 4YZ, U.K. [email protected] [email protected] [email protected] 2 Department of Multimedia Design, National Formosa University, 64 Wen-Hua Road, Hu-Wei 63208, YunLin County, Taiwan R.O.C. [email protected]
    Abstract. Unmanned systems are used where humans are either unable or unwilling to operate, but only if they can perform as good as, if not better than us. Systems must become more autonomous so that they can operate without assistance, relieving the burden of controlling and monitoring them, and to do that they need to be more intelligent and highly capable. In terms of ground vehicles, their primary objective is to be able to travel from A to B where the system's success or failure is determined by its mobility, for which terrain is the key element. This paper explores the concept of creating a more autonomous system by making it more perceptive about the terrain, and with reconfigurable elements, making it more capable of traversing it.
    2. UNMANNED GROUND VEHICLES. Unmanned Ground Vehicles can be defined as mechanised systems that operate on ground surfaces and serve as an extension of human capabilities in unreachable or unsafe areas. They are used for many things such as cleaning, transportation, security, exploration, rescue and bomb disposal. UGVs come in many different configurations usually defined by the task at hand and the environment they must operate in, and are either remotely controlled by the user, pre-programmed to carry out specific
  • Sensors and Measurements for Unmanned Systems: an Overview
    sensors Review. Sensors and Measurements for Unmanned Systems: An Overview. Eulalia Balestrieri 1,*, Pasquale Daponte 1, Luca De Vito 1 and Francesco Lamonaca 2. 1 Department of Engineering, University of Sannio, 82100 Benevento, Italy; [email protected] (P.D.); [email protected] (L.D.V.) 2 Department of Computer Science, Modeling, Electronics and Systems (DIMES), University of Calabria, 87036 Rende, CS, Italy; [email protected] * Correspondence: [email protected] Abstract: The advance of technology has enabled the development of unmanned systems/vehicles used in the air, on the ground or on/in the water. The application range for these systems is continuously increasing, and unmanned platforms continue to be the subject of numerous studies and research contributions. This paper deals with the role of sensors and measurements in ensuring that unmanned systems work properly, meet the requirements of the target application, provide and increase their navigation capabilities, and suitably monitor and gain information on several physical quantities in the environment around them. Unmanned system types and the critical environmental factors affecting their performance are discussed. The measurements that these kinds of vehicles can carry out are presented and discussed, while also describing the most frequently used on-board sensor technologies, as well as their advantages and limitations. The paper provides some examples of sensor specifications related to some current applications, as well as describing the recent research contributions in the field. Keywords: unmanned systems; UAV; UGV; USV; UUV; sensors; payload; challenges. Citation: Balestrieri, E.; Daponte, P.; De Vito, L.; Lamonaca, F. Sensors and Measurements for Unmanned Systems: An Overview.
  • Design of a Reconnaissance and Surveillance Robot
    DESIGN OF A RECONNAISSANCE AND SURVEILLANCE ROBOT. A THESIS SUBMITTED TO THE GRADUATE SCHOOL OF NATURAL AND APPLIED SCIENCES OF MIDDLE EAST TECHNICAL UNIVERSITY BY ERMAN ÇAĞAN ÖZDEMİR IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE IN MECHANICAL ENGINEERING, AUGUST 2013. Approval of the thesis: DESIGN OF A RECONNAISSANCE AND SURVEILLANCE ROBOT, submitted by ERMAN ÇAĞAN ÖZDEMİR in partial fulfillment of the requirements for the degree of Master of Science in Mechanical Engineering Department, Middle East Technical University by: Prof. Dr. Canan Özgen, Dean, Graduate School of Natural and Applied Sciences; Prof. Dr. Süha Oral, Head of Department, Mechanical Engineering; Prof. Dr. Eres Söylemez, Supervisor, Mechanical Engineering Dept., METU. Examining Committee Members: Prof. Dr. Kemal İder, Mechanical Engineering Dept., METU; Prof. Dr. Eres Söylemez, Mechanical Engineering Dept., METU; Prof. Dr. Kemal Özgören, Mechanical Engineering Dept., METU; Asst. Prof. Dr. Buğra Koku, Mechanical Engineering Dept., METU; Alper Erdener, M.Sc., Project Manager, Unmanned Systems, ASELSAN. Date: 29/08/2013. I hereby declare that all information in this document has been obtained and presented in accordance with academic rules and ethical conduct. I also declare that, as required by these rules and conduct, I have fully cited and referenced all material and results that are not original to this work. Name, Last name: Erman Çağan ÖZDEMİR. Signature. ABSTRACT. DESIGN OF A RECONNAISSANCE AND SURVEILLANCE ROBOT. Özdemir, Erman Çağan, M.Sc., Department of Mechanical Engineering. Supervisor: Prof. Dr. Eres Söylemez. August 2013, 72 pages. Scope of this thesis is to design a man portable robot which is capable of carrying out reconnaissance and surveillance missions.
  • US Ground Forces Robotics and Autonomous Systems (RAS)
    U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress Updated November 20, 2018 Congressional Research Service https://crsreports.congress.gov R45392 U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI) Summary The nexus of robotics and autonomous systems (RAS) and artificial intelligence (AI) has the potential to change the nature of warfare. RAS offers the possibility of a wide range of platforms—not just weapon systems—that can perform “dull, dangerous, and dirty” tasks— potentially reducing the risks to soldiers and Marines and possibly resulting in a generation of less expensive ground systems. Other nations, notably peer competitors Russia and China, are aggressively pursuing RAS and AI for a variety of military uses, raising considerations about the U.S. military’s response—to include lethal autonomous weapons systems (LAWS)—that could be used against U.S. forces. The adoption of RAS and AI by U.S. ground forces carries with it a number of possible implications, including potentially improved performance and reduced risk to soldiers and Marines; potential new force designs; better institutional support to combat forces; potential new operational concepts; and possible new models for recruiting and retaining soldiers and Marines. The Army and Marines have developed and are executing RAS and AI strategies that articulate near-, mid-, and long-term priorities. Both services have a number of RAS and AI efforts underway and are cooperating in a number of areas. A fully manned, capable, and well-trained workforce is a key component of military readiness. The integration of RAS and AI into military units raises a number of personnel-related issues that may be of interest to Congress, including unit manning changes, recruiting and retention of those with advanced technical skills, training, and career paths.
  • Autonomous Unmanned Ground Vehicle and Indirect Driving
    Autonomous Unmanned Ground Vehicle and Indirect Driving. ABSTRACT: This paper describes the design and challenges faced in the development of an Autonomous Unmanned Ground Vehicle (AUGV) demonstrator and an Indirect Driving Testbed (IDT). The AUGV has been modified from an M113 Armoured Personnel Carrier and has capabilities such as remote control, autonomous waypoint seeking, obstacle avoidance, road following and vehicle following. The IDT also enables the driver of an armoured vehicle to drive by looking at live video feed from cameras mounted on the exterior of the vehicle. These two technologies are complementary, and integration of the two technology enablers could pave the way for the deployment of semi-autonomous unmanned (and even manned) vehicles in battles. Ng Keok Boon, Tey Hwee Choo, Chan Chun Wah. AUGV has been tested.
  • Motion Control Design for Unmanned Ground Vehicle in Dynamic Environment Using Intelligent Controller
    Motion Control Design for Unmanned Ground Vehicle in Dynamic Environment Using Intelligent Controller Auday Al-Mayyahi, Weiji Wang, Alaa Hussien and Philip Birch Department of Engineering and Design, University of Sussex, Brighton, United Kingdom Abstract Purpose - The motion control of unmanned ground vehicles is a challenge in the industry of automation. In this paper, a fuzzy inference system based on sensory information is proposed for the purpose of solving the navigation challenge of unmanned ground vehicles in cluttered and dynamic environments. Design/methodology/approach - The representation of the dynamic environment is a key element for the operational field and for the testing of the robotic navigation system. If dynamic obstacles move randomly in the operation field, the navigation problem becomes more complicated due to the coordination of the elements for accurate navigation and collision-free path within the environmental representations. This paper considers the construction of the fuzzy inference system which consists of two controllers. The first controller uses three sensors based on the obstacles distances from the front, right and left. The second controller employs the angle difference between the heading of the vehicle and the targeted angle to obtain the optimal route based on the environment and reach the desired destination with minimal running power and delay. The proposed design shows an efficient navigation strategy that overcomes the current navigation challenges in dynamic environments. Findings - Experimental analyses conducted for three different scenarios to investigate the validation and effectiveness of the introduced controllers based on the fuzzy inference system.
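The two-controller structure described in this excerpt can be sketched roughly as follows. The membership shape, gains and blending rule below are assumptions for illustration and are not the authors' fuzzy inference system: one stage converts the front/left/right obstacle distances into an avoidance command, another converts the heading error to the target into a goal-seeking command, and the two are blended according to how close the nearest obstacle is.

```python
# Sketch of a two-stage navigation controller in the spirit of the paper:
# obstacle avoidance from (front, left, right) distances plus goal seeking
# from the heading error. Membership functions and gains are assumed.
def near(dist, full=0.3, zero=1.5):
    """Triangular-style membership: 1 when very close, 0 when far (metres)."""
    if dist <= full:
        return 1.0
    if dist >= zero:
        return 0.0
    return (zero - dist) / (zero - full)

def avoidance_steer(front, left, right):
    """Steer away from the closer side; slow down when the front is blocked."""
    steer = near(right) - near(left)        # positive = turn left, away from right
    speed = 1.0 - near(front)
    return steer, speed

def goal_steer(heading_error_rad, gain=0.8):
    """Proportional goal-seeking term on the heading error to the target."""
    return gain * heading_error_rad

def navigate(front, left, right, heading_error_rad):
    obstacle_weight = max(near(front), near(left), near(right))
    avoid, speed = avoidance_steer(front, left, right)
    goal = goal_steer(heading_error_rad)
    # Blend: obstacles dominate when close, the goal dominates in free space.
    steer = obstacle_weight * avoid + (1.0 - obstacle_weight) * goal
    return steer, speed
```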
  • Robotics in Scansorial Environments
    University of Pennsylvania ScholarlyCommons Departmental Papers (ESE) Department of Electrical & Systems Engineering 2005 Robotics in Scansorial Environments Kellar Autumn Lewis & Clark College Martin Buehler Mark Cutkosky Stanford University Ronald Fearing University of California - Berkeley Robert J. Full University of California - Berkeley See next page for additional authors Follow this and additional works at: https://repository.upenn.edu/ese_papers Part of the Electrical and Computer Engineering Commons, and the Systems Engineering Commons Recommended Citation Kellar Autumn, Martin Buehler, Mark Cutkosky, Ronald Fearing, Robert J. Full, Daniel Goldman, Richard Groff, William Provancher, Alfred A. Rizzi, Uluc Saranli, Aaron Saunders, and Daniel Koditschek, "Robotics in Scansorial Environments", Proceedings of SPIE 5804, 291-302. January 2005. http://dx.doi.org/10.1117/12.606157 This paper is posted at ScholarlyCommons. https://repository.upenn.edu/ese_papers/877 For more information, please contact [email protected]. Robotics in Scansorial Environments Abstract We review a large multidisciplinary effort to develop a family of autonomous robots capable of rapid, agile maneuvers in and around natural and artificial vertical terrains such as walls, cliffs, caves, trees and rubble. Our robot designs are inspired by (but not direct copies of) biological climbers such as cockroaches, geckos, and squirrels. We are incorporating advanced materials (e.g., synthetic gecko hairs) into these designs and fabricating them using state of the art rapid prototyping techniques (e.g., shape deposition manufacturing) that permit multiple iterations of design and testing with an effective integration path for the novel materials and components. We are developing novel motion control techniques to support dexterous climbing behaviors that are inspired by neuroethological studies of animals and descended from earlier frameworks that have proven analytically tractable and empirically sound.
  • Indoor Navigation of Unmanned Grounded Vehicle Using CNN
    International Journal of Recent Technology and Engineering (IJRTE), ISSN: 2277-3878, Volume-8 Issue-6, March 2020. Indoor Navigation of Unmanned Grounded Vehicle using CNN. Arindam Jain, Ayush Singh, Deepanshu Bansal, Madan Mohan Tripathi.
    Abstract—This paper presents a hardware and software architecture for an indoor navigation of unmanned ground vehicles. It discusses the complete process of taking input from the camera to steering the vehicle in a desired direction. Images taken from a single front-facing camera are taken as input. We have prepared our own dataset of the indoor environment in order to generate data for training the network. For training, the images are mapped with steering directions, those are, left, right, forward or reverse. The pre-trained convolutional neural network (CNN) model then predicts the direction to steer in. The model then gives this output direction to the microprocessor, which in turn controls the motors to transverse in that direction. With minimum amount of training data and time taken for training, very accurate results were obtained, both in the
    also a cost effective choice compared to its alternatives such as LIDAR. The main objective of Convolutional Neural Network (CNN) is to map the characteristics of input image data to an output variable. With drastic improvement in computational power and large amounts of data available nowadays, they have proven to be tremendously useful. CNN has been long used for digital image processing and pattern recognition. This work is motivated by two primary objectives: one to develop a self-driving vehicle in minimum possible budget for hardware components. Nowadays, a lot of research is done towards the development of self-driving vehicles.
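A minimal sketch of the camera-to-steering pipeline that this excerpt describes is given below. The layer sizes, the 64x64 input resolution and the class ordering are assumptions, not the authors' network; the intent is only to show an image classifier whose four output classes map directly to motor commands.

```python
# Minimal sketch of a CNN that maps a front-camera image to one of four
# steering commands (left, right, forward, reverse). Layer sizes, the
# 64x64 input resolution and the class ordering are assumptions.
import torch
import torch.nn as nn

STEERING_CLASSES = ["left", "right", "forward", "reverse"]

class SteeringCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, len(STEERING_CLASSES))

    def forward(self, x):                    # x: (batch, 3, 64, 64)
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example: predict a steering direction for one (dummy) camera frame.
model = SteeringCNN().eval()
frame = torch.rand(1, 3, 64, 64)
with torch.no_grad():
    direction = STEERING_CLASSES[model(frame).argmax(dim=1).item()]
print(direction)                             # the microprocessor would act on this
```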
  • Advanced C2 Vehicle Crewstation for Control of Unmanned Ground Vehicles
    UNCLASSIFIED/UNLIMITED. Advanced C2 Vehicle Crewstation for Control of Unmanned Ground Vehicles. David Dahn and Marc Gacy, PhD, Micro Analysis & Design Inc., 4949 Pearl East Circle, Ste 300, Boulder, CO 80301, UNITED STATES, Tele: (303) 442-6947, Fax: (303) 442-8274, Email: [email protected] / [email protected]; Bruce Brendle, U.S. Army Tank-automotive & Armaments Command, AMSTA-TR-R (MS 264: Brendle), Warren, MI 48397-5000, UNITED STATES, Phone: 586 574-5798, Fax: 586 574-8684, [email protected]. 1.0 INTRODUCTION. 1.1 Rationale. Future military operational requirements may utilize unmanned or unattended assets for certain military missions that currently place military personnel in harms way for performing jobs and tasks that can be accomplished with robotic assets. To answer these needs, the US Army established two advanced technology demonstration (ATD) programs, the Crew integration & Automation Testbed (CAT) and the Robotic Follower (RF), for the development, integration and demonstration of technologies for reduced crew operations and unmanned vehicle control. The Tank Automotive Research, Development and Engineering Center (TARDEC) of the U.S. Army Tank-automotive and Armaments Command (TACOM) is executing these programs together in an effort entitled Vetronics Technology Integration (VTI). VTI comprises a number of different commercial, and government participants working research issues for near and far term vehicular electronics and robotic challenges. 1.2 Context. The use of robotics in the near term will be through soldier-robot teams. A great deal of research is being conducted into developing semi-autonomous robotic systems and our research and development efforts are focused on defining new interfaces to empower the warfighter to employ these autonomous systems while conducting their existing mission (Dahn and Gacy, 2002).