A Survey of Behavior Learning Applications in Robotics
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
Global Education Entry Points Into the Curriculum: a Guide for Teacher Librarians
DOCUMENT RESUME ED 415 155 SO 028 109. AUTHOR: Blakey, Elaine; Maguire, Gerry; Steward, Kaye. TITLE: Global Education Entry Points into the Curriculum: A Guide for Teacher Librarians. INSTITUTION: Alberta Teachers Association, Edmonton, Learning Resources Council; Alberta Global Education Project, Edmonton. PUB DATE: 1991. NOTE: 74p. AVAILABLE FROM: Alberta Global Education Project, 11010 142 Street, Edmonton, Alberta T5N 2R1 Canada. PUB TYPE: Guides, Non-Classroom (055). EDRS PRICE: MF01/PC03 Plus Postage. DESCRIPTORS: Elementary Secondary Education; Foreign Countries; *Global Approach; *Global Education; Interdisciplinary Approach; *Justice; *Peace; *Resource Materials; Social Studies; Sustainable Development. IDENTIFIERS: Alberta.
ABSTRACT: This information packet is useful to teacher-librarians and teachers who would like to integrate global education concepts into existing curricula. The techniques outlined in this document provide strategies for implementing global education integration. The central ideas of the global education package include: (1) interrelatedness; (2) peace; (3) global community; (4) cooperation; (5) distribution and sustainable development; (6) multicultural understanding; (7) human rights; (8) stewardship; (9) empowerment; and (10) social justice. Throughout the packet, ideas are offered for inclusion of global perspectives in language arts, science, mathematics, and social studies. Recommendations are included for purchases of resource materials and cross-reference charts for concepts across grades and curriculum areas. (EH)
AI, Robots, and Swarms: Issues, Questions, and Recommended Studies
Andrew Ilachinski, CNA, January 2017. Approved for public release; distribution unlimited. Copyright © 2017 CNA.
Abstract: The military is on the cusp of a major technological revolution, in which warfare is conducted by unmanned and increasingly autonomous weapon systems. However, unlike the last "sea change" during the Cold War, when advanced technologies were developed primarily by the Department of Defense (DoD), the key technology enablers today are being developed mostly in the commercial world. This study looks at the state of the art of AI, machine learning, and robot technologies, and their potential future military implications for autonomous (and semi-autonomous) weapon systems. While no one can predict how AI will evolve or what its impact on the development of military autonomous systems will be, it is possible to anticipate many of the conceptual, technical, and operational challenges that DoD will face as it increasingly turns to AI-based technologies.
Design and Evaluation of Modular Robots for Maintenance in Large Scientific Facilities
Universidad Politécnica de Madrid, Escuela Técnica Superior de Ingenieros Industriales, Departamento de Automática, Ingeniería Electrónica e Informática Industrial. PhD thesis: Design and Evaluation of Modular Robots for Maintenance in Large Scientific Facilities. Author: Prithvi Sekhar Pagala, MSc. Advisors: Manuel Ferre Perez, PhD, and Manuel Armada, PhD. Madrid, 2014. Thesis committee: Dr. Fernando Matía Espada (chair); Dr. Claudio Rossi (secretary); Dr. Antonio Giménez Fernández, Dr. Juan Antonio Escalera Piña, Dr. Concepción Alicia Monje Micharet (members); Dr. Mohamed Abderrahim Fichouche, Dr. José María Azorín Proveda (alternates).
Acknowledgements: The development of this thesis led to inspiring conversations, exchanges of ideas, and expanding knowledge with amazing individuals. I would like to thank my advisers Manuel Ferre and Manuel Armada for their constant mentorship, support, and motivation to pursue different research ideas and collaborations during the course of the entire thesis. The team at the lab has enriched my life not only professionally but also in Spanish ways. Thank you to all the members of ROMIN: Francisco, Alex, Jose, Jordi, Ignacio, Javi, Chema, Luis ... This research project has been supported by a Marie Curie Early Stage Initial Training Network Fellowship of the European Community's Seventh Framework Programme "PURESAFE". I wish to thank the supervisors and fellow research members of the project for the amazing support, fruitful interactions, and training events.
Learning for Microrobot Exploration: Model-Based Locomotion, Sparse-Robust Navigation, and Low-Power Deep Classification
Nathan O. Lambert, Farhan Toddywala, Brian Liao, Eric Zhu, Lydia Lee, and Kristofer S. J. Pister.
Abstract: Building intelligent autonomous systems at any scale is challenging. The sensing and computation constraints of a microrobot platform make the problems harder. We present improvements to learning-based methods for on-board learning of locomotion, classification, and navigation of microrobots. We show how simulated locomotion can be achieved with model-based reinforcement learning via on-board sensor data distilled into control. Next, we introduce a sparse, linear detector and a Dynamic Thresholding method to FAST Visual Odometry for improved navigation in the noisy regime of mm-scale imagery. We end with a new image classifier capable of classification with fewer than one million multiply-and-accumulate (MAC) operations by combining fast downsampling, efficient layer structures, and hard activation functions. These are promising steps toward using state-of-the-art algorithms in the power-limited world of edge intelligence and microrobots.
[Fig. 1, partially recoverable caption: our vision for microrobot exploration based on three contributions, 1) improving data-efficiency of learning control, 2) a more noise-robust and novel approach to visual (odometry) ...; panel labels include Classification (fast downsampling, Squeeze-and-Excite, hard activation), Locomotion (controller parameter optimization, unknown dynamics modeling and simulator, reward), Navigation (sparse local control, PID, ground truth vs. estimated position), and an intelligent mm-scale microrobot with a system-on-chip: camera, radio, battery.]
I. INTRODUCTION. Microrobots have been touted as a coming revolution for many tasks, such as search and rescue, agriculture, or distributed sensing [1], [2]. Microrobotics is a synthesis of microelectromechanical systems (MEMS), actuators, power electronics, and computation.
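One way to read the "Dynamic Thresholding" idea above is as an adaptive loop around a FAST corner detector: raise or lower the detection threshold so the keypoint count stays in a usable band despite noisy, low-resolution mm-scale imagery. The sketch below uses OpenCV's stock FAST detector with made-up band and step constants; it illustrates the general idea only and is not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): keep the FAST corner count in a
# target band by nudging the detection threshold from frame to frame.
import cv2

def detect_adaptive_fast(gray, threshold=20, target=(150, 400), step=2):
    """Detect FAST keypoints and adapt the threshold toward a target count.

    Returns (keypoints, new_threshold) so the caller can carry the adapted
    threshold over to the next frame of the visual-odometry pipeline.
    """
    detector = cv2.FastFeatureDetector_create(threshold=threshold)
    keypoints = detector.detect(gray, None)
    lo, hi = target
    if len(keypoints) < lo:           # too few corners: be more permissive
        threshold = max(1, threshold - step)
    elif len(keypoints) > hi:         # too many corners: be stricter
        threshold += step
    return keypoints, threshold
```

Carrying the threshold across frames keeps the feature count roughly constant as lighting, texture, and sensor noise change, which is what a downstream odometry stage needs.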
Robot Learning
Lecture slides: 15-494 Cognitive Robotics, David S. Touretzky & Ethan Tira-Thompson, Carnegie Mellon, Spring 2009.
What Can Robots Learn?
● Parameter tuning, e.g., for a faster walk
● Perceptual learning: ALVINN driving the Navlab
● Map learning, e.g., SLAM algorithms
● Behavior learning; plans and macro-operators: Shakey the Robot (SRI), Robo-Soar
● Learning from human teachers: operant conditioning (Skinnerbots), imitation learning
Lots of Work on Robot Learning
● IEEE Robotics and Automation Society, Technical Committee on Robot Learning: http://www.learning-robots.de
● Robot Learning Summer School, Lisbon, Portugal; July 20-24, 2009
● Workshops at major robotics conferences, e.g. the ICRA 2009 workshop "Approaches to Sensorimotor Learning on Humanoid Robots", Kobe, Japan; May 17, 2009
Parameter Optimization
● How fast can an AIBO walk? Figures from Kohl & Stone, ICRA 2004, for the ERS-210 model:
  Hand-tuned gaits: CMU (2002) 200 mm/s; German Team 230 mm/s; UT Austin Villa 245 mm/s; UNSW 254 mm/s
  Learned gaits: Hornsby (1999) 170 mm/s; UNSW 270 mm/s; UT Austin Villa 291 mm/s
Walk Parameters: 12 parameters to optimize (from Kohl & Stone, ICRA 2004)
● Front locus (height, x pos, y pos)
● Rear locus (height, x pos, y pos)
● Locus length
● Locus skew multiplier (in the x-y plane, for turning)
● Height of front of body
● Height of rear of body
● Foot travel time
● Fraction of time foot is on ground
Optimization Strategy
● "Policy gradient reinforcement learning":
  – Walk parameter assignment = "policy"
  – Estimate the gradient along each dimension by trying combinations of slight perturbations in all parameters
  – Measure walking speed on the actual robot
  – Optimize all 12 parameters simultaneously
  – Adjust parameters according to the estimated gradient (a worked sketch of this update appears below)
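A minimal sketch of the perturbation-based policy-gradient update described on these slides, following the procedure Kohl & Stone outline: evaluate a batch of randomly perturbed parameter vectors, bucket the scores per parameter, and step all twelve parameters at once along the estimated gradient. The measure_speed callback, the perturbation size epsilon, the number of test policies, and the step size eta are illustrative placeholders, not the paper's exact values.

```python
import random

def policy_gradient_step(params, measure_speed, epsilon=0.05, n_policies=15, eta=2.0):
    """One iteration of Kohl & Stone-style walk-parameter tuning (sketch).

    params: list of 12 walk parameters; measure_speed: callable that runs the
    gait (on the robot or in simulation) and returns its speed in mm/s.
    """
    n = len(params)
    # Each test policy perturbs every parameter by -eps, 0, or +eps at random.
    perturbations = [[random.choice((-epsilon, 0.0, +epsilon)) for _ in range(n)]
                     for _ in range(n_policies)]
    scores = [measure_speed([p + d for p, d in zip(params, pert)])
              for pert in perturbations]

    gradient = []
    for i in range(n):
        # Average the scores of policies grouped by how parameter i was perturbed.
        buckets = {-epsilon: [], 0.0: [], +epsilon: []}
        for pert, score in zip(perturbations, scores):
            buckets[pert[i]].append(score)
        avg = {k: (sum(v) / len(v) if v else 0.0) for k, v in buckets.items()}
        if avg[0.0] >= avg[-epsilon] and avg[0.0] >= avg[+epsilon]:
            gradient.append(0.0)          # leaving this parameter alone looked best
        else:
            gradient.append(avg[+epsilon] - avg[-epsilon])

    # Normalize the estimated gradient and move all parameters simultaneously.
    norm = sum(g * g for g in gradient) ** 0.5 or 1.0
    return [p + eta * g / norm for p, g in zip(params, gradient)]
```

Each call evaluates a handful of perturbed walks, scores them by measured speed, and nudges the whole 12-dimensional policy in the direction that looked fastest.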
Systematic Review of Research Trends in Robotics Education for Young Children
Sustainability (journal), Review article. Sung Eun Jung (Department of Educational Theory and Practice, University of Georgia, Athens, GA 30602, USA) and Eun-sok Won (Department of Liberal Arts Education, Mokwon University, Daejeon 35349, Korea); correspondence: [email protected], Tel.: +1-706-296-3001. Received: 31 January 2018; Accepted: 13 March 2018; Published: 21 March 2018.
Abstract: This study conducted a systematic and thematic review of the existing literature on robotics education using robotics kits (not social robots) for young children (pre-K and kindergarten through 5th grade). It investigated: (1) the definition of robotics education; (2) thematic patterns of key findings; and (3) theoretical and methodological traits. The results of the review present a limitation of previous research in that it has focused on robotics education only as an instrumental means to support other subjects or STEM education. The study identifies that the findings of the existing research are weighted toward outcome-focused research. Lastly, it notes that most of the existing studies used constructivist and constructionist frameworks not only to design and implement robotics curricula but also to analyze young children's engagement in robotics education. Relying on the findings of the review, the study suggests clarifying and specifying robotics-intensified knowledge, skills, and attitudes when defining robotics education in connection to computer science education. In addition, it concludes that research agendas need to be diversified and the diversity of research participants needs to be broadened. To do this, it suggests employing social and cultural theoretical frameworks and critical analytical lenses that consider children's historical, cultural, social, and institutional contexts in understanding young children's engagement in robotics education.
Hyundai Motor Group to Acquire Controlling Interest in Boston Dynamics from Softbank Group, Opening a New Chapter in the Robotics and Mobility Industry
• Hyundai Motor Group to acquire a controlling interest in Boston Dynamics, valued at $1.1 billion, with the goal of advancing robotics and mobility to realize progress for humanity
• The combination of the highly complementary technologies of Hyundai Motor Group and Boston Dynamics, and the continued partnership of SoftBank Group, will propel development and commercialization of advanced robots
• The robotics technologies will lend synergies to autonomous vehicles, UAMs and smart factories
• Hyundai Motor Group, together with Boston Dynamics, will create a robotics value chain ranging from robot component manufacturing to smart logistics solutions
BOSTON/SEOUL/TOKYO, December 11, 2020 – Hyundai Motor Group and SoftBank Group Corp. (SoftBank) today agreed on the main terms of the transaction pursuant to which Hyundai Motor Group will acquire a controlling interest in Boston Dynamics in a deal that values the mobile robot firm at $1.1 billion. The deal came as Hyundai Motor Group envisions the transformation of human life by combining world-leading robotics technologies with its mobility expertise. Financial terms were not disclosed.
Under the agreement, Hyundai Motor Group will hold an approximately 80% stake in Boston Dynamics and SoftBank, through one of its affiliates, will retain an approximately 20% stake in Boston Dynamics after the closing of the transaction. Hyundai Motor Group's affiliates - Hyundai Motor Co., Hyundai Mobis Co. and Hyundai Glovis Co. - and Hyundai Motor Group Chairman Euisun Chung respectively participated in the acquisition.
By establishing a leading presence in the field of robotics, the acquisition will mark another major step for Hyundai Motor Group toward its strategic transformation into a Smart Mobility Solution Provider.
Multi-Leapmotion Sensor Based Demonstration for Robotic Refine Tabletop Object Manipulation Task
Available online at www.sciencedirect.com. ScienceDirect, CAAI Transactions on Intelligence Technology 1 (2016) 104-113. http://www.journals.elsevier.com/caai-transactions-on-intelligence-technology/. Original article.
Haiyang Jin (a,b,c), Qing Chen (a,b), Zhixian Chen (a,b), Ying Hu (a,b,*), Jianwei Zhang (c). (a) Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China; (b) Chinese University of Hong Kong, Hong Kong, China; (c) University of Hamburg, Hamburg, Germany. Available online 2 June 2016.
Abstract: In complicated tabletop object manipulation tasks for robotic systems, demonstration-based control is an efficient way to enhance the stability of execution. In this paper, we use a new optical hand-tracking sensor, LeapMotion, to perform non-contact demonstration for robotic systems. A Multi-LeapMotion hand-tracking system is developed. The setup of the two sensors is analyzed to find an optimal way to use the information from both sensors efficiently. Meanwhile, the coordinate systems of the Multi-LeapMotion hand-tracking device and the robotic demonstration system are developed. With recognition of the element actions and delay calibration, fusion principles are developed to obtain improved and corrected gesture recognition. Gesture recognition and scenario experiments are carried out and indicate the improvement brought by the proposed Multi-LeapMotion hand-tracking system in tabletop object manipulation tasks for robotic demonstration.
Copyright © 2016, Chongqing University of Technology. Production and hosting by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
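At its core, using two hand-tracking sensors means expressing both measurements in one common frame and blending them. The sketch below shows a generic confidence-weighted fusion of a palm position reported by two sensors; the calibration matrices, confidence weights, and averaging rule are assumptions for illustration, not the paper's actual fusion principles.

```python
import numpy as np

def to_robot_frame(p_sensor, T_sensor_to_robot):
    """Transform a 3-D point from a sensor frame into the robot/demonstration
    frame using a 4x4 homogeneous calibration matrix obtained offline."""
    p_h = np.append(np.asarray(p_sensor, dtype=float), 1.0)  # homogeneous coords
    return (T_sensor_to_robot @ p_h)[:3]

def fuse_hand_position(p1, c1, T1, p2, c2, T2):
    """Confidence-weighted fusion of the palm position from two hand-tracking
    sensors (illustrative rule, not the paper's exact method)."""
    q1 = to_robot_frame(p1, T1)
    q2 = to_robot_frame(p2, T2)
    w1, w2 = max(c1, 1e-6), max(c2, 1e-6)   # guard against zero confidence
    return (w1 * q1 + w2 * q2) / (w1 + w2)

# Example: identity calibration for sensor 1, a 30 cm lateral offset for sensor 2.
T1 = np.eye(4)
T2 = np.eye(4); T2[:3, 3] = [0.30, 0.0, 0.0]
print(fuse_hand_position([0.10, 0.20, 0.30], 0.9, T1, [-0.20, 0.20, 0.30], 0.6, T2))
```

Weighting by per-sensor tracking confidence lets the better-placed sensor dominate when the hand occludes itself from the other sensor's viewpoint.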
Creative Robotics Studio Worcester Polytechnic Institute
Creative Robotics Studio. An Interactive Qualifying Project Report submitted to the faculty of the Worcester Polytechnic Institute in partial fulfillment of the requirements for the degree of Bachelor of Science. By: Graham Held, Walter Ho, Paul Raynes, Harrison Vaporciyan. Project Advisers: Scott Barton, Craig Putnam, Joshua Rosenstock.
Contents: Abstract; Acknowledgements; Introduction; 1 Background; 1.1 Relationship between Robots and Humans; 1.1.1 "Human Replacement"; 1.1.2 Humanizing Robots - Anthropomorphism; 1.1.3 Uncanny Valley; 1.1.4 Computers and Digital ...
Machine Vision for Service Robots & Surveillance
Machine Vision for Service Robots & Surveillance Prof.dr.ir. Pieter Jonker EMVA Conference 2015 Delft University of Technology Cognitive Robotics (text intro) This presentation addresses the issue that machine vision and learning systems will dominate our lives and work in future, notably through surveillance systems and service robots. Pieter Jonker’s specialism is robot vision; the perception and cognition of service robots, more specifically autonomous surveillance robots and butler robots. With robot vision comes the artificial intelligence; if you perceive and understand, than taking sensible actions becomes relative easy. As robots – or autonomous cars, or … - can move around in the world, they encounter various situations to which they have to adapt. Remembering in which situation you adapted to what is learning. Learning comes in three flavors: cognitive learning through Pattern Recognition (association), skills learning through Reinforcement Learning (conditioning) and the combination (visual servoing); such as a robot learning from observing a human how to poor in a glass of beer. But as always in life this learning comes with a price; bad teachers / bad examples. 2 | 16 Machine Vision for Service Robots & Surveillance Prof.dr.ir. Pieter Jonker EMVA Conference Athens 12 June 2015 • Professor of (Bio) Mechanical Engineering Vision based Robotics group, TU-Delft Robotics Institute • Chairman Foundation Living Labs for Care Innovation • CEO LEROVIS BV, CEO QdepQ BV, CTO Robot Care Systems Delft University of Technology Content • -
Opportunities in the 5G Buildout Miral Kim-E
Miral Kim-E, Frontier Global Partners, September 2019.
INTRODUCTION
The fundamental demand for advanced telecommunications is embedded in our DNA. People want and need to communicate. Whether by prehistoric cave drawings, messenger pigeons in the Middle Ages, or radio waves since 1896, the drive to improve the speed, breadth and affordability of communications has spawned countless industries and defined generations.
Our focus is on the latest iteration of the evolution: 5G, or the Fifth Generation of mobile telephony.
To truly understand where telecommunications are going, especially in the 5G era, one has to understand computing. When the internet added cheap, worldwide communications to the power of programming and data analysis on a personal computer, a new era was born. When that power was transferred to mobile phones, new industries were born.
With each new advance in microprocessor and memory chip technology, the PC industry saw rapid development in hardware and software. When the internet became commercially available as the "World Wide Web" in 1990, the PC and telecom industries became inexorably linked. The internet became a must-have telecommunications service (remember those dial-up modems?) and the personal computer was the only way to get it. It also marked the beginning of the shift from voice communications on the telecom network to data communications.
These PC technologies helped mobile phones evolve from that first 2-lb Motorola handheld to the powerful smartphones of today. Like PCs, the use of mobile phones has shifted from its original intent as a tool for voice communication to a tool for data communication.
SLAM and Vision-Based Humanoid Navigation Olivier Stasse
Olivier Stasse, LAAS-CNRS. To cite this version: Olivier Stasse. SLAM and Vision-based Humanoid Navigation. Humanoid Robotics: A Reference, pp. 1739-1761, 2018, ISBN 978-94-007-6047-9, doi:10.1007/978-94-007-6046-2_59. HAL id: hal-01674512, https://hal.archives-ouvertes.fr/hal-01674512, submitted on 3 Jan 2018.
1 Motivations
In order for humanoid robots to evolve autonomously in a complex environment, they have to perceive it, build an appropriate representation, localize in it, and decide which motion to realize. The relationship between the environment and the robot is rather complex, as some parts are obstacles to avoid, others possible supports for locomotion, or objects to manipulate. The affordances of objects and the environment may result in quite complex motions, ranging from bi-manual manipulation to whole-body motion generation. In this chapter, we introduce tools to realize vision-based humanoid navigation. The general structure of such a system is depicted in Fig. 1.