Genoa — Giorgio Metta & the iCub Team

iCub: a shared platform for research in robotics & AI
Giorgio Metta & the iCub team
Istituto Italiano di Tecnologia, Via Morego 30, Genoa, Italy
Genoa, June 25, 2015

we have a dream

the iCub
• price: 250 k€
• 30 iCubs distributed since 2008, about 3-4 iCubs/year

why is the iCub special?
• hands: we started the design from the hands
  – 5 fingers, 9 degrees of freedom, 19 joints
• sensors: human-like, e.g. no lasers
  – cameras, microphones, gyros, encoders, force, tactile…
• electronics: flexibility for research
  – custom electronics, small, programmable (DSPs)
• reproducible platform: community designed
  – reproducible & maintainable yet evolvable platform
  – large software repository (~2M lines of code)

why humanoids?
• scientific reasons – e.g. elephants don't play chess
• natural human-robot interaction
• challenging mechatronics
• fun!

why open source?
• repeatable experiments
• benchmarking
• quality
this resonates with industry-grade R&D in robotics

open source

series-elastic actuators
• C-spring design
• 320 Nm/rad stiffness
• features:
  – stiffness by design
  – no preloading necessary (easy assembly)
  – requires only 4 custom mechanical parts
  – high-resolution encoder for torque sensing

Yet Another Robot Platform
• YARP is an open-source (LGPL) middleware for humanoid robotics
• history:
  – 2000-2001: born on Kismet, grew on COG, under QNX
  – 2001-2002: an MIT / Univ. of Genoa collaboration
  – 2003: a major overhaul; now used by the iCub project
  – 2004-today: C++ source code (some 400K lines), IPC & hardware interface, portable across OSs and development platforms

exploit diversity: portability
• operating-system portability:
  – Adaptive Communication Environment (ACE), a C++ OS wrapper for e.g.
threads, semaphores, sockets
• development-environment portability:
  – CMake
• language portability:
  – via SWIG: Java (Matlab), Perl, Python, C#, all wrapping the same C/C++ library

wiki & manual, SVN & GIT, part lists, drawings

iCub sensors

torque control

learning dynamics
• learning body dynamics
  – compute external forces
  – implement compliant control
• so far we did it starting from e.g. the CAD models
  – but we'd like to avoid that

our method in 4 easy steps
• regularized least squares:
  f(x) = wᵀx
  min_w J(w) = ‖y − Xw‖² + λ‖w‖²  ⇒  w = (XᵀX + λI)⁻¹ Xᵀy
• kernelized:
  f(x) = Σ_{i=1..m} cᵢ k(xᵢ, x),  c = (K + λI)⁻¹ y
• approximate the kernel with random features:
  k(x, x′) = E_w[z_w(x)ᵀ z_w(x′)] ≈ (1/D) Σ_{d=1..D} z_{w_d}(x)ᵀ z_{w_d}(x′),
  with z_w(x) = [cos(wᵀx), sin(wᵀx)]ᵀ
• make it incremental:
  w = (ZᵀZ + λI)⁻¹ Zᵀy, maintained via a Cholesky rank-1 update

properties
• O(1) update complexity w.r.t. the number of training samples
• exact batch solution after each update
• the dimensionality of the feature mapping trades computation for approximation accuracy
• O(D²) time and space complexity per update w.r.t. the feature-mapping dimensionality
• easy to understand/implement (few lines of code)
• not exclusively for dynamics/robotics learning!
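The four steps above can be sketched in a few lines. This is a toy illustration, not the authors' implementation: the data (a 1-D sine), kernel bandwidth, and regularization are assumptions, and the rank-1 update below is done directly on the Gram matrix rather than on its Cholesky factor as the slides suggest.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_feature_map(dim, D, gamma=1.0):
    """Random Fourier features approximating the RBF kernel
    k(x, x') = exp(-gamma * ||x - x'||^2): draw w ~ N(0, 2*gamma*I)."""
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(D, dim))
    def z(x):
        wx = W @ x
        return np.concatenate([np.cos(wx), np.sin(wx)]) / np.sqrt(D)
    return z

class IncrementalRFRR:
    """Ridge regression on random features; O(D^2) cost per update."""
    def __init__(self, dim, lam=1e-3):
        self.A = lam * np.eye(dim)   # accumulates Z^T Z + lam*I
        self.b = np.zeros(dim)       # accumulates Z^T y
    def update(self, z, y):
        # plain rank-1 update; the slides keep a Cholesky factor instead
        self.A += np.outer(z, z)
        self.b += y * z
    def predict(self, z):
        return np.linalg.solve(self.A, self.b) @ z

# toy problem: learn y = sin(3x) one sample at a time
feat = make_feature_map(dim=1, D=200)
model = IncrementalRFRR(dim=400)  # 2*D features (cos and sin)
for x in rng.uniform(-1.0, 1.0, size=500):
    model.update(feat(np.array([x])), np.sin(3.0 * x))
err = abs(model.predict(feat(np.array([0.5]))) - np.sin(1.5))
```

Because the model keeps only the D×D accumulated matrix, the update cost is independent of how many samples have been seen, which is the O(1)-in-samples property claimed on the properties slide.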
batch experiments
• 3 inverse-dynamics datasets: Sarcos, Simulated Sarcos, Barrett [Nguyen-Tuong et al., 2009]
• approximately 15k training and 5k test samples
• comparison with LWPR, GPR, LGP, Kernel RR
• RFRR with 500, 1000, 2000 random features
• hyperparameter optimization by exploiting the functional similarity with GPR (log-marginal-likelihood optimization)

datasets

incremental experiments

temperature compensation

summary
• fast prediction and model update of RFRR:
  – 200 RF: 400 μs
  – 500 RF: 2 ms
  – 1000 RF: 7 ms
• non-stationarity: thermal sensor drift in the force components
• rapid convergence of RFRR
• no further gain from additional random features (problem specific)

experiments & model validation
static configuration: an additional six-axis F/T sensor is placed at the end effector to measure the external wrench w_e. In this experiment we consider the following quantities:
• joint torques measured by the joint torque sensors: τ_j
• joint torques computed from the arm F/T sensor: τ_FT
• joint torques estimated through the additional F/T sensor located at the end effector: τ_e = Jᵀ w_e
• joint torques predicted by the arm model (no external forces): τ_m

additional F/T sensor at the end effector:

                    Joint 0     Joint 1     Joint 2     Joint 3
E(τj − τFT)         0.127 Nm   −0.049 Nm   −0.002 Nm   −0.032 Nm
σ(τj − τFT)         0.186 Nm    0.131 Nm    0.013 Nm    0.042 Nm
E(τj − (τm + τe))   0.075 Nm   −0.098 Nm   −0.006 Nm    0.006 Nm
σ(τj − (τm + τe))   0.191 Nm    0.173 Nm    0.020 Nm    0.032 Nm

the robot skin
• ground plane: e.g. conductive fabric (parameters: mechanical properties, impedance, etc.)
• soft material: e.g. silicone (parameters: dielectric constant, mechanical stiffness, etc.)
• capacitor electrodes: etched on a flexible PCB (parameters: shape, folding, etc.)
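The validation above maps the wrench measured at the end effector into joint torques through the Jacobian transpose, τ_e = Jᵀ w_e. A minimal sketch of that mapping, on a planar two-link arm with made-up link lengths (not the iCub's kinematics):

```python
import numpy as np

def planar_2link_jacobian(q, l1=0.3, l2=0.25):
    """Linear-velocity Jacobian (2x2) of a planar two-link arm;
    link lengths are illustrative assumptions."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

q = np.array([0.4, 0.8])            # joint angles [rad]
w_e = np.array([0.0, -9.81 * 0.5])  # planar "wrench" [Fx, Fy]: holding a 0.5 kg load
tau_e = planar_2link_jacobian(q).T @ w_e  # tau_e = J^T w_e
```

The shoulder-side joint sees the larger torque because its column of J has the longer moment arm, which matches the larger residuals reported for Joint 0 in the table.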
principle
lots of sensing points

structure of the skin

tests of various materials
[plot: capacitance response C [bit] vs. contact load LC [kPa] for candidate materials — Soma Foama Hard/Soft, PDMS, Neoprene (2 mm, Lycra-backed, grid 2 mm/3 mm), SpongeRubberLycra, NeoGrid/Neobulk/Lycra combinations; the others were discarded]

latest implementation — advantages:
• good performance: gluing is done with industrial machines, no hysteresis due to glue
• production: automatic and reliable
• mounting and replacing is easy; easy ground connection
• the protective layer can be made of different materials to increase reliability
• customization: the surface can be printed

skin calibration
performed manually by poking the robot with a tool; works iteratively with different datasets taken in different robot positions
A. Del Prete, S. Denei, L. Natale, F. Mastrogiovanni, F. Nori, G. Cannata, and G. Metta, "Skin spatial calibration using force/torque measurements", in Intelligent Robots and Systems (IROS), 2011

project CoDyCo, Nori et al.

floating-base robots

point-to-point movements

ipopt
min_q ½‖x_d − K(q)‖²  s.t.  q ∈ [q_min, q_max]
• quick convergence (<20 ms)
• scalability
• singularity and joint-bound handling
• task hierarchy
• complex constraints
• merges joint- and Cartesian-space trajectories

"Could you please help me with the TV set?" — "Please put those into the dishwashing machine"
The iCub puts the plates into the dishwashing machine
Trueswell, J. C., & Gleitman, L. R. (2007). Learning to parse and its implications for language acquisition. In G. Gaskell (ed.), Oxford Handbook of Psycholinguistics. Oxford: Oxford Univ. Press.
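The reaching objective handed to Ipopt earlier — minimize ½‖x_d − K(q)‖² over the joint angles — can be illustrated without the full solver. The sketch below uses damped Gauss-Newton steps on a planar two-link arm (link lengths, damping, and iteration budget are all assumptions; the real solver additionally handles joint bounds, task hierarchies, and complex constraints):

```python
import numpy as np

def fk(q, l1=0.3, l2=0.25):
    """Forward kinematics K(q) of a planar two-link arm (toy lengths)."""
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

def jac(q, l1=0.3, l2=0.25):
    """Jacobian of fk with respect to q."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def reach(x_des, q0=None, damping=1e-3, iters=200):
    """Minimize 0.5 * ||x_des - K(q)||^2 by damped Gauss-Newton steps.
    The damping keeps steps finite near singular configurations."""
    q = np.zeros(2) if q0 is None else q0.copy()
    for _ in range(iters):
        e = x_des - fk(q)
        J = jac(q)
        q = q + np.linalg.solve(J.T @ J + damping * np.eye(2), J.T @ e)
    return q

x_des = np.array([0.2, 0.35])  # a reachable target (|x_des| < l1 + l2)
q = reach(x_des)
```

Starting from q = 0 the arm is fully stretched and the Jacobian is singular; the damping term is what lets the first step go through, which is the "singularity handling" the slide advertises.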
The iCub puts the plates into the dishwashing machine: actions, objects, tools

        actions               objects               tools
learn   learning actions      learning objects      learning tools
use     recognizing actions   recognizing objects   using tools

Ranked 2nd at the Microsoft Kinect Demonstration Competition, CVPR 2012, Providence, Rhode Island

object recognition
self-supervised strategies: kinematics, motion
• human-robot interaction is a new and natural application for visual recognition
• in robotics settings strong cues are often available, so object detectors can be avoided
• recognition as a tool for complex tasks: grasping, manipulation, affordances, pose
S.R. Fanello, C. Ciliberto, L. Natale, G. Metta, "Weakly Supervised Strategies for Natural Object Recognition in Robotics", ICRA 2013
C. Ciliberto, S.R. Fanello, M. Santoro, L. Natale, G. Metta, L. Rosasco, "On the Impact of Learning Hierarchical Representations for Visual Recognition in Robotics", IROS 2013

dataset

iCubWorld dataset (2.0)
• growing dataset collecting images from a real robotic setting
• provides the community with a tool for benchmarking visual recognition systems in robotics
• 28 objects, 7 categories, 4 acquisition sessions (four different days)
• 11 Hz acquisition frequency
• ~50K images
http://www.iit.it/en/projects/data-sets.html

methods
• Sparse Coding [Yang et al., '09], learned on iCubWorld
• Overfeat [Sermanet et al., '14], learned on ImageNet

some questions
• scalability: how do the iCub's recognition capabilities degrade as we add more objects to distinguish?
• can we use assumptions on physical continuity to make recognition more stable?
• incremental learning: how does learning across multiple sessions affect the system's recognition skills?
• generalization: how well does the system recognize objects "seen" under different settings?
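One common way to exploit physical continuity — offered here as an illustration, not necessarily the authors' exact method — is to smooth per-frame labels over a short sliding window, since an object in the robot's hand cannot change identity between consecutive frames:

```python
from collections import Counter, deque

def smooth_predictions(frames, window=5):
    """Majority-vote label over a sliding window of per-frame predictions."""
    buf = deque(maxlen=window)
    out = []
    for label in frames:
        buf.append(label)
        out.append(Counter(buf).most_common(1)[0][0])
    return out

# a spurious single-frame flip gets voted away by its neighbours
raw = ["cup", "cup", "box", "cup", "cup", "cup"]
smoothed = smooth_predictions(raw)
```

Averaging the classifier's per-class scores over the window, instead of hard voting, is a natural refinement when confidences are available.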
first results
we started by addressing instance recognition

performance w.r.t. # of objects

exploiting continuity in time

incremental learning
cumulative learning over the 4 days of acquisition; tested on:
• present: test on the current day
• causal: test on the current and past days
• future: test on future days (current not included)
• independent: train & test on the current day only
[plot: accuracy (avg. over classes) vs. day 1-4 for OF present / causal / future / independent]

what about a week?
[plot: accuracy (avg. over classes) vs. day 1-7 for OF present / causal / future / independent]
