Robot Systems Integration

Total Pages: 16

File Type: PDF, Size: 1020 KB

Robot Systems Integration: Transformative Research and Robotics
Kazuhiro Kosuge
Distinguished Professor, Department of Robotics, Tohoku University
2020 IEEE Vice President-elect for Technical Activities
IEEE Fellow, JSME Fellow, SICE Fellow, RSJ Fellow, JSAE Fellow

My brief history
• March 1978: Bachelor of Engineering, Department of Control Engineering, Tokyo Institute of Technology
• March 1980: Master of Engineering, Department of Control Engineering, Tokyo Institute of Technology
• April 1980: Research staff, Department of Production Engineering, Nippondenso (now Denso) Corporation
• October 1982: Research Associate, Tokyo Institute of Technology
• July 1988: Doctor of Engineering, Tokyo Institute of Technology
• September 1989 - August 1990: Visiting Research Scientist, Department of Mechanical Engineering, Massachusetts Institute of Technology
• September 1990: Associate Professor, Faculty of Engineering, Nagoya University
• March 1995: Professor, School of Engineering, Tohoku University
• April 1997: Professor, Graduate School of Engineering, Tohoku University
• December 2018: Tohoku University Distinguished Professor

My other history
• Select Fellow, Center for Research and Development Strategy, Japan Science and Technology Agency, FY2005 - FY2011
• Review Board Member, PE7, ERC Advanced Grant, 2008, 2010, 2012, 2014, 2019
• Senior Program Officer, Research Center for Science Systems, Japan Society for the Promotion of Science, FY2007 - FY2009
• Science Officer, Research Promotion Bureau, Ministry of Education, Culture, Sports, Science and Technology, FY2010 - FY2013
• President, IEEE Robotics and Automation Society, FY2010 - FY2011
• Director & Delegate, Division X, IEEE Board of Directors, FY2015 - FY2016
  – Member, IEEE Public Visibility Committee, FY2015 - FY2016
  – Member, IEEE TAB Nominations and Appointments Committee, FY2015 - FY2016
  – Member, IEEE Ad Hoc Committee on Strategic Planning, FY2015 - FY2016
• IEEE Vice President for Technical Activities, FY2020

Outline
• Some of my research in robotics
• Robot Systems Integration
• Physical Human-Robot Interaction
  – Human-robot collaboration through interaction
  – Co-worker robot
• Universal Manipulation
  – Issues and visual servoing for program-free robots
• Conclusions

Impedance Controller Design Based on Virtual Internal Model

Cooperation of Humans for Handling an Object
• Coordination of dual arms
• Coordination of both arms

Coordination of Manipulators: Single-Master Multi-Slave System (1989)
K. Kosuge, J. Ishikawa, K. Furuta, M. Sakai, "Control of Single-Master Multi-Slave Manipulator Using VIM," Proceedings of the 1990 IEEE International Conference on Robotics and Automation, 1990, 1172-1177.
Assembly of Two Parts (1994)
K. Kosuge, H. Yoshida, T. Fukuda, M. Sakai, K. Kanitani, K. Hariki, "Unified Control for Dynamic Cooperative Manipulation," Proceedings of the 1994 IEEE/RSJ International Workshop on Intelligent Robots and Systems, 1994, 1042-1047.

Bilateral Feedback of Master-Slave Manipulator Systems
• Ordinary bilateral feedback
• Passivity-based realization of bilateral feedback

Segment Assembly System (1996)
Segment assembly system for a tunnel shield machine.
K. Kosuge, K. Takeo, D. Taguchi, T. Fukuda, H. Murakami, "Task-Oriented Force Control of Parallel Link Robot for the Assembly of Segments of a Shield Tunnel Excavation System," IEEE/ASME Transactions on Mechatronics, 1(3), (1996), 250-258.

Parts-Mating Theory (2001)
K. Kosuge, M. Shimizu, "Planar Parts-mating Using Structured Compliance," Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, (2001), 1477-1482.
M. Shimizu, K. Kosuge, "An Admittance Design Method for General Spatial Parts Mating," Proceedings of the 2004 IEEE International Conference on Robotics and Automation, (2004), 3571-3576.

Robot System for Dish Washing Machine (March 2009)
K. Kosuge, Y. Hirata, J. Lee, A. Kawamura, K. Hashimoto, S. Kagami, Y. Hayashi, N. Yokoshima, H. Miyazawa, R. Teranaka, Y. Natsuizaka, K. Sakai, "Development of an Automatic Dishwashing Robot System," Proceedings of the 2009 International Conference on Mechatronics and Automation, (2009), 43-48.

Cooperation of Humans for Handling an Object
• Coordination of multiple humans

Multiple Mobile Manipulator Coordination (2001)
Y. Kume, Y. Hirata, Z. D. Wang, K. Kosuge, "Decentralized Control of Multiple Mobile Manipulators Handling a Single Object in Coordination," Proceedings of the 2002 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2002, 2758-2763.
Y. Hirata, Y. Kume, Z. D. Wang, K. Kosuge, "Decentralized Control of Multiple Mobile Manipulators Based on Virtual 3-D Caster Motion for Handling an Object in Cooperation with a Human," Proceedings of the 2003 IEEE International Conference on Robotics and Automation, 2003, 938-943.

Cooperation of Mobile Dual Manipulators (2003)
Y. Hirata, Y. Kume, T. Sawada, Z. D. Wang, K. Kosuge, "Handling of an Object by Multiple Mobile Manipulators in Coordination Based on Caster-like Dynamics," Proceedings of the 2004 IEEE International Conference on Robotics and Automation, 2004, 807-812.

Mechanical Parking Systems
• Elevator parking systems, conveyor parking systems, shuttle parking systems
• Users are required to position their cars in a narrow space.
• A parking system is required to have a caretaker.
• Each driver is required to park his/her car precisely in a narrow space, which is not easy for a novice driver.
iCART Concept: Intelligent Cooperative Autonomous Robot Transporters
M. Endo, K. Hirose, Y. Hirata, K. Kosuge, T. Kanbayashi, M. Oomoto, K. Akune, H. Arai, H. Shinoduka, K. Suzuki, "A Car Transportation System by Multiple Mobile Robots - iCART -," Proceedings of the 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2008, 2795-2801.

Demonstration of the iCART II Concept
K. Kashiwazaki, K. Kosuge, Y. Hirata, Y. Sugahara, T. Kanbayashi, K. Suzuki, K. Murakami, K. Nakamura, "Cooperative Transportation Control in Consideration of Not Only Internal Force but Also External Force Applied to MRWheel," Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics, (2012), 1867-1873.
Demonstration in Germany: https://www.youtube.com/watch?v=Gnypt72F20Q

Outline
• Some of my research in robotics
• Robot Systems Integration
• Physical Human-Robot Interaction
  – Human-robot collaboration through interaction
  – Co-worker robot
• Universal Manipulation
  – Issues and visual servoing for program-free robots
• Conclusions

Robotics and Societal Values (CRDS, JST, 2009; modified by Kosuge, August 2011)
• Societal level: societal values
• Service level: services and service enablers
• Fundamental technologies level: foundations

Societal Values (CRDS, JST, 2009; modified by Kosuge, August 2011)
• For individuals: quality of life
• For communities (families, industries, local governments, nations): industrial competitiveness
• For the globe: global issues

Challenges and Opportunities in Robotics
• Social value: global level, community level, quality of life
• Robotics-oriented services/applications: government, environmental monitoring, utilities, medicine, security services, natural resources exploration, retail/wholesale, therapy, mobility, agriculture, transportation, daily-life assistance, shopping, research and development, forestry, communication, healthcare, hobbies, space exploration, fishery services, service industries, rehabilitation, entertainment, mining, deep undersea and underground exploration, mental care, sports, manufacturing, education, learning, comfortable life, construction, anti-terrorism, rescue operations, child care, watching, waste treatment/management, prevention of infectious diseases, housekeeping
• Foundations and emerging technology: cyborgs (cybernetic organisms), software frameworks, stochasticity in robotics, social concerns, performance evaluation and benchmarking, functional safety, ambient intelligence, nano/micro robotics, autonomous robots, human modeling, teleoperation, wearable technology
Recommended publications
  • Mobile Robot Kinematics
    Mobile Robot Kinematics. We're going to start talking about our mobile robots now. These robots differ from our arms in two ways: they have sensors, and they can move themselves around. Because their movement is so different from the arms, we will need to talk about a new style of kinematics: differential drive.
    1. Differential drive is how many mobile wheeled robots locomote.
    2. Differential drive robots typically have two powered wheels, one on each side of the robot. Sometimes there are other passive wheels that keep the robot from tipping over.
    3. When both wheels turn at the same speed in the same direction, the robot moves straight in that direction.
    4. When one wheel turns faster than the other, the robot turns in an arc toward the slower wheel.
    5. When the wheels turn in opposite directions, the robot turns in place.
    6. We can formally describe the robot behavior as follows:
    (a) If the robot is moving in a curve, there is a center of that curve at that moment, known as the Instantaneous Center of Curvature (or ICC). We talk about the instantaneous center because we'll analyze this at each instant; the curve may, and probably will, change in the next moment.
    (b) If $r$ is the radius of the curve (measured to the middle of the robot) and $l$ is the distance between the wheels, then the rate of rotation $\omega$ around the ICC is related to the velocities of the wheels by $\omega(r + l/2) = v_r$ and $\omega(r - l/2) = v_l$. Why? The angular velocity is defined as the positional velocity divided by the radius: $\omega = \frac{d\theta}{dt} = \frac{v}{r}$. This should make some intuitive sense: the farther you are from the center of rotation, the faster you need to move to get the same angular velocity.
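The wheel-speed relations quoted in the excerpt above can be solved for the rotation rate and the turn radius. A minimal Python sketch (function and variable names are mine, not the course notes'):

```python
import math

def diff_drive_icc(v_l, v_r, l):
    """Given left/right wheel speeds v_l, v_r and wheel separation l,
    return (omega, R): the rotation rate about the ICC and the signed radius
    of the curve measured to the midpoint between the wheels.
    Derived from omega*(R + l/2) = v_r and omega*(R - l/2) = v_l."""
    if math.isclose(v_l, v_r):
        return 0.0, math.inf                       # straight-line motion, no ICC
    omega = (v_r - v_l) / l                        # subtract the two relations
    R = (l / 2.0) * (v_r + v_l) / (v_r - v_l)      # add them, then divide by omega
    return omega, R

# Example: right wheel faster, so the robot arcs toward the slower (left) wheel.
print(diff_drive_icc(v_l=0.8, v_r=1.0, l=0.5))     # omega = 0.4 rad/s, R = 2.25 m
```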
  • 6D Image-Based Visual Servoing for Robot Manipulators with Uncalibrated Stereo Cameras
    6D Image-based Visual Servoing for Robot Manipulators with Uncalibrated Stereo Cameras
    Caixia Cai (1), Emmanuel Dean-León (2), Nikhil Somani (1), Alois Knoll (1)
    Abstract: This paper introduces 6 new image features to provide a solution to the open problem of uncalibrated 6D image-based visual servoing for robot manipulators, where the goal is to control the 3D position and orientation of the robot end-effector using visual feedback. One of the main contributions of this article is a novel stereo camera model which employs virtual orthogonal cameras to map 6D Cartesian poses defined in the Task space to 6D visual poses defined in a Virtual Visual space (Image space). This new model is used to compute a full-rank square Image Jacobian matrix ($J_{img}$), which solves several common problems exhibited by the classical image Jacobians, e.g., Image space singularities and local minima. This Jacobian is a fundamental key for the image-based controller design, where a chattering-free adaptive second order sliding mode is employed to track 6D visual motions.
    A. Related work: An IBVS usually employs the image Jacobian matrix ($J_{img}$) to relate end-effector velocities in the manipulator's Task space to the feature parameter velocities in the feature (image) space. A full and comprehensive survey on Visual Servoing and image Jacobian definitions can be found in [1], [3], [4] and more recently in [5]. In general, the classical image Jacobian is defined using a set of image feature measurements (usually denoted by s), and it describes how image features change when the robot manipulator pose changes: $\dot{s} = J_{img} v$. In Visual Servoing the image Jacobian needs to be calculated or estimated.
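For context, the relation $\dot{s} = J_{img} v$ quoted above is typically inverted in a control law. The sketch below is the classical image-based servoing step v = -gain * pinv(J_img) (s - s*), not the paper's uncalibrated 6D virtual-camera formulation, and the feature values are made up:

```python
import numpy as np

def ibvs_velocity(s, s_star, J_img, gain=0.5):
    """Classical image-based visual servoing step: drive the feature error
    e = s - s* toward zero with the commanded twist
    v = -gain * pinv(J_img) @ e, based on the relation s_dot = J_img @ v."""
    e = s - s_star
    return -gain * np.linalg.pinv(J_img) @ e

# Toy example with made-up numbers: 6 features, 6-DOF end-effector twist.
rng = np.random.default_rng(0)
J_img = rng.standard_normal((6, 6))      # would normally come from the camera model
s, s_star = rng.standard_normal(6), np.zeros(6)
print(ibvs_velocity(s, s_star, J_img))   # commanded twist [vx, vy, vz, wx, wy, wz]
```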
  • Abbreviations and Glossary
    Appendix A: Abbreviations and Glossary. Abbreviations are defined and the mathematical symbols and notations used in this book are specified. Furthermore, the random number generator used in this book is referenced.
    A.1 Abbreviations: arccos, arccosine; BiRRT, bidirectional rapidly growing random tree; C-space, configuration space; DH, Denavit-Hartenberg; DLR, German Aerospace Center; DOF, degree of freedom; FFT, fast Fourier transformation; IK, inverse kinematics; HRI, human-robot interface; LWR, light weight robot; MMI, Institute of Man-Machine Interaction; PCA, principal component analysis; PRM, probabilistic road map; RRT, rapidly growing random tree; rulaCapMap, RULA-restricted capability map; RULA, rapid upper limb assessment; SFE, shape fit error; TCP, tool center point; OV, workspace overlap.
    A.2 Mathematical Symbols: $C$, configuration space; $K(q)$, direct kinematics; $H$, set of all homogeneous matrices; $W_R$, reachable workspace; $W_D$, dexterous workspace; $W_V$, versatile workspace; $F(R,x)$, function that maps to a homogeneous matrix; $V_{Robot}$, voxel space for the robot arm; $V_{Human}$, voxel space for the human arm; $P$, set of points on the sphere; $N_p$, set of point indices for the points on the sphere; $N_o$, set of orientation indices; $O_S$, set of all homogeneous frames distributed on a sphere; $M_S$, capability map.
    A.3 Mathematical Notations: $a$, scalar value; $\mathbf{a}$, vector; $\mathbf{a}^T$, vector transposed; $A$, matrix; $A^T$, matrix transposed; $\langle a, b \rangle$, inner product; $SO(3)$, group of rotation matrices in $\mathbb{R}^{3 \times 3}$, $SO(3) := \{ R \in \mathbb{R}^{3 \times 3} \mid R R^T = I, \det R = +1 \}$; $SE(3) := \mathbb{R}^3 \times SO(3)$; ${}^{A}T_{B}$, reference frame B given in coordinates of reference frame A; $\lceil a \rceil$, ceiling function; $\lfloor a \rfloor$, floor function.
    A.4 Random Sampling: In this book, the drawing of random samples is often used.
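The SO(3)/SE(3) definitions in the glossary can be checked numerically. A small sketch with helper names of my own choosing:

```python
import numpy as np

def is_SO3(R, tol=1e-9):
    """Check the glossary's definition SO(3) = {R in R^{3x3} | R R^T = I, det R = +1}."""
    return (R.shape == (3, 3)
            and np.allclose(R @ R.T, np.eye(3), atol=tol)
            and np.isclose(np.linalg.det(R), 1.0, atol=tol))

def to_SE3(R, x):
    """Pack (x, R) in R^3 x SO(3) into a 4x4 homogeneous transformation matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = x
    return T

Rz = np.array([[0.0, -1.0, 0.0],    # rotation of +90 degrees about the z axis
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
print(is_SO3(Rz))                               # True
print(to_SE3(Rz, np.array([1.0, 2.0, 3.0])))    # homogeneous transform
```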
  • Pipeline Following by Visual Servoing for Autonomous Underwater Vehicles
    Pipeline Following by Visual Servoing for Autonomous Underwater Vehicles
    Guillaume Allibert (a,d), Minh-Duc Hua (a), Szymon Krupiński (b), Tarek Hamel (a,c)
    (a) Université Côte d'Azur, CNRS, I3S, France. Emails: allibert(thamel; hua)@i3s.unice.fr. (b) Cybernetix, Marseille, France. Email: [email protected]. (c) Institut Universitaire de France, France. (d) Corresponding author.
    Abstract: A nonlinear image-based visual servo control approach for pipeline following of fully-actuated Autonomous Underwater Vehicles (AUV) is proposed. It makes use of the binormalized Plücker coordinates of the pipeline borders detected in the image plane as feedback information, while the system dynamics are exploited in a cascade manner in the control design. Unlike conventional solutions that consider only the system kinematics, the proposed control scheme accounts for the full system dynamics in order to obtain an enlarged provable stability domain. Control robustness with respect to model uncertainties and external disturbances is reinforced using integral corrections. Robustness and efficiency of the proposed approach are illustrated via both realistic simulations and experimental results on a real AUV. Keywords: AUV, pipeline following, visual servoing, nonlinear control.
    1. Introduction: Underwater pipelines are widely used for transportation of oil, gas or other fluids from production sites to distribution sites. Laid down on the ocean floor, they are often subject to extreme conditions (temperature, pressure, humidity, sea current, vibration...). … control Repoulias and Papadopoulos (2007); Aguiar and Pascoal (2007); Antonelli (2007) and Lyapunov model-based control Refsnes et al. (2008); Smallwood and Whitcomb (2004) mostly concern the pre-programmed trajectory tracking problem with little regard to the local topography of the environment.
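As background for the line features used above, a 3-D line can be encoded by normalized Plücker-style coordinates built from a point on the line and its direction. The sketch below uses one common convention (unit direction plus unit moment); the paper's exact binormalized definition may differ in detail:

```python
import numpy as np

def normalized_pluecker(p1, p2):
    """One common normalized Plücker-style representation of the line through
    p1 and p2 (expressed in the camera/vehicle frame): the unit direction h and
    the unit moment n = (p1 x h) / |p1 x h|, i.e. the unit normal of the plane
    spanned by the line and the frame origin. Illustrative convention only,
    not necessarily the exact one used in the paper above."""
    h = (p2 - p1) / np.linalg.norm(p2 - p1)
    m = np.cross(p1, h)
    n = m / np.linalg.norm(m)      # undefined if the line passes through the origin
    return h, n

h, n = normalized_pluecker(np.array([1.0, 0.0, 2.0]), np.array([1.0, 3.0, 2.0]))
print(h, n)    # direction along +y, plane normal lying in the x-z plane
```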
  • arXiv:1902.05947v1 [cs.CV] 18 Feb 2019
    DIViS: Domain Invariant Visual Servoing for Collision-Free Goal Reaching
    Fereshteh Sadeghi, University of Washington
    Figure 1: Domain Invariant Visual Servoing (DIViS) learns collision-free goal reaching entirely in simulation using dense multi-step rollouts and a recurrent fully convolutional neural network (bottom). DIViS can directly be deployed on real physical robots with RGB cameras for servoing to visually indicated goals as well as semantic object categories (top).
    Abstract: Robots should understand both semantics and physics to be functional in the real world. While robot platforms provide means for interacting with the physical world, they cannot autonomously acquire object-level semantics without needing humans. In this paper, we investigate how to minimize human effort and intervention to teach robots to perform real-world tasks that incorporate semantics. We study this question in the context of visual servoing of mobile robots and propose DIViS, a Domain Invariant policy learning approach for collision-free Visual Servoing. DIViS incorporates high-level semantics from previously collected static human-labeled datasets and learns collision-free servoing entirely in simulation and without any real robot data.
    Figure 2: (a) The classic 1995 visual servoing robot [46, 15]. The image at final position (b) was given as the goal and the robot was started from an initial view of (c).
    1. Introduction: Perception and mobility are the two key capabilities that enable animals and humans to perform complex tasks such
  • Nyku: a Social Robot for Children with Autism Spectrum Disorders
    University of Denver, Digital Commons @ DU, Electronic Theses and Dissertations, Graduate Studies, 2020
    Nyku: A Social Robot for Children With Autism Spectrum Disorders
    Dan Stephan Stoianovici, University of Denver
    Follow this and additional works at: https://digitalcommons.du.edu/etd
    Part of the Disability Studies Commons, Electrical and Computer Engineering Commons, and the Robotics Commons.
    Recommended Citation: Stoianovici, Dan Stephan, "Nyku: A Social Robot for Children With Autism Spectrum Disorders" (2020). Electronic Theses and Dissertations. 1843. https://digitalcommons.du.edu/etd/1843
    This Thesis is brought to you for free and open access by the Graduate Studies at Digital Commons @ DU. It has been accepted for inclusion in Electronic Theses and Dissertations by an authorized administrator of Digital Commons @ DU. For more information, please contact [email protected], [email protected].
    A Thesis Presented to the Faculty of the Daniel Felix Ritchie School of Engineering and Computer Science, University of Denver, in Partial Fulfillment of the Requirements for the Degree Master of Science, by Dan Stoianovici, August 2020. Advisor: Dr. Mohammad H. Mahoor. © Copyright by Dan Stoianovici 2020. All Rights Reserved.
    Author: Dan Stoianovici. Title: Nyku: A Social Robot for Children with Autism Spectrum Disorders. Advisor: Dr. Mohammad H. Mahoor. Degree Date: August 2020.
    Abstract: The continued growth of Autism Spectrum Disorders (ASD) around the world has spurred a growth in new therapeutic methods to increase the positive outcomes of an ASD diagnosis. It has been agreed that the early detection and intervention of ASD disorders leads to greatly increased positive outcomes for individuals living with the disorders.
  • A Review of Parallel Processing Approaches to Robot Kinematics and Jacobian
    Technical Report 10/97, University of Karlsruhe, Computer Science Department, ISSN 1432-7864
    A Review of Parallel Processing Approaches to Robot Kinematics and Jacobian
    Dominik Henrich, Joachim Karl and Heinz Wörn, Institute for Real-Time Computer Systems and Robotics, University of Karlsruhe, D-76128 Karlsruhe, Germany. E-mail: [email protected]
    Abstract: Due to continuously increasing demands in the area of advanced robot control, it became necessary to speed up the computation. One way to reduce the computation time is to distribute the computation onto several processing units. In this survey we present different approaches to parallel computation of robot kinematics and Jacobian. Thereby, we discuss both the forward and the reverse problem. We introduce a classification scheme and classify the references by this scheme. Keywords: parallel processing, Jacobian, robot kinematics, robot control.
    1 Introduction: Due to continuously increasing demands in the area of advanced robot control, it became necessary to speed up the computation. Since it should be possible to control the motion of a robot manipulator in real-time, it is necessary to reduce the computation time to less than the cycle rate of the control loop. One way to reduce the computation time is to distribute the computation over several processing units. There are other overviews and reviews on parallel processing approaches to robotic problems. Earlier overviews include [Lee89] and [Graham89]. Lee takes a closer look at parallel approaches in [Lee91]. He tries to find common features in the different problems of kinematics, dynamics and Jacobian computation. The latest summary is from Zomaya et al. [Zomaya96].
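To illustrate the basic idea the survey starts from, distributing kinematics computations over several processing units, here is a generic sketch (not one of the surveyed schemes) that evaluates planar 2-link forward kinematics for a batch of joint configurations with a Python process pool:

```python
import math
from multiprocessing import Pool

L1, L2 = 0.4, 0.3   # hypothetical link lengths [m]

def forward_kinematics(q):
    """Planar 2-link forward kinematics: joint angles (q1, q2) -> tool position (x, y)."""
    q1, q2 = q
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

if __name__ == "__main__":
    # A batch of joint configurations, e.g. samples along a trajectory.
    configs = [(0.01 * i, 0.02 * i) for i in range(1000)]
    with Pool(processes=4) as pool:              # distribute over 4 worker processes
        positions = pool.map(forward_kinematics, configs)
    print(positions[:3])
```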
  • Acknowledgements
    Acknowledgements
    B.21 Actuators for Soft Robotics, by Alin Albu-Schäffer, Antonio Bicchi: The authors of this chapter have used liberally of work done by a group of collaborators involved in the EU projects PHRIENDS, VIACTORS, and SAPHARI. We want to particularly thank Etienne Burdet, Federico Carpi, Manuel Catalano, Manolo Garabini, Giorgio Grioli, Sami Haddadin, Dominic Lacatos, Can Özparpucu, Florian Petit, Joshua Schultz, Nikos Tsagarakis, Bram Vanderborght, and Sebastian Wolf for their substantial contributions to this chapter and the work behind it.
    F.58 Robotics in Hazardous Applications, by James Trevelyan, William Hamel, Sung-Chul Kang: James Trevelyan acknowledges Surya Singh for detailed suggestions on the original draft, and would also like to thank the many unnamed mine clearance experts who have provided guidance and comments over many years, as well as Prof. S. Hirose, Scanjack, Way Industry, Japan Atomic Energy Agency, and Total Marine Systems for providing photographs. William R. Hamel would like to acknowledge the US Department of Energy's Robotics Crosscutting Program and all of his colleagues at the national laboratories and universities for many years of dealing with remote hazardous operations, and all of his collaborators at the Field Robotics Center at Carnegie Mellon University, particularly James Osborn, who were pivotal in developing ideas for future telerobots. Sungchul Kang acknowledges Changhyun Cho, Woosub Lee, Dongsuk Ryu at KIST (Korean Institute for Science and Technology), Korea for their provid-
    C.29 Inertial Sensing, GPS and Odometry, by Gregory Dudek, Michael Jenkin: We would like to thank Sarah Jenkin for her help with the figures.
    D.36 Motion for Manipulation Tasks, by James Kuffner, Jing Xiao: We acknowledge the contribution that the authors of the first edition made to this chapter revision, particularly Sect.
  • Forward and Inverse Kinematics Analysis of Denso Robot
    Proceedings of the International Symposium of Mechanism and Machine Science, 2017, AzC IFToMM - Azerbaijan Technical University, 11-14 September 2017, Baku, Azerbaijan
    Forward and Inverse Kinematics Analysis of Denso Robot
    Mehmet Erkan Kütük (1*), Memik Taylan Daş (2), Lale Canan Dülger (1)
    (1*) Mechanical Engineering Department, University of Gaziantep, Gaziantep, Turkey. E-mail: [email protected]. (2) Mechanical Engineering Department, University of Kırıkkale.
    Abstract: A forward and inverse kinematic analysis of a 6-axis DENSO robot with a closed-form solution is performed in this paper. The Robotics Toolbox provides great simplicity in dealing with the kinematics of robots through its ready-made functions. However, making the calculations in the traditional way is important for mastering kinematics, which is one of the main topics of robotics. The Robotics Toolbox in Matlab® is used to model the Denso robot system. GUI studies including the Robotics Toolbox are given with simulation examples. Keywords: Robot Kinematics, Simulation, Denso Robot, Robotic Toolbox, GUI.
    1. Introduction: … used Robotic Toolbox in forward kinematics analysis of an industrial robot [4]. This study includes the kinematics of a robot arm which is available at Gaziantep University, Mechanical Engineering Department, Mechatronics Lab. Forward and inverse kinematics analyses are performed. The Robotics Toolbox is also applied to model the Denso robot system. A GUI is built for practical use of the robotic system.
    2. Robot Arm Kinematics: Robot kinematics can be categorized into two main parts: forward and inverse kinematics. The forward kinematics problem is not difficult to perform, and there is no complexity in deriving the equations, in contrast to the inverse kinematics. Nonlinear equations especially make the inverse kinematics problem complex.
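To make the forward-kinematics step concrete, here is a plain-Python sketch of Denavit-Hartenberg forward kinematics for a generic 6-axis arm; the DH table is a hypothetical placeholder, not the Denso parameters used in the paper:

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(q, dh_table):
    """Chain the link transforms for joint angles q and a DH table of
    (d, a, alpha) rows; returns the 4x4 base-to-tool transform."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(q, dh_table):
        T = T @ dh_matrix(theta, d, a, alpha)
    return T

# Hypothetical 6-axis DH parameters (d, a, alpha) -- placeholders, not Denso's.
DH = [(0.125, 0.0,  -np.pi / 2),
      (0.0,   0.21,  0.0),
      (0.0,   0.075, -np.pi / 2),
      (0.21,  0.0,   np.pi / 2),
      (0.0,   0.0,  -np.pi / 2),
      (0.07,  0.0,   0.0)]

q_home = np.zeros(6)
print(forward_kinematics(q_home, DH))   # 4x4 homogeneous pose of the tool frame
```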
  • Real-Time Vision, Tracking and Control
    Proceedings of the 2000 IEEE International Conference on Robotics & Automation, San Francisco, CA, April 2000
    Real-Time Vision, Tracking and Control
    Peter I. Corke, CSIRO Manufacturing Science & Technology, Pinjarra Hills, Australia 4069, [email protected]
    Seth A. Hutchinson, Beckman Institute for Advanced Technology, University of Illinois at Urbana-Champaign, Urbana, Illinois, USA 61801, [email protected]
    Abstract: This paper, which serves as an introduction to the mini-symposium on Real-Time Vision, Tracking and Control, provides a broad sketch of visual servoing, the application of real-time vision, tracking and control for robot guidance. It outlines the basic theoretical approaches to the problem, describes a typical architecture, and discusses major milestones, applications and the significant vision sub-problems that must be solved.
    1 Introduction: Visual servoing is a maturing approach to the control of robots in which tasks are defined visually, rather than in terms of previously taught Cartesian coordinates. … considered the fusion of computer vision, robotics and control, and has been a distinct field for over 10 years, though the earliest work dates back close to 20 years. Over this period several major, and well understood, approaches have evolved and been demonstrated in many laboratories around the world. Fairly comprehensive overviews of the basic approaches, current applications, and open research issues can be found in a number of recent sources, including [1-4]. The next section, Section 2, describes three basic approaches to visual servoing. Section 3 provides a 'walk around' the main functional blocks in a typical visual servoing system. Some major milestones and proposed applications are discussed in Section 4. Section 5 then expands on the various vision sub-problems that must be solved for the different approaches to visual servo-
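The "walk around" of the main functional blocks described above maps naturally onto a control loop. The skeleton below is a generic illustration; the callables stand in for a real camera driver, feature tracker, and robot interface and are not from the paper:

```python
import numpy as np

def visual_servo_loop(grab_image, extract_features, interaction_matrix,
                      send_velocity, s_star, gain=0.5, tol=1e-3, max_iters=1000):
    """Skeleton of a typical visual servoing loop: sense -> track features ->
    compute error -> control law -> command the robot. The callables are
    placeholders for the vision and robot-interface blocks discussed above."""
    for _ in range(max_iters):
        image = grab_image()                    # real-time vision block
        s = extract_features(image)             # tracking block
        e = s - s_star                          # feature error
        if np.linalg.norm(e) < tol:
            break
        L = interaction_matrix(s)               # image Jacobian / interaction matrix
        v = -gain * np.linalg.pinv(L) @ e       # classical image-based control law
        send_velocity(v)                        # robot/actuation block

# Tiny simulated demo: the "features" converge toward s_star under the same law.
state = {"s": np.array([0.4, -0.2])}
L_const = np.eye(2)
visual_servo_loop(
    grab_image=lambda: None,
    extract_features=lambda img: state["s"],
    interaction_matrix=lambda s: L_const,
    send_velocity=lambda v: state.update(s=state["s"] + 0.1 * L_const @ v),
    s_star=np.zeros(2))
print(state["s"])   # close to the goal features
```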
  • A Self-Triggered Position Based Visual Servoing Model Predictive Control Scheme for Underwater Robotic Vehicles †
    machines, Article
    A Self-triggered Position Based Visual Servoing Model Predictive Control Scheme for Underwater Robotic Vehicles †
    Shahab Heshmati-alamdari (1), Alina Eqtami (2), George C. Karras (3,4), Dimos V. Dimarogonas (1) and Kostas J. Kyriakopoulos (4,*)
    1 Division of Decision and Control Systems, School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, Stockholm, Sweden; [email protected] (S.H.-a.); [email protected] (D.V.D.)
    2 Laboratoire des Signaux et Systèmes (L2S), CNRS, CentraleSupélec, Université Paris-Sud, Université Paris-Saclay, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette Cedex, France; [email protected]
    3 Dept. of Computer Science and Telecommunications, University of Thessaly, 3rd Km Old National Road Lamia-Athens, 35100 Lamia, Greece; [email protected]
    4 Control Systems Laboratory, School of Mechanical Engineering, National Technical University of Athens, 15780 Athens, Greece
    * Correspondence: [email protected]
    † This paper is an extended version of our paper published in Shahab Heshmati-Alamdari, Alina Eqtami, George C. Karras, Dimos V. Dimarogonas, and Kostas J. Kyriakopoulos, "A Self-triggered Visual Servoing Model Predictive Control Scheme for Under-actuated Underwater Robotic Vehicles," in Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 31 May–7 June 2014.
    Received: 1 May 2020; Accepted: 7 June 2020; Published: 11 June 2020
    Abstract: An efficient position based visual servoing control approach for Autonomous Underwater Vehicles (AUVs) by employing Non-linear Model Predictive Control (N-MPC) is designed and presented in this work. In the proposed scheme, a mechanism is incorporated within the vision-based controller that determines when the Visual Tracking Algorithm (VTA) should be activated and new control inputs should be calculated.
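The self-triggering idea described in the abstract can be illustrated with a toy loop in which the expensive update (standing in for the vision plus N-MPC computation) runs only when the deviation between the actual and the nominal predicted state exceeds a threshold; the model, trigger condition, and numbers below are illustrative, not the paper's:

```python
import numpy as np

def self_triggered_loop(x0, goal, steps=50, threshold=0.2, dt=0.1):
    """Toy self-triggered loop for a single-integrator 'vehicle': recompute the
    control (stand-in for the vision + N-MPC update) only when the actual state
    drifts from the nominal prediction by more than a threshold."""
    x = np.array(x0, dtype=float)
    x_pred = x.copy()                 # nominal state predicted at the last trigger
    u = np.zeros_like(x)
    triggers = 0
    for _ in range(steps):
        if triggers == 0 or np.linalg.norm(x - x_pred) > threshold:
            u = 0.8 * (goal - x)      # stand-in for solving the N-MPC problem
            x_pred = x.copy()
            triggers += 1
        disturbance = 0.05 * np.random.randn(*x.shape)
        x = x + dt * u + disturbance  # plant evolves between triggers
        x_pred = x_pred + dt * u      # disturbance-free nominal prediction
    return x, triggers

final_state, n_updates = self_triggered_loop(x0=[1.0, -1.0], goal=np.array([0.0, 0.0]))
print(final_state, n_updates)   # typically near the goal with far fewer than 50 updates
```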
  • IEEE Transactions on Robotics (T-RO) Editorial Board Listing
    IEEE Transactions on Robotics (T-RO) Editorial Board Listing
    Status by February 1, 2006
    Senior Editors
    Dr. Hirohiko Arai, Editor. Intelligent Systems Institute, National Institute of Advanced Industrial Science & Technology, AIST Tsukuba East, 1-2-1 Namiki, Tsukuba, Ibaraki 305-8564, Japan. P: +81 298 61 7088, F: +81 298 61 7201, E: [email protected]
    Primary Areas: Robot Control, Manipulators, Dynamics, Kinematics, Force/Impedance/Compliance, Underactuated Mechanisms, Nonholonomic Systems. Secondary Areas: Human-Robot Cooperation, Humanoids, Walking Robots, Telerobotics, Industrial Robots.
    Professor George A. Bekey, Founding Editor. Department of Computer Science, University of Southern California, Los Angeles, CA 90089-0782, USA. P: +1 213 740 4501, F: +1 213 740 7512, E: [email protected]
    Primary Areas: Mobile Robots, Autonomy, Walking Machines, Robot Intelligence, Multi-Robot Cooperation. Secondary Areas: Medical Robotics and Personal Robots.
    Professor Alessandro De Luca, Editor-in-Chief. Dipartimento di Informatica e Sistemistica, Università di Roma "La Sapienza", Via Eudossiana 18, 00184 Roma, Italy. P: +39 06 44585 371, F: +39 06 44585 367, E: [email protected], U: http://www.dis.uniroma1.it/labrob
    Primary Areas: Learning, Force/Impedance/Impact Control, Flexible Manipulators, Nonholonomic Robots, Nonlinear Control, Redundant Manipulators, Underactuated Mechanisms. Secondary Areas: Kinematics, Dynamics, Mobile Robots, Motion/Path Planning, Obstacle Avoidance, Space Robots.
    Professor Peter B. Luh, Editor Emeritus. Department