THE DEVELOPMENT OF A VISUAL SYSTEM FOR MANTISBOT:
A ROBOT MODELED AFTER THE PRAYING MANTIS

by

ANDREW PAUL GETSY

Submitted in partial fulfillment of the requirements for the degree of
Master of Science

Department of Mechanical and Aerospace Engineering

CASE WESTERN RESERVE UNIVERSITY

August 2016


CASE WESTERN RESERVE UNIVERSITY
SCHOOL OF GRADUATE STUDIES

We hereby approve the dissertation of Andrew Paul Getsy, candidate for the degree of Master of Science*.

Committee Chair: Dr. Roger D. Quinn
Committee Member: Dr. Roy E. Ritzmann
Committee Member: Dr. Clare Rimnac

Date of Defense: July 5, 2016

*We also certify that written approval has been obtained for any proprietary material contained therein.


Contents

List of Tables
List of Figures
Acknowledgments
Abstract

1 Introduction
2 Background
  2.1 Biologically Inspired Robots as Research Platforms
  2.2 An Overview of Mantis Vision as It Relates to Hunting
    2.2.1 Prey Fixation
    2.2.2 Prey Size and Speed
    2.2.3 Prey Distance
    2.2.4 Striking at Prey
3 Additions to the MantisBot Electrical System
  3.1 Strain Gage Circuit Design
  3.2 Vision System Electrical Design
4 MantisBot Mechanical Design Changes for the Vision System
  4.1 Body Segment Design
  4.2 Added Components
  4.3 Vision System Mechanical Design
5 Vision System Calibration Testing
6 MantisBot Control Scheme
7 MantisBot Vision System Testing
8 Conclusion and Future Work
  8.1 Conclusion
  8.2 Future Work
A Mini-MantisBot: A Testbed for Potential Vision Systems


List of Tables

4.1 Relevant servo motor specifications for the Dynamixel AX-12A, MX-64T, and MX-106T [ROBOTIS, 2010a], [ROBOTIS, 2010c], [ROBOTIS, 2010b].
8.1 Comparison of important features between the three discussed vision systems (1 Pi Camera with 360 lens, 5 voltaic panels, and 1 Pixy sensor).


List of Figures

2.1 Labeled robots: Robot I, Robot II, and Robot III from Roger Quinn's research group (photograph courtesy of biorobots.cwru.edu). Labeled robot: HECTOR robot from the University of Bielefeld (photograph courtesy of http://www.botmag.com/).
2.2 Top: Hermes robot (photograph taken from [Arkin et al., 2000]). Bottom: MantisBot robot.
2.3 Schematic drawing showing the peering movement used to determine the distance of stationary objects. This motion involves swaying the prothorax while keeping the head looking straight forward.
3.1 Photograph of MantisBot with labeled strain gages and mounted strain gage circuits. Note: two strain gages are hidden behind the body, located on the other mid- and hind-legs.
3.2 Top: Materials used during new strain gage circuit board testing. From left to right: individual strain gage attached to a test femur, connected to a strain gage circuit board, connected to an Arbotix-M microcontroller. Bottom: Old strain gage circuits built onto a single piece of perfboard. Photograph taken from [Chrzanowski, 2015].
3.3 Top: Old circuit diagram for strain gage readings: a Wheatstone bridge paired with a differential amplifier that uses potentiometers for circuit tuning. Photograph taken from [Chrzanowski, 2015]. Bottom: New circuit diagram for strain gage readings: a Wheatstone bridge paired with a differential amplifier that uses voltage followers for circuit stability.
3.4 Chosen microcontroller, the Arbotix-M, with labeled analog inputs. Photograph courtesy of http://learn.trossenrobotics.com/.
3.5 3D rendering of the vision system structure (green) with mounted solar panels (blue).
3.6 Top: Vision system circuit with labeled solar panel inputs, operational amplifiers, and LED circuit. Bottom: Labeled flow chart of circuit functionality.
3.7 Schematic layout of the head sensor circuit (Part I).
3.8 Continued: Schematic layout of the head sensor circuit (Part II).
4.1 Left: Praying mantis with labeled prothorax and mesothorax body segments and rearing and yawing body joints (photograph courtesy of http://www.rentokil.co.za/). Right: 3D-rendered MantisBot with labeled prothorax and mesothorax body segments, rearing motor, and yawing motor.
4.2 Left: Original prothorax body segment of MantisBot. Right: New prothorax body segment of MantisBot after the reduction in overall length.
4.3 Left: Original rearing mechanism (L3). Right: New rearing mechanism (L4) that replaced L3, listing general segment lengths.
4.4 Top: The torque of the rearing motor and the head's range of motion as a function of the rearing motor angle, assuming a constant worst-case load and an L4 length of 8.3 cm (3.27 in). Bottom: The torque of the rearing motor and the head's range of motion as a function of the rearing motor angle, assuming a constant worst-case load and an L4 length of 16.6 cm (6.54 in).
4.5 Top: Resting state of the rearing mechanism with labeled members and member angles. Bottom: Maximum limit of the rearing mechanism with labeled members and member angles.
4.6 Left: Labeled neck joint connected to the head sensor and prothorax. Right: Neck joint set to maximum yawing angles of plus and minus 60 degrees.
4.7 Left: Back of the head sensor revealing the male-female connection to the neck motor. Right: Back of the head sensor after the male-female connection has been made.
4.8 Left: Labeled joints (in blue) and leg segments (in black) of the insect [Chrzanowski, 2015]. Right: Bottom view of MantisBot with labeled motors that were replaced with a different model with higher stall torque capabilities.
4.9 Test setup representation for the effect of light source angle on solar panel output.
4.10 Solar panel output for the short-axis trial, the long-axis trial, and their average.
5.1 Head sensor inside the calibration fixture constructed from brazed steel rod.
5.2 Graph of digital sensor readings vs. time, showing collected readings for each of the 25 test locations. Readings for up-down (UD), left-right (LR), and center intensity (I) are included.
5.3 Test fixture results. Mean sensor reading values from the head sensor encoding the elevation angle of the light.
5.4 Test fixture results. Mean sensor reading values from the head sensor encoding the azimuth angle of the light.
5.5 Test fixture results. Mean sensor reading values from the head sensor encoding the distance from the center to the light.
7.1 Video frames taken from rearing motor test trials showing MantisBot's ability to lift its prothorax when tracking a light source. Each frame shows MantisBot with an increasing prothorax angle.
7.2 Graphs showing MantisBot's ability to actuate the prothorax yawing joint saccadically in order to follow a horizontally sweeping stimulus.
7.3 Video frames taken from prothorax yawing motor test trials showing MantisBot's ability to track a light source within the horizontal plane.
7.4 Video frames taken from the tracking and pursuing test trials showing MantisBot's ability to saccadically track a light source with the use of its neck and prothorax and perform pursuing motions with one of its mid-legs.
7.5 Fig. 7.4 continued: Video frames taken from the tracking and pursuing test trials showing MantisBot's ability to saccadically track a light source with the use of its neck and prothorax and perform pursuing motions with one of its mid-legs.
7.6 Fig. 7.4 continued: Video frames taken from the tracking and pursuing test trials showing MantisBot's ability to saccadically track a light source with the use of its neck and prothorax and perform pursuing motions with one of its mid-legs.
A.1 Mini-MantisBot robot with labeled rudimentary abilities.
A.2 Mini-MantisBot vision system components: Raspberry Pi 2 Model B processor, Pi Camera, and Kogeto Dot 360 lens.


Acknowledgments

I want to recognize everyone who had a hand in making this project what it is today, for it was in no way a one-man show.

I want to thank Nicholas Szczecinski for introducing me to MantisBot and sharing with me his expansive knowledge of and enthusiasm for this project. I truly would not have been able to accomplish what I have without his help.

I want to thank my committee chair, Dr. Quinn, for his insight, patience, and trust. He allowed me the freedom to take control of this project, which gave me great insight into my abilities and the world of research. His always-open door was the best gift a lab director could give.

I want to thank my other committee members: Dr. Ritzmann, for his encouragement and criticism, since helpful and truthful direction can be the difference between success and failure, and Dr. Rimnac, for her helpful advice during the writing of this technical document and her enthusiasm for my accomplishments.

I have to thank George Daher, Circuits Lab Technician. The time he spent with me troubleshooting circuits and acquiring the correct electrical components meant a great deal.

Finally, I have to thank my family for their support, especially my dad, who inspires me every day with his determination, wisdom, and trust. I would not be where I am today without my family.


The Development of a Visual System for MantisBot:
A Robot Modeled after the Praying Mantis

Abstract

by

ANDREW PAUL GETSY

This thesis presents work done to advance robotics through the use of biological models of praying mantis behavior.