Final Report

Zack Bell

ZMAPS Red Bot Blue Bot

EEL 4665/5666 Intelligent Machines Design Laboratory
Instructors: Dr. A. Antonio Arroyo, Dr. Eric M. Schwartz
TAs: Andy Gray, Nick Cox

TABLE OF CONTENTS

Abstract
Executive Summary
Introduction
Integrated System
Mobile Platform
Actuation
Sensors
Behaviors
Experimental Layout and Results
Documentation
Appendices

Abstract

Computer vision has been used for many years to extract meaningful information from images. What that information is depends on the application. This project uses computer vision to determine the two-dimensional location of an object with known dimensions in a room. Using that position, it is possible to approximate two-dimensional maps of rooms or other enclosed areas with inexpensive equipment. This goal is accomplished by a pair of robots named ZMAPS Red Bot and ZMAPS Blue Bot, which work together to accomplish the task.

Executive Summary

This paper describes the work I have put into a pair of robots over the past four months. The robots map out enclosed spaces using a camera on a servo-driven pan and tilt mechanism on one robot and an object of known dimensions and color on the other robot. The robots use velocity controllers to drive around and IR measurements for obstacle avoidance, with rotary encoders providing velocity feedback. The tracked object is a 9 inch by 9 inch pink cylinder, which is followed by the camera on its pan and tilt mechanism. Using the pinhole camera model, the position of the object is measured relative to the camera, and the bearing of the measurement is determined from the pan servo angle. The robot carrying the object drives around, and an occupancy map is built by recording where the robot drives. During this process the assumption is made that anywhere the robot drives is open space, which in turn allows the assumption that all remaining locations must be occupied. This allows maps within a 15 foot radius to be formed within 2.5% of the true measurements.
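To make the measurement concrete, below is a minimal Python sketch of the pinhole-model calculation described above. The focal length, the pixel measurements, and all function names are hypothetical illustrations rather than the values or code used on the robots; only the 9 inch cylinder height comes from this report.

import math

# Known height of the tracked pink cylinder (from the report), in inches.
OBJECT_HEIGHT_IN = 9.0

# Assumed focal length in pixels; a real value would come from calibration.
FOCAL_LENGTH_PX = 540.0

def distance_to_object(pixel_height):
    # Pinhole model: apparent size shrinks linearly with distance,
    # so distance = focal_length * real_height / pixel_height.
    return FOCAL_LENGTH_PX * OBJECT_HEIGHT_IN / pixel_height

def object_position(distance_in, pan_angle_deg):
    # The pan servo angle gives the bearing of the measurement; convert
    # the polar measurement (range, bearing) to x, y for the map.
    theta = math.radians(pan_angle_deg)
    return distance_in * math.cos(theta), distance_in * math.sin(theta)

# Example: the cylinder appears 54 px tall while the pan servo reads 30 degrees.
d = distance_to_object(54.0)        # 90.0 inches
x, y = object_position(d, 30.0)     # roughly (77.9, 45.0) inches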
Introduction

I have enjoyed path planning and mapping for some time. While this appreciation did not begin with robots, I have recently found a great interest in using robots to accomplish the goal of planning and mapping. I also enjoy the control of dynamic systems, and one of the more complex and interesting problems combining the two is planning and mapping with autonomous vehicles. The auto industry is steadily making cars safer by adding autonomous features such as lane detection, automated parking, and obstacle detection that applies or assists in applying the brakes. Many of these features use computer vision and radar to determine the positions of objects while the car plans a way to keep its lane, park, or avoid an obstacle. Outside of the auto industry, the robotics community is growing rapidly, and these robots almost always need at least one thing: an ability to sense the world around them and make control decisions based on observations. I believe one of the most powerful and affordable ways to do this is through computer vision.

My interest in computer vision sensing for control led me to come up with Zack’s Mapping Squad (ZMAPS). This project introduces the first members of the ZMAPS family, Red Bot and Blue Bot. The ZMAPS robots work together using a camera on a pan and tilt mechanism and an object of known dimensions. Using these tools, the ZMAPS robots build a map of their local environment, starting with a small enclosed space. This paper explains the overall ZMAPS network, the ZMAPS robot platforms, how the robots move, what the robots use for sensing, how the robots build maps, and some experiments used to tune the system.

Integrated System

The ZMAPS network is composed of a router, an Asus laptop, an ODROID U3 computer (Red Bot), an ODROID C1 computer (Blue Bot), and two Arduino Mega2560 microcontrollers (Megas). The computers run ROS on Ubuntu Linux and are used for communication, image processing, object detection, object tracking, obstacle avoidance, and building two-dimensional maps. The microcontrollers are used for reading the IR range finders, running the dual motor drivers that drive the motors, driving the camera servos, and reading the motor encoders.

Red Bot uses a PS3 Eye camera to observe Blue Bot, to measure the distance from Red Bot to Blue Bot, and to measure the angle error between the centroid of an object on Blue Bot and the center of the camera frame. It uses a PID controller to convert this error into angle commands its Mega can understand, and it sends this information to its Mega to update the servo positions and keep Blue Bot in the center of the frame. Feedback from the servos gives their current angles. Red Bot streams this information on and off of the network using a Panda wireless USB adapter.

Blue Bot uses sensor information from its Mega to determine its position relative to objects, and it responds by changing its velocity based on the distance to those objects in an effort to avoid obstacles. It uses a PID controller to convert these velocity commands into integers its Mega can understand; the Mega then transmits the commands to the motor drivers. Speed feedback is given by the rotary encoders on the motors. Blue Bot also uses a Panda wireless USB adapter to stream information on and off of the network.

The laptop takes input from an Xbox controller to determine whether the system should move to autonomous mode. If the system is in manual mode, the Xbox controller is used to drive the robots around. The laptop also takes information from the robots to build occupancy maps based on the angle and distance measurements given by Red Bot, and it hosts the core for the network.
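The velocity loop described above can be sketched as follows. This is a minimal illustration under assumed gains, an assumed 50 Hz loop rate, and an assumed signed 8-bit command range for the motor driver; none of these numbers are the tuned values from the project.

class VelocityPID:
    # Converts a wheel-velocity error into the integer motor command
    # that the Mega forwards to the motor driver.
    def __init__(self, kp, ki, kd, out_min=-255, out_max=255):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_vel, measured_vel, dt):
        # measured_vel would come from the rotary encoder feedback.
        error = target_vel - measured_vel
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = (self.kp * error + self.ki * self.integral
               + self.kd * derivative)
        # Clamp to the integer range the motor driver accepts.
        return int(max(self.out_min, min(self.out_max, out)))

# Example with hypothetical gains and a 50 Hz control loop.
pid = VelocityPID(kp=2.0, ki=0.5, kd=0.05)
command = pid.update(target_vel=0.5, measured_vel=0.42, dt=0.02)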
Figure 1 shows a block diagram of all the devices in the network and how they communicate or transfer power.

Figure 1. Block diagram of the ZMAPS network. The arrows show the flow of information or power. A double arrow shows that information flows both ways.

Mobile Platform

Red Bot and Blue Bot have very similar mobile platforms. Most of each body is made from balsa wood, and the bases are circular discs of balsa wood. Each layer is separated by four inch tall pieces of red oak, which was used because it bonds easily to the balsa with adhesives and was readily available. Two parallel rectangular holes were cut, offset from the edge of the disc, for the drive wheels. This keeps the outer diameter of the platform equal and smooth around the entire circumference, making it less likely for the platform to become trapped on an edge; this especially helps Blue Bot if it gets close to an object. Arranging the wheels like this also allows for differential driving and steering of the platform.

The wheels are ABS hubs with silicone treads for grip. The motors attach to the wheels by M3-tapped aluminum couplers and M3 machine screws, and they attach to the base of the platform by ninety degree aluminum brackets and M3 machine screws. The platform also has two plastic ball caster wheels to help with support and balance while allowing slip. The base also holds the three IR sensors for detecting the relative positions of obstacles; these were mounted using more balsa wood bonded to the base with adhesives. The motor driver is attached to the base using M3 standoffs. The entire drive assembly was placed on the base layer to keep it together, apart from the LiPo battery, which was put one layer above to make it easy to remove.

The second layer holds the Mega, the power distribution board, and the motor LiPo battery. This layer is partially circular but has the front cut back slightly to allow easier access to the bottom of the platform. The center of the layer has a large hole cut out so the wires from the base layer can easily pass through to the Mega, LiPo battery, and power distribution board. The Mega, power distribution board, and LiPo battery are attached to the layer using Velcro and adhesives.

The third layer holds the ODROID and the ODROID LiPo battery. This layer is cut back very far to allow easy access to the second layer and the Mega. The ODROID and its battery are attached to the layer using Velcro and adhesives. The final layer of the two