Final Report
Zack Bell
ZMAPS Red Bot and Blue Bot
EEL 4665/5666 Intelligent Machines Design Laboratory
Instructors: Dr. A. Antonio Arroyo, Dr. Eric M. Schwartz
TAs: Andy Gray, Nick Cox

TABLE OF CONTENTS

Abstract
Executive Summary
Introduction
Integrated System
Mobile Platform
Actuation
Sensors
Behaviors
Experimental Layout and Results
Documentation
Appendices


Abstract

Computer vision has been used for many years to extract meaningful information from images; what that information represents depends on the application. This project uses computer vision to determine the two-dimensional location of an object of known dimensions in a room. Using this position, it is possible to approximate two-dimensional maps of rooms or other enclosed areas with inexpensive equipment. This goal is accomplished by a pair of robots, ZMAPS Red Bot and ZMAPS Blue Bot, working together.

Executive Summary

This paper describes the work I have put into a pair of robots over the past four months. The robots map out enclosed spaces using a camera on a servo-driven pan and tilt mechanism on one robot and an object of known dimensions and color on the other. The robots use velocity controllers to drive around, IR measurements for obstacle avoidance, and rotary encoders for velocity feedback. The object is a 9 inch by 9 inch pink cylinder, tracked by the camera on the pan and tilt mechanism. Using the pinhole camera model, the position of the object is measured relative to the camera; the angle of the measured distance is determined from the pan servo angle. The robot carrying the object drives around, and an occupancy map is built from where it drives. During this process the assumption is made that anywhere the robot drives is open space, and therefore that all remaining locations must be occupied. This allows maps within a 15 foot radius to be formed within 2.5% of true measurement.

Introduction

I have enjoyed path planning and mapping for some time. While this appreciation did not begin with robots, I have recently found great interest in using robots to accomplish planning and mapping. I also enjoy control of dynamic systems, and planning and mapping with autonomous vehicles is one of the more complex and interesting problems that combines both. As time goes on, the auto industry is making cars safer by adding autonomous features such as lane detection, automated parking, and obstacle detection to apply or assist in applying the brakes. Many of these features use computer vision and radar to determine the positions of objects so the car can detect lanes, park, or avoid an obstacle.

Outside of the auto industry, the robotics community is growing rapidly. These robots all need at least one thing: an ability to sense the world around them and make control decisions based on observations. I believe one of the most powerful and affordable ways to do this is through computer vision. My interest in computer vision sensing for control led me to come up with Zack's Mapping Squad (ZMAPS).

This project introduces the first members of the ZMAPS family, Red Bot and Blue Bot. The ZMAPS robots work together using a camera on a pan and tilt mechanism and an object of known dimensions. Using these tools, the ZMAPS robots build a map of their local environment, starting with a small enclosed space.


This paper explains the overall ZMAPS network, the ZMAPS robots' platforms, how the robots move, what the robots use for sensing, how the robots build maps, and some experiments used to tune the system.

Integrated System

The ZMAPS network is composed of a router, an Asus laptop, an ODROID U3 computer (Red Bot), an ODROID C1 computer (Blue Bot), and two Arduino Mega2560 microcontrollers (Megas). The computers run ROS. They are used for communication, image processing, object detection, object tracking, obstacle avoidance, and building two-dimensional maps. The microcontrollers are used for reading the IR range finders, running the dual motor drivers that drive the motors, driving the camera servos, and reading the motor encoders.

Red Bot uses a PS3 Eye camera to observe Blue Bot, to get the distance from Red Bot to Blue Bot, and to get the angle error between the centroid of an object on Blue Bot and the center of the camera frame. It uses a PID controller to convert these angle errors into commands its Mega can understand, then sends them to its Mega to update the servo positions and keep Blue Bot in the center of the frame. Feedback from the servos gives the servos' current angles. Red Bot streams this information on and off the network using a Panda wireless USB adapter.

Blue Bot uses sensor information from its Mega to determine its position relative to objects. It responds by changing its velocity based on the distance to objects in an effort to avoid obstacles. It uses a PID controller to convert these velocity commands into integers its Mega can understand, and the Mega then transmits the commands to the motor drivers. Feedback on speed is given by the rotary encoders on the motors. Blue Bot also uses a Panda wireless USB adapter for streaming information on and off the network.

The laptop takes input from an Xbox controller to determine whether the system should be in autonomous mode. If the system is in manual mode, the Xbox controller is used to drive the robots around. The laptop also takes information from the robots to build occupancy maps based on the angle and distance measurements given by Red Bot. It also hosts the core for the network. Figure 1 shows a block diagram of all the devices in the network and how they communicate or transfer power.


Figure 1. Block diagram of the ZMAPS network. The arrows show the flow of information or power. A double arrow shows information goes both ways.


Mobile Platform

Red Bot and Blue Bot have very similar mobile platforms. Most of each body is made from wood. The bases are circular discs of balsa wood, and each layer is separated by four-inch-tall pieces of red oak, which was used because it bonds easily to the balsa with adhesives and was readily available. Two parallel rectangular holes were cut offset from the edge of the disc for the drive wheels. This keeps the outer diameter of the platform equal and smooth around the entire circumference, making it less likely for the platform to become trapped on an edge. This especially helps Blue Bot when it gets close to an object.

Arranging the wheels like this also allows for differential driving and steering of the platform. The wheels are ABS hubs with silicone treads. The motors attach to the wheels by an M3-tapped aluminum coupler and M3 machine screws, and to the base of the platform by ninety-degree aluminum brackets and M3 machine screws. The platform also has two plastic ball caster wheels to help with support and balance while allowing slip. The base also holds the three IR sensors for detecting the relative positions of obstacles; these were mounted using more balsa wood bonded to the base with adhesives. The motor driver is attached to the base using M3 standoffs. The entire drive assembly was placed on the base layer to keep it together, apart from the LiPo battery, which was placed one layer up to make it easy to remove.

The second layer holds the Mega, the power distribution, and the motor LiPo battery. This layer is partially circular but has the front cut back slightly to allow easier access to the bottom of the platform. The center of the layer has a large hole cut out to allow the wires from the base layer to easily pass through to the Mega, LiPo battery, and power distribution board. The Mega, power distribution board, and LiPo battery are attached to the layer using Velcro and adhesives.

The third layer holds the ODROID and the ODROID LiPo battery. This layer is cut back very far to allow easy access to the second layer and the Mega. The ODROID and its battery are attached to the layer using Velcro and adhesives.

The final layer is where the two robots primarily differ. Red Bot has a camera on a pan and tilt mechanism driven by two servos to track Blue Bot. Blue Bot has a pink cylinder of known dimensions on top of it to enable easy tracking. This layer is cut back to allow the antenna of the Panda wireless USB adapter on the ODROID to sit upright and have the best transmitting ability. The layer design is identical to the third layer design on both robots. The cylinder is attached to Blue Bot using Velcro, and the pan and tilt mechanism is attached to Red Bot using adhesives. The layout of Blue Bot is shown in Figure 2 and the layout of Red Bot in Figure 3.

It became difficult to attach anything to the robots after they were constructed. This was not the best way to attach everything, but it works. I would have rather drilled holes for everything and had more solid connections, but a final design was not known at the time and there was no way to know what the final design would look like. Another thing that has become obvious is that not having suspension causes the robots to rock back and forth on initial acceleration from a stopped position. This is irritating, but due to the current model's wheel layout and the wheels themselves, a simple solution is not available; it does not have a large effect on the robots. Originally the platform was made not to rock, but after it got stuck on the lip of some carpet because there was no suspension, the rocking was allowed. In the future suspension will be considered.

Figure 2. Image of Blue Bot. Each layer and its respective components can be seen.


Figure 3. Image of Red Bot. Each layer and its respective components can be seen.


Actuation

Red Bot and Blue Bot both drive their wheels with two twelve-volt Pololu brushed motors. These motors are geared to reduce rotational speed and increase torque, and they have built-in quadrature rotary encoders to give feedback on the rotational rate of the motors. The motors are driven by Pololu dual bidirectional motor drivers, each with two full H-bridges. Each H-bridge accepts a two-bit number to select forward or reverse direction and to allow braking to the positive reference or the ground reference. The motor drivers accept a pulse width modulated (PWM) signal to set the effort level of the output, and they give feedback on the amount of current being drawn. These PWM signals can be as high as 20 kHz, which is nice because it is out of the audible range.

Red Bot and Blue Bot use their Megas to drive the motors, with a software PID velocity controller that passes an effort level to the Mega. Red Bot's Mega uses a PID position controller for the two servos that operate the pan and tilt mechanism holding Red Bot's camera. The servos' internal controllers are convenient for the project but also hinder the accuracy of the tracking; in the future a motor and rotary encoder will be used to control the pan and tilt mechanism. The servo hinders the tracking because its command signal must be the desired angle directly. This angle is passed by keeping a low-frequency signal high for a certain amount of time, which limits the overall resolution possible. A rotary encoder, however, is limited only by the number of counts it is capable of. Additionally, it would then be easy to include both a velocity and a position controller instead of just a position controller, which will allow for smoother tracking in the future.

It was difficult to generate good command signals for the servos and motors because the loop rate was so low. The 10 Hz loop rate does not work well for tracking or velocity control. It works, but the system has large steady-state error because the integral gain of the PID cannot be set very high; otherwise windup occurs because of the low loop rate. The proportional gain also cannot be set very high or the system goes unstable because the oscillations grow too large, and the derivative gain has the same problem. Overall, the system loop rate needs to be increased well over tenfold. Unfortunately, the IR sensors cannot be sampled faster than 20 Hz; in the future they will be placed in a condition so they are not sampled every iteration, which will allow for better control. Another problem with control was the motor driving frequency not being above the audible range. The motors are currently driven at around 4000 Hz, which is too low for these motors, and it shows in the stability of the system. This compounds with the low loop rate to greatly decrease the controllability of the motors. This will be fixed in the future.

The code for the robots' control is found in the appendix. The hardware code on the Megas is red_bot_v1.ino and blue_bot_v1.ino. These contain all the code for driving the motors, reading the encoders, reading the servos, driving the servos, and reading the IR proximity sensors.

The higher-level code for driving the motors is broken up into many parts. The three IR sensors on each robot generate a subtractive velocity command depending on how far away they read they are from any object. The value for the IR intensity is passed from the Mega. Each sensor has its own program; these can be found in the appendix as front_side_error_v3.py, right_side_error_v3.py, and left_side_error_v3.py. These programs use hyperbolic tangents to generate the error commands by subtracting the minimum allowed distance from the currently measured distance. This keeps the vehicle off of objects. The hyperbolic function is shaped using a scalar, and one is subtracted from the result. Using this technique it is possible to shape the response so that when everything is far away it has no effect, but as an object gets closer the effect gets stronger until it saturates exactly at the minimum distance from the obstacle.

The formula is:

cmd = tanh(1.75 * (sensor_read - min_distance)) - 1

where the minimum distance used was 12 inches, sensor_read is the read voltage converted to distance, and cmd is the output command. This leads to the response shown in Figure 4.

Figure 4. Plot of the response for cmd = tanh(1.75 * (distance - 12)) - 1. As shown, the response saturates exactly at 12 inches, but the distance has almost no effect until it is within about three inches of that.
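The shaping function above can be sketched as a small Python function; the function name and signature are my own, not the actual interface of the *_side_error_v3.py scripts:

```python
import math

def ir_velocity_command(sensor_read_in, min_distance_in=12.0, scale=1.75):
    """Subtractive velocity command from one IR sensor.

    Equals -1 exactly at the minimum allowed distance and approaches 0
    when the obstacle is far away, matching the Figure 4 response.
    """
    return math.tanh(scale * (sensor_read_in - min_distance_in)) - 1.0
```

At the 12 inch minimum the command saturates at -1; a few inches beyond the minimum it is nearly zero, so distant obstacles have no effect on the desired velocity.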

The current motor speed is calculated for each motor by reading in the counts and count time from the encoders, which are passed from the Mega. This gives counts per second, which is converted to inches per second. These programs can be found in the appendix as right_motor_measured_speed.py and left_motor_measured_speed.py.
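The conversion can be sketched as below. The report does not state the encoder resolution or wheel diameter, so both constants here are placeholder assumptions, not measured values:

```python
import math

# Illustrative values only: the report does not state either number.
COUNTS_PER_REV = 1200.0   # assumed quadrature counts per wheel revolution
WHEEL_DIAMETER_IN = 3.0   # assumed wheel diameter in inches

def measured_speed_in_per_s(counts, count_time_s):
    """Convert encoder counts over a sample window to inches per second."""
    counts_per_s = counts / count_time_s
    revs_per_s = counts_per_s / COUNTS_PER_REV
    # One revolution moves the wheel one circumference forward.
    return revs_per_s * math.pi * WHEEL_DIAMETER_IN
```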

In manual mode a proportional controller converts joystick commands into desired velocity commands: the left thumbstick sets forward or reverse power and the right thumbstick sets turning power. The operator decides whether the system is in autonomous or manual mode by pressing a button on the controller. This button must be held for autonomous mode; otherwise the system is in manual mode. This prevents accidents where the controller is dropped and the robots run into something during testing. Keeping a finger on the kill button seems like the best choice for starting out; nobody can predict what will happen, and it is much faster to let go than to try to press something. The controller is read using the joy package in ROS. The controller commands are split between the two robots by the controller_splitter.py program, found in the appendix, which allows the robots to be driven independently using the same joysticks. If the right bumper is pressed, commands go to Red Bot; if the left bumper is pressed, commands go to Blue Bot; if both are pressed, commands go to both; and if the A button is pressed, the robots go autonomous.
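The routing logic described above can be sketched as follows; the function name and return shape are illustrative, not the actual controller_splitter.py interface:

```python
def route_command(right_bumper, left_bumper, a_button, cmd):
    """Decide where a joystick command goes, per the splitter logic.

    Returns (targets, autonomous): targets is the set of robots that
    receive cmd; autonomous is True only while the A button is held,
    acting as a dead-man switch.
    """
    targets = set()
    if right_bumper:
        targets.add("red_bot")   # right bumper routes to Red Bot
    if left_bumper:
        targets.add("blue_bot")  # left bumper routes to Blue Bot
    return targets, bool(a_button)
```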

All of these programs pass their values to the main controllers, tele_motor_bluebot.py and tele_motor_redbot.py, which can be found in the appendix. In these programs the command values from the controller or the IR sensors are converted into desired velocity commands. The desired velocity is then passed to a closed-loop PID controller which uses the measured-velocity programs for feedback. In the program the max velocity is 18 inches per second. In controller mode the thumbsticks generate a desired velocity proportional to the max. In autonomous mode the IR sensors increase or decrease the desired velocity to avoid obstacles, as previously shown in Figure 4. The gains used for testing are shown in Table I.

Table I. Gains Used for PID Control of Motors

kp      kd        ki
0.012   0.00008   0.0

As shown in Table I, the integral gain was set to zero. The system sampling rate was too low, and integrator windup occurred if the gain was high enough to take any effect; for this reason it was set to zero.

These gains resulted in about four inches per second of steady-state error. In the future a better digital controller will be used. This controller allowed the system to run in a stable state, but the system could be much better if the peak time and rise time were reduced. To do this accurately, system step data could be taken and used, but there is not enough time in the current iteration. The major issue is the sampling rate, which really needs to increase, but a lot of time is required to fix the sampling rates for everything in a way that keeps the network and the robots stable. The problem is mainly determining the best way to pass information over the network.
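A minimal discrete PID of the kind described, with the Table I gains as defaults. This is only a sketch; the actual structure of tele_motor_bluebot.py and tele_motor_redbot.py is not reproduced here:

```python
class PIDController:
    """Discrete PID; with the Table I gains the integral term is disabled."""

    def __init__(self, kp=0.012, kd=0.00008, ki=0.0):
        self.kp, self.kd, self.ki = kp, kd, ki
        self.prev_error = 0.0
        self.integral = 0.0

    def update(self, desired, measured, dt):
        """Return an effort level from desired vs. measured velocity."""
        error = desired - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative + self.ki * self.integral
```

With ki = 0 the integral accumulator has no effect on the output, mirroring the decision described above to disable it at the 10 Hz loop rate.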

Sensors

The ZMAPS robot network is designed to accomplish the task of two-dimensional map building. As stated previously, Red Bot has a camera on its highest platform mounted to a pan and tilt mechanism, which allows Red Bot to easily look around at its environment. Blue Bot has a colored object, a 9 inch by 9 inch pink cylinder, on its top platform. The cylinder and the camera on the pan and tilt allow Red Bot to find and track Blue Bot as Blue Bot drives through a room. The location of Blue Bot is reported relative to the location of Red Bot.


Red Bot uses image processing techniques to find the colored object in an image. It then uses the pinhole camera model to approximate the distance to the object. This location of Blue Bot is passed to the laptop, which builds a map using the location values sent by Red Bot. The map shows all the locations that Blue Bot travels and also the places it does not, allowing wall and obstacle locations to be determined.

The camera used by Red Bot is a PlayStation 3 Eye (PS3 Eye) camera. It works out of the box with Ubuntu. The camera was chosen because it is inexpensive but can run at 60 frames per second with a 640x480 pixel image and at 125 frames per second at 320x240. The configuration for Red Bot uses the camera at 320x240 and 30 fps, because the U3 was not able to process the code at the full frame size very quickly but can easily run it at this frame size. The camera is mounted on the pan and tilt mechanism driven by two servos, allowing Red Bot to easily follow Blue Bot around.

The pan and tilt mechanism is designed specifically for the PS3 Eye camera. As shown in Figure 5 the mechanism consists of the tilt mount, the pan mount, the camera, and the two servos.

The tilt mount and the pan mount are both made on an additive manufacturing machine. The components are made of polycarbonate (PC). Because the parts are printed from PC at a lower-density build, they have lower stress-carrying abilities. As shown in Figure 5, the servo horn mount point for the pan mount has adhesive foam tape to help hold it together. Although adhesives mainly work in shear, this was still applied to help if the PC started to crack, which it did. Adhesive foam tape was also added to the tilt mount where the tilt servo is attached. The screws used to mount the tilt servo were not self-tapping and caused the small piece of wall between the servo and the holes to slightly crack; the servo is secure, but the foam tape was added for extra support. The tilt mount was designed to mount to the existing holes in the camera's printed board as shown in Figure 2, using the PS3 Eye screws to fasten the printed board to the mount. The tilt servo horn has a cutout in the tilt mount; this takes the load from the screws and allows the horn to carry the load. The tilt servo itself and the pan servo horn also have cutouts in the pan mount for the same reason. The entire assembly is held securely to the highest level using the foam tape. The pan and tilt was designed to let Red Bot rotate the camera about the axes of the lens; that is, the lens center stays in place as the camera rotates about both axes. This makes the camera tracking more accurate and keeps images much smoother and more stable.


Figure 5. This image shows a view of the pan mount, the tilt mount, the camera, and the two servos assembled on Red Bot’s highest level.

The object sits on top of Blue Bot as shown in Figure 2. A 9 inch by 9 inch cylinder was chosen because it appears as almost a square from Red Bot's viewing angle, which allows the aspect ratio of the object to be used as a tracking feature. The color pink was chosen because it stands out very well against most other colors. The cylinder size was chosen because that is about as big a cylinder as can fit on Blue Bot.

The program used by Red Bot is called camera_centroid_radius_2.py and can be found in the appendix. The image processing is done using the Open Source Computer Vision (OpenCV) Python libraries. The image processing has gone through multiple stages and will be in its tenth iteration by media day. The code runs faster than the network can run, so that was decided to be fast enough. The current implementation tracks the object very well across a large lighting gradient as long as the object remains almost upright. This is done by preventing the camera from automatically adjusting the gain while allowing the white balance and exposure to be automatic. Previously the object was lost easily if the lighting gradient was too large, but a solution was found by making the process less dependent on lightness or darkness. This was primarily done by increasing the hue of the images read in. By doing this, each image could span the entire range of value and be filtered by saturation and hue alone, which allowed for large lighting changes. This also helped immensely in rooms with incandescent light: the effects of the yellow light were greatly reduced by shifting the image hue.

The image is then filtered using a low-pass Gaussian filter defined by a 5x5 neighborhood size; that is, it looks at a small portion of the image, called a kernel, and determines values for a pixel depending on its neighbors within that portion. The filter removes high-frequency noise in the image and effectively blurs it. Removing the noise allows for better color tracking because the object becomes more consistent in color.

The now-blurred image is then converted from the default color values OpenCV uses, Blue, Green, and Red (BGR), to Hue, Saturation, and Value (HSV). Thresholding is easier to perform in HSV or similar representations, because changes in color appear more as tints and shades, with pink sitting at the far right of the hue range. The HSV image is put through a threshold, and afterward only the pink or close-to-pink tints and shades remain. The camera settings and the thresholding values found to track the pink cylinder in varying lighting conditions are shown in Table II.

Table II. Camera Settings for Images and the Threshold Values Used

Brightness: 0.0
Contrast: 35.0
Saturation: 65.0
Hue: 120.0
Gain: 15.0
Blur Value: 21.0
Upper Hue Threshold: 179.0
Upper Saturation Threshold: 255.0
Upper Value Threshold: 255.0
Lower Hue Threshold: 136.0
Lower Saturation Threshold: 60.0
Lower Value Threshold: 0.0

The result of this process is a binary image of white true values and black false values, where true means the pixel passed the HSV threshold conditions. Despite the earlier blurring, this image still has noise. An attempt to remove the noise is made by eroding and dilating the image using a 5x5 neighborhood size. Eroding looks through a pixel's neighbors, defined by a kernel, and the pixel stays true only if all of its neighbors are true. Eroding removes a large amount of noise but also a large amount of the cylinder. This loss is gained back by dilating the image. Dilating looks through a pixel's neighbors, defined by a kernel, and the pixel becomes true if any of its neighbors are true. This restores a large amount of the lost image. The eroded and dilated image is a binary mask of the original BGR image where the true pixels represent the desired shades and tints, pink in this case.

The contours in the new binary image are found, and the contour with the largest area has its aspect ratio checked. If the ratio is within 40% of 1, the object is assumed to be the cylinder; if not, that object is thrown out and the next-largest area is checked. Once an object passes the two tests, it is assumed to be the cylinder, and its centroid is used to generate an angle error between the centroid and the center of the frame. The camera's internal values in pixels were previously determined and are shown in Table III.

Table III. Internal Camera Values in Pixels

Focal Length   Width   Height
279.667        319.5   239.5

Using these values it is possible to use right triangle relations shown by the pinhole camera model to determine the angle error between the two locations. This model is shown in Figure 6.

Figure 6. Pinhole camera model. If the focal length f and the height y1 are known, it is possible to determine the angle of the triangle, which is the tracking error when point Q is the centroid of the cylinder and the projection of point Q on the X3 axis is the center of the frame in pixels. Similarly, if x1 is also known, it is possible to solve for x3 using similar triangles. This is used to calculate the distance when the centroid and the center of frame are about the same.

As shown in Figure 6, if the difference between the center of frame and the centroid is known, as well as the focal length, the pan tracking angle error can be calculated as:

theta_error_pan = atan((p_x - c_x) / f)

where p_x is Q (the centroid), c_x is the projection of Q (the center of frame), f is the focal length, and theta_error_pan is the tracking error in the pan. The same formula applies for the tilt, using p_y and c_y:

theta_error_tilt = atan((p_y - c_y) / f)

The focal length was found not to change between the two axes.
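The two angle-error formulas can be computed directly, using the Table III internal camera values as defaults; the function name is illustrative:

```python
import math

def tracking_angle_errors(px, py, cx=319.5, cy=239.5, f=279.667):
    """Pan and tilt tracking errors (radians) from the pinhole model.
    Defaults are the Table III internal camera values in pixels."""
    theta_pan = math.atan((px - cx) / f)
    theta_tilt = math.atan((py - cy) / f)
    return theta_pan, theta_tilt
```

When the centroid sits exactly at the frame center, both errors are zero and the servos hold position.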

Also shown in Figure 6, if the height (or equally the width) of the cylinder in pixels is known, it is possible to calculate the distance to the object, because the real height and width of the cylinder are known and the focal length in pixels is known. The distance is calculated as:

y1 / f = x1 / x3, or x3 = x1 * f / y1

This relation can be applied to both the width and the height, which was done after fitting a minimum-area rectangle to the contour in pixel space. This fit was performed in the area and aspect-ratio checking routine, which is also how the aspect ratio is checked. The distance to the object is calculated from both dimensions and the two values are averaged. Experimentation has shown that the accuracy at 8 feet is approximately 3 inches. This uncertainty was found to increase quickly after this distance, which is shown in the maps later in this report.
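The averaged distance estimate can be sketched as below, with the 9 inch by 9 inch cylinder dimensions and the Table III focal length as defaults; the distance comes out in the same units as the real dimensions (inches here):

```python
def distance_to_cylinder(pixel_w, pixel_h, real_w=9.0, real_h=9.0, f=279.667):
    """Average the width- and height-based pinhole distance estimates:
    x3 = x1 * f / y1, with x1 the real size and y1 the size in pixels."""
    z_from_w = real_w * f / pixel_w
    z_from_h = real_h * f / pixel_h
    return (z_from_w + z_from_h) / 2.0
```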

By calculating the tracking error it was possible to keep the cylinder at the center of the frame. Using the tracking error angles, a PID position controller was made to minimize the error. The program that takes in the angle values and calculates the command is servo_tracking_v1.py and can be found in the appendix. It takes the current pan and tilt angle errors from the camera threshold program and uses them as position commands in the PID controller. The gains for the PID controller are shown in Table IV.

Table IV. Gains Used for PID Control of Pan and Tilt Servos

kp     kd    ki
0.15   0.0   0.0

The derivative and integral gains were both zero because the loop rate was too low for them to be effective; this loop could run as fast as it wanted, but the Mega was still only updating at 10 Hz, which caused very slow but very large oscillations. In the future a faster loop rate will be tried, as previously stated, and the servos will be replaced with brushed motors and rotary encoders, also as previously stated. An example angle and distance estimation is shown in Figure 7. Note that the cylinder shown is not the 9 inch by 9 inch cylinder but its predecessor, a 5 inch by 10 inch cylinder.


Figure 7. An example run calculating the error in the pan and tilt angles, shown as thetax and thetay respectively, and the distance to the cylinder calculated from the width and height, shown as zw and zh respectively. The distance used is the average, called z; zdiff is the difference between the two. The frame rate was slowed down. The cx and cy values are the centroid of the cylinder, shown as a blue dot; the white dot is the center of the image. The image is 640x480. A positive pan is to the left and a positive tilt is down; as shown, both angles are negative because the camera needs to move right and up. The actual distance was around one meter, as approximated. Note the distance measurements are in millimeters. This image also shows how the masking process removes all of the surroundings: the original image is shown in the bottom right and the masked image in the top right, where only the cylinder remains. The contour of the cylinder is outlined in green in the original image. Note that in this image the hue is normal, 90 degrees; it was not until the white light died a few days later that the issue with yellow light previously mentioned was realized.

As shown in Figure 7, the cylinder is well tracked and the angle and distance measurements are accurate enough for tracking. The current measurements are off by a small amount, largely because the center of the frame is not exactly on the centroid, which exposes the camera's largest distortion, radial distortion. The radial distortion coefficients were found, but the added calculations slowed the program down too much to make the benefits worthwhile. Because the pan and tilt try to keep the centroid at the center of the frame, the centroid is typically close to the center; in this region the effects of radial distortion are minimal, so the benefits of compensating for it were equally minimal.

The other sensors, that is the IR and the encoders, were previously discussed in the actuation section as they are more related to that part of the robot.

Behaviors

For now the bots are placed in an unknown enclosed environment. Also for now, Red Bot is placed where it can easily observe the course, typically the bottom right corner because it is easiest to explain. In the future it will find the optimal location, but that was too much for the time available. Blue Bot wanders randomly, avoiding obstacles using the previously described method of hyperbolic tangents, and when Red Bot sees Blue Bot it begins tracking it.

As Blue Bot drives around, its location is passed to the computer by Red Bot using the previously described distance-measurement method together with the current pan angle of the servo. From this information an occupancy grid is formed showing all of the locations Blue Bot travels relative to Red Bot. If Blue Bot moves through a space twice, that location is assumed to be empty. The traveled locations are recorded on an image that displays a blue marker at Blue Bot's location and a red marker at Red Bot's location. The map starts out all black, and the pixels Blue Bot passes over increase in value, becoming increasingly whiter. Every minute the map is cleaned up by closing (dilating then eroding) a mask of the pixels that were hit twice. After five minutes the image is closed one final time and the final mask is all that remains. The closing kernel is 3 pixels wide; since each pixel in the image represents an inch, that equals 3 inches. This value was chosen because it is consistently smaller than the resolution of the system, which is typically around 3 to 6 inches. An example course and the resulting map are shown in Figure 8 and Figure 9.


Figure 8. A course during a run. Blue Bot (left) drives around while Red Bot (right) tracks it. A slanted board runs along the bottom left, a box sits in the top left, Red Bot sits in the top right, and two boxes block half of the center.

Figure 9. The resulting occupancy map after about 10 minutes. The bottom left shows the slant of the wood, while the top left corner, where the box is, was partially covered up. The boxes are clearly shown in the center, as is Red Bot in the top right. Most of the corners are rounded because Blue Bot avoids getting that close to anything.


The map shown in Figure 8 and Figure 9 demonstrates that uncertainty grows with measurement distance, which is made clear by the partially covered box in the top left and the roundness of the corners. In future iterations Blue Bot will make its own measurements, allowing both faster and more accurate mapping. Overall the maps turned out well, but as mentioned they leave plenty of room for improvement.

Experimental Layout and Results
Much of the experimentation went into tuning the PID controllers and calibrating the IR sensors. A single IR sensor was calibrated and assumed to be a good representation of all the IR sensors because they are the same model. A plot of the calibration curve is shown in Figure 10.

Figure 10. Plot of the IR data fit. The vertical axis is the measured distance in inches and the horizontal axis is the value returned by the analog-to-digital converter.

As shown in the plot, the IR fit was determined to be distance = 7389.8 * (ir value)^(-1.074). This gives about 1 inch of accuracy over the range shown in Figure 10, which is accurate enough for the IR sensors' purpose: keeping the robots from hitting things.
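As a sketch of how such a power-law fit can be computed, take logarithms of both axes and run a linear least-squares fit; the slope and intercept recover the exponent and coefficient. The readings below are synthetic stand-ins, not the actual calibration data:

```python
import numpy as np

# Synthetic ADC readings; distances generated from the fitted curve above
# so the recovered parameters should match it.
readings = np.array([120.0, 200.0, 350.0, 600.0, 900.0])
distances = 7389.8 * readings ** -1.074  # stand-in for measured inches

# distance = a * reading**b  <=>  log(distance) = b*log(reading) + log(a)
b, log_a = np.polyfit(np.log(readings), np.log(distances), 1)
a = np.exp(log_a)
```

Fitting in log space turns the nonlinear calibration into a one-line linear regression, which is also how the per-sensor constants in the appendix code could be refreshed from new calibration data.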

The other experiments performed were tuning the PID controllers, whose values are reported in Table I for the motor controllers and Table IV for the servos. A large amount of tuning also went into the camera settings and the threshold values for hue, saturation, and value; the final values are reported in Table II. Another experiment found the focal length and image size of the camera, which are reported in Table III. Beyond these experiments and tuning, the amount of data pushed over the network was also reduced drastically, as previously discussed.


Conclusion
Overall the mapping procedure turned out successful, but there are many areas that need to be improved. Over the past four months I have learned a great amount about Linux, ROS, OpenCV, networks, and programming in general. I have learned the most about the Python language and have come to appreciate the simplicity of writing in an interpreted language; Python is fast to develop in but still easy to pick up and use. I would have liked to get better at C++, but there was not enough time to move everything over to C++ for this project, which was my original goal. The maps my robots make have turned out to be off by about 2.5% at about 15 feet. Much beyond this distance the maps' accuracy greatly decreases because of the resolution of the camera. In the future the resolution will not be as large an issue because the robots will be able to move through the space and build larger maps by stitching together smaller ones.

The update frequency proved to be the largest hurdle because it substantially reduced the overall accuracy of tracking, which in turn resulted in less accurate maps. The next iteration of the system is now being developed and will be better for having realized these shortcomings.

I don't know what I could have done differently to get the project going better initially, short of already knowing the things I didn't. I came into this project having only just heard of ROS and OpenCV, and I was not used to Linux or the terminal. Now I feel much more confident in all of these things and in programming in general. None of it was simple, but time spent made it much easier. I feel I took a good path in building the system, though I could have wasted less time on little things and run the system on the ODROIDs earlier. I wish I had used GitHub, although I managed software iterations well enough without it. I also wish I had broken the programs into even smaller pieces, though I tested things often enough; it could always be better. Learning to use the Transmission Control Protocol (TCP) and the Internet Protocol (IP) on my own would have been helpful, but I feel confident that letting ROS handle those things for me was a good choice.

I would have liked to spend more time on the mapping portion of the system, but other things took much longer than I anticipated. A better computer and camera would also be nice, though the hardware I used was adequate. I wish I had thought about suspension. Thanks for a great semester of learning.


Documentation
- ODROID-C1: http://www.hardkernel.com/main/products/prdt_info.php?g_code=G141578608433&tab_idx=2
- ODROID-U3: http://www.hardkernel.com/main/products/prdt_info.php?g_code=G138745696275&tab_idx=2
- Arduino Mega 2560: http://arduino.cc/en/Main/ArduinoBoardMega2560
- Sharp GP2Y0A02YK0F: https://www.sparkfun.com/datasheets/Sensors/Infrared/gp2y0a02yk_e.pdf
- Pololu Motor Driver: https://www.pololu.com/product/708/resources
- Pololu Motor and Encoder: https://www.pololu.com/product/1442/specs

Appendices
- The project website contains a list of the sensors, actuators, and boards used in this project in the informal proposal section. It also contains links to the packages and libraries mentioned in this report. The website address is: https://sites.google.com/site/zacksimdlmappingsquad/home

Program Code to run the Network:
Red Bot:
- Launch file for all the Red Bot programs: red_bot.launch
- Front IR error command generator: front_side_error_v3.py
- Right IR error command generator: right_side_error_v3.py
- Left IR error command generator: left_side_error_v3.py
- Left motor measured speed generator: left_motor_measured_speed.py
- Right motor measured speed generator: right_motor_measured_speed.py
- Visual feedback generator, finds the object: camera_centroid_and_radius_2.py
- Servo command generator: servo_tracking_1.py
- Motor command generator: tele_motor_redbot_1.py
- Serial communication driver between Odroid and Arduino (ROS package): serial_node.py
- Low level driver for all the sensors: red_bot_v1.ino

Blue Bot:
- Launch file for all the Blue Bot programs: blue_bot.launch
- Front IR error command generator: front_side_error_v3.py
- Right IR error command generator: right_side_error_v3.py
- Left IR error command generator: left_side_error_v3.py
- Left motor measured speed generator: left_motor_measured_speed.py
- Right motor measured speed generator: right_motor_measured_speed.py
- Motor command generator: tele_motor_bluebot_1.py
- Serial communication driver between Odroid and Arduino (ROS package): serial_node.py
- Low level driver for all the sensors: blue_bot_v1.ino


Laptop:
- Xbox controller driver ROS package: joy_node
- Xbox controller command generator: controller_splitter.py
- Map maker: map_display_2.py
- Map viewer: map_viewer_1.py

red_bot.launch:


type="servo_tracking_v1.py" name="pan_and_tilt" ns="red_bot" />

front_side_error_v3.py:

#!/usr/bin/env python
import rospy
from math import tanh
from random import random
from std_msgs.msg import UInt16
from std_msgs.msg import Float32
import numpy as np

# This node "front_error_node" subscribes to the front IR sensor, converts
# that value to a measurement in inches, then publishes the error from the
# desired minimum distance to the topic "front_error".
front_ir_read = np.zeros((1,2))

# function: front_ir_to_error
# purpose:
# 1) Read in values from the front IR publisher "front_ir".
# 2) Convert the raw reading to inches using the fit 9810.18078*x^(-1.12588).
# 3) Coerce the measurement up to the 12 inch minimum if it falls below it.
# 4) Saturate the error through a hyperbolic tangent so it lies in [-1, 0].
def front_ir_to_error(data):
    global front_ir_read
    min_distance = 12
    # value from sensor
    sensor_read = data.data
    # if the value read in was 0, substitute a large distance so the effort is small
    if sensor_read > 0:
        # reading and converting the value
        sensor_read = 9810.18078*pow(sensor_read,-1.12588)
    else:
        sensor_read = 30
    # coercing the value if it is less than the minimum
    if min_distance > sensor_read:
        sensor_read = min_distance
    # shifting the stored readings one place to the left
    for index_of in xrange(0,front_ir_read.shape[1]-1):
        front_ir_read[0,index_of] = front_ir_read[0,index_of+1]
    # saturating the error with the hyperbolic tangent
    front_ir_read[0,front_ir_read.shape[1]-1] = tanh(1.75*(sensor_read - min_distance)) - 1

# function: start
# purpose:
# 1) Initialize the global variables for the publisher.
# 2) Initialize the publisher on the topic "front_error" with the Float32 type
#    and a queue size of 1.
# 3) Initialize the node "front_error_node" and subscribe to "front_ir".
# 4) Publish the mean of the stored errors at 10 Hz.
def start():
    global front_ir_read
    global error_front_pub
    error_front_pub = rospy.Publisher('front_error', Float32, queue_size=1)
    # starting the node
    rospy.init_node('front_error_node')
    # subscribing to the front IR topic
    rospy.Subscriber('front_ir', UInt16, front_ir_to_error)

    r = rospy.Rate(10)
    while not rospy.is_shutdown():
        error_front_pub.publish(np.mean(front_ir_read,axis=1))
        r.sleep()

if __name__ == '__main__':
    try:
        start()
    except rospy.ROSInterruptException:
        pass

right_side_error_v3.py:

#!/usr/bin/env python
import rospy
from math import tanh
from random import random
import numpy as np
from std_msgs.msg import UInt16
from std_msgs.msg import Float32

# This node "right_error_node" subscribes to the right IR sensor, converts
# that value to a measurement in inches, then publishes the error from the
# desired minimum distance to the topic "right_error".
right_ir_read = np.zeros((1,2))

# function: right_ir_to_error
# purpose:
# 1) Read in values from the right IR publisher "right_ir".
# 2) Convert the raw reading to inches using the fit 9810.18078*x^(-1.12588).
# 3) Coerce the measurement up to the 12 inch minimum if it falls below it.
# 4) Saturate the error through a hyperbolic tangent so it lies in [-1, 0].
def right_ir_to_error(data):
    global right_ir_read
    min_distance = 12
    # value from sensor
    sensor_read = data.data
    # if the value read in was 0, substitute a large distance so the effort is small
    if sensor_read > 0:
        # reading and converting the value
        sensor_read = 9810.18078*pow(sensor_read,-1.12588)
    else:
        sensor_read = 30
    # shifting the stored readings one place to the left
    for index_of in xrange(0,right_ir_read.shape[1]-1):
        right_ir_read[0,index_of] = right_ir_read[0,index_of + 1]
    # coercing the value if it is less than the minimum
    if min_distance > sensor_read:
        sensor_read = min_distance
    # saturating the error with the hyperbolic tangent
    right_ir_read[0,right_ir_read.shape[1]-1] = tanh(1.75*(sensor_read - min_distance)) - 1

# function: start
# purpose:
# 1) Initialize the global variables for the publisher.
# 2) Initialize the publisher on the topic "right_error" with the Float32 type
#    and a queue size of 1.
# 3) Initialize the node "right_error_node" and subscribe to "right_ir".
# 4) Publish the mean of the stored errors at 10 Hz.
def start():
    global right_ir_read
    global error_right_pub
    error_right_pub = rospy.Publisher('right_error', Float32, queue_size=1)
    # starting the node
    rospy.init_node('right_error_node')
    # subscribing to the right IR topic
    rospy.Subscriber('right_ir', UInt16, right_ir_to_error)

    r = rospy.Rate(10)
    while not rospy.is_shutdown():
        error_right_pub.publish(np.mean(right_ir_read,axis=1))
        r.sleep()

if __name__ == '__main__':
    try:
        start()
    except rospy.ROSInterruptException:
        pass

left_side_error_v3.py:

#!/usr/bin/env python
import rospy
import numpy as np
from math import tanh
from random import random
from std_msgs.msg import UInt16
from std_msgs.msg import Float32

# This node "left_error_node" subscribes to the left IR sensor, converts
# that value to a measurement in inches, then publishes the error from the
# desired minimum distance to the topic "left_error".
left_ir_read = np.zeros((1,2))

# function: left_ir_to_error
# purpose:
# 1) Read in values from the left IR publisher "left_ir".
# 2) Convert the raw reading to inches using the fit 9810.180785*x^(-1.12588).
# 3) Coerce the measurement up to the 12 inch minimum if it falls below it.
# 4) Saturate the error through a hyperbolic tangent so it lies in [-1, 0].
def left_ir_to_error(data):
    global left_ir_read
    min_distance = 12
    # value from sensor
    sensor_read = data.data
    # if the value read in was 0, substitute a large distance so the effort is small
    if sensor_read > 0:
        # reading and converting the value
        sensor_read = 9810.180785*pow(sensor_read,-1.12588)
    else:
        sensor_read = 30
    # coercing the value if it is less than the minimum
    if min_distance > sensor_read:
        sensor_read = min_distance
    # shifting the stored readings one place to the left
    for index_of in xrange(0,left_ir_read.shape[1]-1):
        left_ir_read[0,index_of] = left_ir_read[0,index_of + 1]
    # saturating the error with the hyperbolic tangent
    left_ir_read[0,left_ir_read.shape[1]-1] = tanh(1.75*(sensor_read - min_distance)) - 1

# function: start
# purpose:
# 1) Initialize the global variables for the publisher.
# 2) Initialize the publisher on the topic "left_error" with the Float32 type
#    and a queue size of 1.
# 3) Initialize the node "left_error_node" and subscribe to "left_ir".
# 4) Publish the mean of the stored errors at 10 Hz.
def start():
    global left_ir_read
    global error_left_pub
    error_left_pub = rospy.Publisher('left_error', Float32, queue_size=1)
    # starting the node
    rospy.init_node('left_error_node')
    # subscribing to the left IR topic
    rospy.Subscriber('left_ir', UInt16, left_ir_to_error)

    r = rospy.Rate(10)
    while not rospy.is_shutdown():
        error_left_pub.publish(np.mean(left_ir_read,axis=1))
        r.sleep()

if __name__ == '__main__':
    try:
        start()
    except rospy.ROSInterruptException:
        pass

left_motor_measured_speed.py:

#!/usr/bin/env python
import rospy
from std_msgs.msg import Int16
from std_msgs.msg import Float32

# This node "left_motor_speed_node" takes in the time and count for the left
# encoder, then finds the linear velocity.
motor_time = 1
motor_count = 1
speed = 1

# function: left_motor_time
# purpose:
# 1) read in the encoder time
# 2) save the time to a variable
def left_motor_time(data):
    # reading and converting the value
    global motor_time
    motor_time = data.data

# function: left_motor_count
# purpose:
# 1) read in the encoder count
# 2) save the count to a variable
def left_motor_count(data):
    # reading and converting the value
    global motor_count
    motor_count = data.data

# function: start
# purpose:
# 1) initialize variables
# 2) initialize node
# 3) initialize publisher
# 4) initialize subscribers
# 5) compute speed using counts and time
# 6) publish speed
def start():
    global motor_time
    global motor_count
    global speed_pub
    global count_sub
    global time_sub
    global speed

    # starting the publishing node
    speed_pub = rospy.Publisher('left_motor_speed', Float32, queue_size=1)

    # starting the node
    rospy.init_node('left_motor_speed_node')

    # subscribing to the left motor counts
    count_sub = rospy.Subscriber('left_encoder_count', Int16, left_motor_count)

    # subscribing to the left motor time
    time_sub = rospy.Subscriber('left_encoder_time', Int16, left_motor_time)

    # MUST DO THIS FOR SUBSCRIBER PUBLISHER PROGRAMS TO UPDATE MORE THAN ONCE
    # DO NOT USE SPIN
    # setting the publishing rate
    r = rospy.Rate(10)

    while not rospy.is_shutdown():
        # calculating the speed from counts, wheel circumference, and elapsed time
        speed = (motor_count * 3.54 * 3.14159 * 1000000)/(1200.0 * motor_time)
        # publishing the speed
        speed_pub.publish(speed)
        r.sleep()

if __name__ == '__main__':
    start()

right_motor_measured_speed.py:

#!/usr/bin/env python
import rospy
from std_msgs.msg import Int16
from std_msgs.msg import Float32

# This node "right_motor_speed_node" takes in the time and count for the right
# encoder, then finds the linear velocity.
motor_time = 1
motor_count = 1
speed = 1

# function: right_motor_time
# purpose:
# 1) read in the encoder time
# 2) save the time to a variable
def right_motor_time(data):
    # reading and converting the value
    global motor_time
    motor_time = data.data

# function: right_motor_count
# purpose:
# 1) read in the encoder count
# 2) save the count to a variable
def right_motor_count(data):
    # reading and converting the value
    global motor_count
    motor_count = data.data

# function: start
# purpose:
# 1) initialize variables
# 2) initialize node
# 3) initialize publisher
# 4) initialize subscribers
# 5) compute speed using counts and time
# 6) publish speed
def start():
    global motor_time
    global motor_count
    global speed_pub
    global count_sub
    global time_sub
    global speed

    # starting the publishing node
    speed_pub = rospy.Publisher('right_motor_speed', Float32, queue_size=10)

    # starting the node
    rospy.init_node('right_motor_speed_node')

    # subscribing to the right motor counts
    count_sub = rospy.Subscriber('right_encoder_count', Int16, right_motor_count)

    # subscribing to the right motor time
    time_sub = rospy.Subscriber('right_encoder_time', Int16, right_motor_time)

    # MUST DO THIS FOR SUBSCRIBER PUBLISHER PROGRAMS TO UPDATE MORE THAN ONCE
    # DO NOT USE SPIN
    # setting the publishing rate
    r = rospy.Rate(10)

    while not rospy.is_shutdown():
        # calculating the speed from counts, wheel circumference, and elapsed time
        speed = (motor_count * 3.54 * 3.14159 * 1000000)/(1200.0 * motor_time)
        # publishing the speed
        speed_pub.publish(speed)
        r.sleep()

if __name__ == '__main__':
    start()

camera_centroid_and_radius_2.py:

#!/usr/bin/env python
import rospy
import cv2
import numpy as np
import time
from math import degrees
from math import atan2
from math import fabs
from std_msgs.msg import Int16
from std_msgs.msg import Int32
from std_msgs.msg import Float32
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError
import sys
import roslib
roslib.load_manifest('redfish')
sys.path.append('/usr/local/lib/python2.7/site-packages')

def nothing(x):
    pass

# Initializes everything and publishes the tracking measurements
def start():
    # the camera viewer command is guvcview
    # setting up the capture device where the PS3 Eye camera is attached
    cap = cv2.VideoCapture(0)
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH,320)
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT,240)
    cap.set(cv2.cv.CV_CAP_PROP_FPS,30)

    # initializing color values
    # brightness
    b = 0.0/256.0
    # contrast
    c = 35.0/256.0
    # saturation
    s = 65.0/256.0
    # hue
    h = 120.0/180.0
    # gain
    g = 15.0/64.0
    # gaussian blur value
    blur_val = 1

    # setting the camera parameters
    cap.set(cv2.cv.CV_CAP_PROP_BRIGHTNESS,b)
    cap.set(cv2.cv.CV_CAP_PROP_CONTRAST,c)
    cap.set(cv2.cv.CV_CAP_PROP_SATURATION,s)
    cap.set(cv2.cv.CV_CAP_PROP_HUE,h)
    cap.set(cv2.cv.CV_CAP_PROP_GAIN,g)

    # initializing pink threshold values
    # upper threshold
    upper_p_hue = 179
    upper_p_sat = 255
    upper_p_val = 255
    # lower threshold
    lower_p_hue = 136
    lower_p_sat = 60
    lower_p_val = 0

    # arranging the threshold values into arrays
    upper_p_thresh = np.array([upper_p_hue,upper_p_sat,upper_p_val])
    lower_p_thresh = np.array([lower_p_hue,lower_p_sat,lower_p_val])

    # kernel for the eroding and dilating routines
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE,(5,5))

    # center of the frame in pixels
    cx = 319.5/2.0
    cy = 239.5/2.0
    # focal length in pixels
    f = (5.5933430628532710e+02)/2.0
    # actual diameter of the cylinder
    p_w = 9*25.4
    # actual height of the cylinder
    p_h = 9*25.4
    # area of a contour
    p_area = 0
    # values for the minimum enclosing rectangle
    p_fit_x = 0
    p_fit_y = 0
    p_fit_w = 0
    p_fit_h = 0
    p_fit_a = 0
    # error for the pan angle between object center and center of frame
    p_thxe = 0
    # error for the tilt angle between object center and center of frame
    p_thye = 0
    # distance as seen from the width triangle
    p_zw = 0
    # distance as seen from the height triangle
    p_zh = 0
    # distance average
    p_z = 0
    # distance difference between the two
    p_z_diff = 0

    # time stamp values to get the loop time
    t1 = 0
    t2 = 0
    time_diff = 0

    ### initializing the publishers ###
    thx_pub = rospy.Publisher('pan_servo_error', Float32, queue_size=1)
    thy_pub = rospy.Publisher('tilt_servo_error', Float32, queue_size=1)
    z_pub = rospy.Publisher('distance', Float32, queue_size=1)
    area_pub = rospy.Publisher('area', Int32, queue_size=1)
    image_pub = rospy.Publisher('image_out', Image, queue_size=1)

    ### starting the node ###
    rospy.init_node('object_position')
    print "initialized nodes"
    bridge = CvBridge()
    loop_rate = 10
    rate_index = 0
    r = rospy.Rate(loop_rate)
    while not rospy.is_shutdown():
        # reading in the image
        is_read, read = cap.read()
        t1 = t2
        t2 = time.time()
        time_diff = t2 - t1
        # print the frames per second
        print "\nfps = %r" % (1.0/time_diff)
        if is_read:
            print "read successful"

            # making the blurring value odd
            val = blur_val * 2 + 1

            # smoothing the read image with a low pass Gaussian filter to remove noise
            image = cv2.GaussianBlur(read,(val,val),0)

            # converting the BGR image to HSV to enable easy thresholding of a specific shade
            hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

            # masking the HSV image to keep only certain shades using the threshold values
            mask_p_before = cv2.inRange(hsv, lower_p_thresh, upper_p_thresh)

            # opening the image to filter out noise; opening is eroding then dilating
            mask_p = cv2.morphologyEx(mask_p_before, cv2.MORPH_OPEN, kernel)

            # masking the image to write out
            filtered_p_imageout = cv2.bitwise_and(image,image,mask = mask_p)

            # getting the contours in the masked image
            contours_p, hierarchy_p = cv2.findContours(mask_p,cv2.RETR_LIST,cv2.CHAIN_APPROX_NONE)

            # getting the moments of the contours if the contour array has values
            if len(contours_p) > 0:
                print "entered contours"
                # creating a temporary array the length of the contours
                temp_areas_p = np.ones(len(contours_p))
                i = 0
                # getting all the areas
                while i < len(temp_areas_p):
                    # getting the area of the contour
                    temp_areas_p[i] = cv2.contourArea(contours_p[i])
                    i = i + 1
                i = 0
                # getting the maximum area that has an aspect ratio close to 1
                while i < len(temp_areas_p):
                    # checking if the current value is the max
                    if temp_areas_p[i] == max(temp_areas_p):
                        (t_x,t_y),(t_w,t_h),t_a = cv2.minAreaRect(contours_p[i])
                        # guarding against division by zero
                        if t_h <= 0:
                            t_h = 0.0001
                        t_aspect = t_w/t_h
                        # if the max-area contour has an aspect ratio close to 1 keep it,
                        # otherwise delete it from the area and contour arrays and start over
                        if (t_aspect>0.40) and (t_aspect<1.60):
                            break
                        else:
                            contours_p.pop(i)
                            temp_areas_p = np.delete(temp_areas_p,i,None)
                            i = 0
                    else:
                        i = i + 1
                    if len(temp_areas_p) <= 0:
                        break

                if len(temp_areas_p) > 0:
                    # getting the bounding rectangle of the contour
                    pr_fit_x,pr_fit_y,pr_fit_w,pr_fit_h = cv2.boundingRect(contours_p[i])

                    # getting the minimum enclosing rectangle
                    # the return is: center = (cx,cy), sides = (width,height), angle
                    (p_fit_x,p_fit_y),(p_fit_w,p_fit_h),p_fit_a = cv2.minAreaRect(contours_p[i])

                    p_cx = int(p_fit_x)
                    p_cy = int(p_fit_y)

                    # drawing the bounding rectangle in red
                    cv2.rectangle(filtered_p_imageout,(pr_fit_x,pr_fit_y),(pr_fit_x+pr_fit_w,pr_fit_y+pr_fit_h),(0,0,255),2)
                    # drawing the center of the minimum rectangle in blue
                    cv2.circle(filtered_p_imageout,(p_cx,p_cy),2,(255,0,0),2)
                    # updating the pan angle error
                    p_thxe = -1.0*degrees(atan2((p_cx - cx),f))
                    # updating the tilt angle error
                    p_thye = degrees(atan2((p_cy - cy),f))
                    # updating the distance based on the width
                    p_zw = p_w*f/p_fit_w
                    # updating the distance based on the height
                    p_zh = p_h*f/p_fit_h
                    # updating the average
                    p_z = (p_zw+p_zh)/2.0
                    # updating the difference
                    p_z_diff = p_zh - p_zw
                    # updating the area
                    p_area = temp_areas_p[i]
                    print "zw = %r\n\tzh = %r\n\t\tz = %r\n\t\t\tzdiff = %r" % (p_zw,p_zh,p_z,p_z_diff)
                    # publishing the x angle error
                    thx_pub.publish(p_thxe)
                    # publishing the y angle error
                    thy_pub.publish(p_thye)
                    # publishing the z distance
                    z_pub.publish(p_z)
                    # publishing the area
                    area_pub.publish(p_area)
                    # drawing a large white dot at the center of the image to show the visual error
                    cv2.circle(filtered_p_imageout,(160,120),2,(255,255,255),3)
                    # publishing the image if the index is at the final value for the loop rate
                    if (rate_index > 2*loop_rate-1):
                        #image_pub.publish(bridge.cv2_to_imgmsg(filtered_p_imageout,encoding="passthrough"))
                        rate_index = 0
                    else:
                        rate_index = rate_index + 1

            # displaying the intermediate images (disabled)
            #cv2.imshow('regular image',image)
            #cv2.imshow('pink mask out',filtered_p_imageout)

        if cv2.waitKey(1) & 0xFF == 27:
            break
        r.sleep()
    cv2.destroyAllWindows()

if __name__ == '__main__':
    start()

41 servo_tracking_1.py: #!/usr/bin/env python import rospy import math import time from math import ceil import numpy as np from std_msgs.msg import Int16 from std_msgs.msg import Int32 from std_msgs.msg import Float32

# This ROS Node takes in the radius and centroid of the ball and produces pan and tilt # motor commands. The feedback control will be a proportional controller that generates # commands in a location in degrees

#### X WILL BE THE LEFT RIGHT PAN VALUE AND Y WILL BE THE UP DOWN TILT VALUE

# the measured pan angle thxa = 0 # the measured tilt angle thya = 0 # the error in pan angle thxe = 0.0 # the error in tilt angle thye = 0.0 # the desired angle for the pan thxd = 0.0 # the desired angle for the tilt thyd = 0.0 # area value area_val = 0

# function: get_pan_angle # purpose: gets the current pan servo angle topic "pan_servo_curent" # 1) read in the pan angle def get_pan_angle(data): global thxa # getting the data thxa = data.data # print "pan angle = %r" % thxa

# function: get_tilt_angle # purpose: gets the current tilt servo angle topic "tilt_servo_curent" # 1) read in the tilt angle def get_tilt_angle(data):

42

global thya # getting the data thya = data.data

# function: get_pan_error # purpose: gets the current pan servo angle error topic "pan_servo_error" # 1) read in the pan angle error def get_pan_error(data): global thxe # getting the data thxe = data.data

# function: get_tilt_error # purpose: gets the current tilt servo angle error topic "tilt_servo_error" # 1) read in the tilt angle error def get_tilt_error(data): global thye # getting the data thye = data.data

# function: get_area # purpose: gets the current area # 1) read in the area def get_area(data): global area_val # getting the data area_val = data.data

# Intializes everything def start(): ### declaring all the sensor global variables ### global thxa global thya global thxe global thye global thxd global thyd global area_val global pan_sub global tilt_sub global pan_er_sub global tilt_er_sub global area_sub

# area threshold for object to be tracked

43 area_thresh = 200

# previous angle for pan if image lost previous_pan = 1000

### initializing the publishers for the two servos ### pan_pub = rospy.Publisher('pan_servo_desired', Int16, queue_size=1) tilt_pub = rospy.Publisher('tilt_servo_desired', Int16, queue_size=1)

### starting the node rospy.init_node('pan_and_tilt')

### getting the values from the other programs. Start all the other topics before the joystick ### ### so it will use the most recent command to updat the desired velocity ### # subscribing to the IR and Ultra topics left_error, right_error, front_error, front_ultra_error pan_sub = rospy.Subscriber("pan_servo_current", Int16, get_pan_angle) tilt_sub = rospy.Subscriber("tilt_servo_current", Int16, get_tilt_angle) pan_er_sub = rospy.Subscriber("pan_servo_error", Float32, get_pan_error) tilt_er_sub = rospy.Subscriber("tilt_servo_error", Float32, get_tilt_error) area_sub = rospy.Subscriber("area", Int32, get_area)

# creating some arrays for the error terms errory = np.ones((1,2)) errory_derivative = np.ones((1,3)) errory_integral = np.ones((1,3)) errorx = np.ones((1,2)) errorx_derivative = np.ones((1,3)) errorx_integral = np.ones((1,3))

# time difference t1 = 0 t2 = 0

# pid gains kp = 0.15 kd = 0 ki = 0 r = rospy.Rate(10) while not rospy.is_shutdown(): previous_pan = thxd t1 = t2 t2 = time.time() time_diff = t2- t1 print "fps: %r" % (1.0/time_diff)

44

# checking if the area is greater than the threshold # if it is use the desired value # otherwise go center frame print "area = %r" % (area_val) if area_val > area_thresh: previous_meany = np.mean(errory,axis=1) previous_meanx = np.mean(errorx,axis=1) # shifting all the arrays one value to the left # if it is less than the length-1 of the arrays will update for index_of in xrange(0,errory.shape[1]-1): errory[0,index_of] = errory[0,index_of+1] errorx[0,index_of] = errorx[0,index_of+1] for index_of in xrange(0,errory_derivative.shape[1]-1): errory_derivative[0,index_of] = errory_derivative[0,index_of+1] errorx_derivative[0,index_of] = errorx_derivative[0,index_of+1] for index_of in xrange(0,errory_integral.shape[1]-1): errory_integral[0,index_of] = errory[0,index_of+1] errorx_integral[0,index_of] = errorx[0,index_of+1] # updating the last value for the error errory[0,errory.shape[1]-1] = thye errorx[0,errorx.shape[1]-1] = thxe # updating the last value for the derivative errory_derivative[0,errory_derivative.shape[1]-1] = (np.mean(errory,axis=1)- previous_meany)/time_diff errorx_derivative[0,errorx_derivative.shape[1]-1] = (np.mean(errorx,axis=1)- previous_meanx)/time_diff # updating the last value for the integral errory_integral[0,errory_integral.shape[1]-1] = thye errorx_integral[0,errory_integral.shape[1]-1] = thxe

            # updating the desired tilt angle
            thyd = kp*np.mean(errory,axis=1) + kd*np.mean(errory_derivative,axis=1) + ki*np.mean(np.trapz(errory_integral,dx=time_diff,axis=1))
            # updating the desired pan angle
            thxd = kp*np.mean(errorx,axis=1) + kd*np.mean(errorx_derivative,axis=1) + ki*np.mean(np.trapz(errorx_integral,dx=time_diff,axis=1))
            print "kp: %r\n\tkd: %r\n\t\tki: %r" % (kp*np.mean(errory,axis=1), kd*np.mean(errory_derivative,axis=1), ki*np.mean(np.trapz(errory_integral,dx=time_diff,axis=1)))
        else:
            print "nope"
            thxd = previous_pan
            thyd = 1000

        print "pan actual = %r\n\tpan error = %r\n\t\tpan desired = %r" % (thxa, thxe, thxd+thxa)
        print "tilt actual = %r\n\ttilt error = %r\n\t\ttilt desired = %r" % (thya, thye, thyd+thya)


        # publishing the desired angles
        pan_pub.publish(int(thxd))
        tilt_pub.publish(int(thyd))

        # delay
        r.sleep()

if __name__ == '__main__':
    start()

tele_motor_redbot_1.py:

#!/usr/bin/env python
import rospy
import time
import numpy as np
from random import random
from std_msgs.msg import Int16
from std_msgs.msg import UInt16
from std_msgs.msg import Float32

# This ROS node converts joystick inputs from the joy node
# into velocity commands for the motors. If the "A" button is pressed it runs autonomously;
# otherwise it uses the commands from the thumbsticks.

# declaring the global variables that hold values to be passed between functions
left_ir_error_cmd = 0
right_ir_error_cmd = 0
front_ir_error_cmd = 0
left_motor_speed_desired = 0
right_motor_speed_desired = 0
left_motor_speed_act = 0
right_motor_speed_act = 0
left_motor_effort = 0
right_motor_effort = 0
left_thumb_value = 0
right_thumb_value = 0
a_button_value = 0
right_bumper_value = 0
is_blocked = False
dice_roll = 0
lirs = False
rirs = False
firs = False
rturn = False
lturn = False
t1 = 0
t2 = 0
time_diff = 0

# creating some arrays for the error terms

error_left = np.ones((1,2))
error_left_derivative = np.ones((1,3))
error_left_integral = np.ones((1,3))
error_right = np.ones((1,2))
error_right_derivative = np.ones((1,3))
error_right_integral = np.ones((1,3))

# function: desired_velocity_
# purpose:
# 1) using the values being updated by the button topics and the IR topics, calculates what
#    the desired velocity should be. Max velocity is set to 12 in/sec for now
def desired_velocity_():
    global left_ir_error_cmd
    global right_ir_error_cmd
    global front_ir_error_cmd

    global left_motor_speed_desired
    global right_motor_speed_desired

    global left_thumb_value
    global right_thumb_value
    global a_button_value
    global right_bumper_value

    global is_blocked
    global dice_roll
    global lirs
    global rirs
    global firs
    global rturn
    global lturn

    ### teleop and autonomous velocity stuff ###
    # max desired velocity when using teleop mode in inches/sec
    v_teleop_max = 12
    # max desired velocity when in autonomous mode
    v_auto_max = 0
    # the front velocity error gain in autonomous mode
    KF = v_auto_max
    # turn gain
    KTG = 0.4
    # the left and right velocity error gains in autonomous mode
    KL = v_auto_max * KTG
    KR = KL


    ### thumbstick values ###
    # the left thumb value velocity is the up or down value read from the controller
    ltv = v_teleop_max * left_thumb_value
    # the right thumb value velocity is the left or right value. Negative because the joy command negates
    rtv = (v_teleop_max*KTG) * right_thumb_value
    # the A button value determines whether to be in autonomous or teleop mode
    abv = a_button_value
    # the right bumper value tells whether it is desired to drive the bot around
    bv = right_bumper_value

    ### temporary values and minimum velocity values
    # minimum velocity in in/sec to turn on, otherwise send 0 in/sec
    mv = 2
    # left motor desired velocity temp
    lmv = 0
    # right motor desired velocity temp
    rmv = 0

    ### checking the six possible thumbstick orientations and if A is pressed or not ###
    # left thumb up
    ltu = ltv > mv
    # left thumb down
    ltd = ltv < -mv
    # right thumb right
    rtr = rtv > mv
    # right thumb left
    rtl = rtv < -mv
    # left thumb off
    lto = (not ltu) and (not ltd)
    # right thumb off
    rto = (not rtr) and (not rtl)
    # A button pressed
    abp = abv > 0
    # bumper pressed
    bp = bv > 0

    ### if any of the commands are above the threshold value, enter the blocked procedure
    if (abp) and (not is_blocked):
        print "entered not is_blocked"
        ### checking if the ir commands are almost saturated for the left, right, and front IR values
        # left ir saturated
        lirs = left_ir_error_cmd < -0.99
        # right ir saturated
        rirs = right_ir_error_cmd < -0.99


        # front ir saturated
        firs = front_ir_error_cmd < -0.99
        ### checking to see if any of the conditions for a problem are detected:
        # checking if all three are saturated
        allsat = lirs and rirs and firs
        # checking if the front is saturated but the sides are not saturated
        frsat = firs and not (lirs and rirs)
        # checking if the sides are saturated but not the front
        sisat = (not firs) and (lirs and rirs)
        # if any of the conditions are true, enter the blocked procedure and roll the dice to
        # determine the direction to turn out of the block
        if allsat or frsat or sisat:
            print "\tentered the is_blocked change spot"
            is_blocked = True
            # rolling the dice to determine which way to turn: if greater than 50 turn right,
            # otherwise turn left
            dice_roll = random() * 100
            print "\t\tdice %r" % dice_roll
    # not using an elif because this should happen immediately if the above is true
    # if the A button is pressed and the robot is blocked, check whether either of the
    # two turn conditions is true. If neither is true, set the blocked boolean back to False
    if abp and is_blocked:
        print "\t\t\tentered the next stage"
        ### checking if the ir commands are almost saturated for the left, right, and front IR values
        # left ir saturated
        lirs = left_ir_error_cmd < -0.75
        # right ir saturated
        rirs = right_ir_error_cmd < -0.75
        # front ir saturated
        firs = front_ir_error_cmd < -0.75

        # checking the two possible block conditions
        if (rirs or firs) and (dice_roll > 50):
            print "\t\t\t\tentered the right turn"
            rturn = True
            lturn = False
        elif (lirs or firs) and (dice_roll <= 50):
            print "\t\t\t\tentered the left turn"
            lturn = True
            rturn = False
        else:
            print "\t\t\t\tentered the make false"
            rturn = False
            lturn = False
            is_blocked = False
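The dice-roll escape logic above can be condensed into a small pure function that is easy to test off the robot; the helper name `choose_escape` and its string return values are illustrative, not part of the report's code:

```python
from random import random

def choose_escape(lirs, rirs, firs, dice_roll):
    """Mirror of the turn-selection logic: a roll above 50 favors a right
    turn, 50 or below favors a left turn; None means clear the block."""
    if (rirs or firs) and (dice_roll > 50):
        return "right"
    elif (lirs or firs) and (dice_roll <= 50):
        return "left"
    # neither condition held, so the blocked flag would be reset
    return None

# example: a wall dead ahead with a high roll escapes to the right
direction = choose_escape(False, False, True, random() * 100)
```

Because the front-saturated flag appears in both branches, a frontal block always produces a turn, while a purely one-sided block can fall through and clear the blocked state.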


    # if the A button is pressed go into autonomous mode to determine the desired
    # velocity, otherwise use teleop mode to determine the desired velocity
    if abp:
        # if the robot is blocked and a right turn was chosen then turn right at max turn
        # if the robot is blocked and a left turn was chosen then turn left at max turn
        # otherwise operate in normal autonomous mode
        if is_blocked and rturn:
            lmv = KL * 2.0 / 1.25
            rmv = -1.0 * KR * 2.0 / 1.25
        elif is_blocked and lturn:
            lmv = -1.0 * KL * 2.0 / 1.25
            rmv = KR * 2.0 / 1.25
        else:
            # calculate the desired velocity values
            lmv = v_auto_max - KL*left_ir_error_cmd + KR*right_ir_error_cmd + KF*front_ir_error_cmd
            rmv = v_auto_max + KL*left_ir_error_cmd - KR*right_ir_error_cmd + KF*front_ir_error_cmd

        if (abs(lmv) < mv) and (abs(rmv) < mv):
            lmv = 0
            rmv = 0
    elif bp:
        # disable the block when the A button is released
        is_blocked = False
        # go forward straight: increase both positively
        # go backward straight: increase both negatively
        if (ltu and rto) or (ltd and rto):
            lmv = ltv
            rmv = ltv
        # go forward and right: increase left motor positively and decrease right motor positively
        # go forward and left: increase right motor positively and decrease left motor positively
        elif (ltu and rtr) or (ltu and rtl):
            lmv = ltv + rtv
            rmv = ltv - rtv
        # go backward and left: increase right motor negatively and decrease left motor negatively
        # go backward and right: decrease right motor negatively and increase left motor negatively
        elif (ltd and rtl) or (ltd and rtr):
            lmv = ltv - rtv
            rmv = ltv + rtv
        # rotate right in place: increase left motor positively and increase right motor negatively


        # rotate left in place: increase left motor negatively and increase right motor positively
        elif (lto and rtr) or (lto and rtl):
            lmv = rtv * 2.5
            rmv = -rtv * 2.5
        else:
            lmv = 0
            rmv = 0
    else:
        # disable the block when the A button is released
        is_blocked = False
        lmv = 0
        rmv = 0
    left_motor_speed_desired = lmv
    right_motor_speed_desired = rmv
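The teleop branch above is a standard differential-drive mixer: forward speed from the left stick, turn rate from the right. A self-contained sketch of the same mixing (the helper name `mix` is illustrative; thresholds match the report's `mv = 2` and in-place turn scale of 2.5) is:

```python
def mix(ltv, rtv, mv=2.0):
    """Differential-drive mixing: ltv is forward speed, rtv is turn rate,
    both in in/sec. Returns (left_motor, right_motor) desired velocities."""
    ltu, ltd = ltv > mv, ltv < -mv          # left thumb up / down
    rtr, rtl = rtv > mv, rtv < -mv          # right thumb right / left
    lto, rto = not (ltu or ltd), not (rtr or rtl)
    if (ltu or ltd) and rto:                # straight forward or backward
        return ltv, ltv
    if ltu and (rtr or rtl):                # forward arc
        return ltv + rtv, ltv - rtv
    if ltd and (rtr or rtl):                # backward arc
        return ltv - rtv, ltv + rtv
    if lto and (rtr or rtl):                # rotate in place
        return rtv * 2.5, -rtv * 2.5
    return 0.0, 0.0                         # sticks inside the dead band

# full forward with a gentle right turn
left, right = mix(12, 4)
```

The dead band (`mv`) keeps stick noise from creeping the motors, and the 2.5 factor makes in-place rotation responsive even though the turn channel is scaled down by the turn gain.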

# function: closed_loop_command
# purpose:
# 1) the loop is closed using the calculated linear velocity published on the topics
#    "right_motor_speed" and "left_motor_speed".
# 2) the error signal passes through the controller, currently dominated by the
#    proportional term with gain KCP
def closed_loop_command():
    # global variables
    global left_motor_speed_desired
    global right_motor_speed_desired
    global left_motor_speed_act
    global right_motor_speed_act
    global left_motor_effort
    global right_motor_effort

    global time_diff

    global error_left
    global error_left_derivative
    global error_left_integral
    global error_right
    global error_right_derivative
    global error_right_integral

    # max output to scale the command over
    max_output = 255

    # controller gains: use a KCP of 0.015 for now
    KCP = 0.015
    KCD = 0.000
    KCI = 0.001


    # getting the previous mean for the derivative calc
    previous_mean_right = np.mean(error_right)
    previous_mean_left = np.mean(error_left)

    # temporary variables for the error
    left_motor_error = left_motor_speed_desired - left_motor_speed_act
    right_motor_error = right_motor_speed_desired - right_motor_speed_act

    # shifting all the arrays one value to the left
    # the error arrays are shorter, so they only update while the index is in bounds
    for index_of in xrange(0, error_left_integral.shape[1]-1):
        if index_of < error_left.shape[1]-1:
            error_left[0,index_of] = error_left[0,index_of+1]
            error_right[0,index_of] = error_right[0,index_of+1]
        error_left_derivative[0,index_of] = error_left_derivative[0,index_of+1]
        error_right_derivative[0,index_of] = error_right_derivative[0,index_of+1]
        error_left_integral[0,index_of] = error_left_integral[0,index_of+1]
        error_right_integral[0,index_of] = error_right_integral[0,index_of+1]

    # updating the last value for the error
    error_left[0,error_left.shape[1]-1] = left_motor_error
    error_right[0,error_right.shape[1]-1] = right_motor_error

    # updating the last value for the derivative
    error_left_derivative[0,error_left_derivative.shape[1]-1] = (np.mean(error_left) - previous_mean_left)/time_diff
    error_right_derivative[0,error_right_derivative.shape[1]-1] = (np.mean(error_right) - previous_mean_right)/time_diff

    # updating the last value for the integral
    error_left_integral[0,error_left_integral.shape[1]-1] = left_motor_error
    error_right_integral[0,error_right_integral.shape[1]-1] = right_motor_error

    # updating the desired left and right commands
    left_motor_effort = KCP*np.mean(error_left) + KCD*np.mean(error_left_derivative) + KCI*np.mean(np.trapz(error_left_integral,dx=time_diff))
    right_motor_effort = KCP*np.mean(error_right) + KCD*np.mean(error_right_derivative) + KCI*np.mean(np.trapz(error_right_integral,dx=time_diff))

    # scaling over the available output range
    left_motor_effort = left_motor_effort * max_output
    right_motor_effort = right_motor_effort * max_output


    #### checking the cases for the effort
    # left motor saturated positively
    lmsp = left_motor_effort > max_output

    # left motor saturated negatively
    lmsn = left_motor_effort < (-1 * max_output)

    # right motor saturated positively
    rmsp = right_motor_effort > max_output

    # right motor saturated negatively
    rmsn = right_motor_effort < (-1 * max_output)

    # checking if the left motor is saturated and coercing it if it is
    if lmsp:
        left_motor_effort = max_output
    elif lmsn:
        left_motor_effort = (-1 * max_output)

    # checking if the right motor is saturated and coercing it if it is
    if rmsp:
        right_motor_effort = max_output
    elif rmsn:
        right_motor_effort = (-1 * max_output)
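The velocity controller above averages a short history of errors before applying the PID gains, then scales and clamps the result to the 8-bit PWM range. A stripped-down sketch of that scheme (the helper name `pid_effort` is illustrative, and the integral uses a simple rectangle rule in place of the report's `np.trapz`):

```python
import numpy as np

def pid_effort(errors, derivs, integrals, dt,
               kp=0.015, kd=0.0, ki=0.001, max_output=255):
    """PID on averaged error histories, scaled and clamped to +/-max_output,
    mirroring the structure of closed_loop_command."""
    effort = (kp * np.mean(errors)
              + kd * np.mean(derivs)
              + ki * np.sum(integrals) * dt)  # rectangle-rule integral
    effort *= max_output
    # coerce to the available PWM range
    return max(-max_output, min(max_output, effort))

# constant 1 in/sec error with flat derivative and integral histories
effort = pid_effort(np.ones(2), np.zeros(3), np.zeros(3), dt=0.1)
```

Averaging a two-sample error history acts as a crude low-pass filter on the noisy encoder-derived velocity, at the cost of a small amount of phase lag in the loop.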

# function: right_motor_speed_read
# purpose:
# 1) read "right_motor_speed" topic
def right_motor_speed_read(data):
    # reading and converting the value
    global right_motor_speed_act
    right_motor_speed_act = data.data

# function: left_motor_speed_read
# purpose:
# 1) read "left_motor_speed" topic
def left_motor_speed_read(data):
    # reading and converting the value
    global left_motor_speed_act
    left_motor_speed_act = data.data


# function: front_ir_read
# purpose:
# 1) read "front_error" topic
def front_ir_read(data):
    # reading and converting the value
    global front_ir_error_cmd
    front_ir_error_cmd = data.data

# function: left_ir_read
# purpose:
# 1) read "left_error" topic
def left_ir_read(data):
    # reading and converting the value
    global left_ir_error_cmd
    left_ir_error_cmd = data.data

# function: right_ir_read
# purpose:
# 1) read "right_error" topic
def right_ir_read(data):
    # reading and converting the value
    global right_ir_error_cmd
    right_ir_error_cmd = data.data

# function: right_thumb_read
# purpose:
# 1) read "right_thumb" topic
def right_thumb_read(data):
    global right_thumb_value
    right_thumb_value = data.data

# function: left_thumb_read
# purpose:
# 1) read "left_thumb" topic
def left_thumb_read(data):
    global left_thumb_value
    left_thumb_value = data.data

# function: right_bumper_read
# purpose:
# 1) read "right_bumper" topic
def right_bumper_read(data):
    global right_bumper_value
    right_bumper_value = data.data


# function: a_button_read
# purpose:
# 1) read "a_button" topic
def a_button_read(data):
    global a_button_value
    a_button_value = data.data

# initializes everything
def start():
    ### declaring all the sensor global variables ###
    global left_ir_error_cmd
    global right_ir_error_cmd
    global front_ir_error_cmd
    global left_motor_speed_desired
    global right_motor_speed_desired
    global left_motor_speed_act
    global right_motor_speed_act
    global left_motor_effort
    global right_motor_effort
    global left_thumb_value
    global right_thumb_value
    global a_button_value
    global right_bumper_value
    global is_blocked
    ### declaring all the publishers and subscribers ###
    global left_motor_pub
    global right_motor_pub

    global left_motor_desired_pub
    global right_motor_desired_pub

    global left_thumb_sub
    global right_thumb_sub
    global a_button_sub
    global right_bumper_sub

    global blocked_pub
    global left_ir_sub
    global right_ir_sub
    global front_ir_sub

    global left_motor_sub
    global right_motor_sub

    global time_diff
    global t1

    global t2

    ### initializing the publishers for the two motors ###
    left_motor_pub = rospy.Publisher('left_motor_effort', Int16, queue_size=1)
    right_motor_pub = rospy.Publisher('right_motor_effort', Int16, queue_size=1)
    left_motor_desired_pub = rospy.Publisher('left_motor_desired', Int16, queue_size=1)
    right_motor_desired_pub = rospy.Publisher('right_motor_desired', Int16, queue_size=1)
    blocked_pub = rospy.Publisher('blocked', Int16, queue_size=1)
    ### starting the node
    rospy.init_node('motor_commands')

    ### getting the values from the other programs. Start all the other topics before the joystick ###
    ### so it will use the most recent command to update the desired velocity ###
    # subscribing to the IR topics left_error, right_error, front_error
    left_ir_sub = rospy.Subscriber("left_error", Float32, left_ir_read)
    right_ir_sub = rospy.Subscriber("right_error", Float32, right_ir_read)
    front_ir_sub = rospy.Subscriber("front_error", Float32, front_ir_read)
    # subscribing to the motor speeds
    left_motor_sub = rospy.Subscriber("left_motor_speed", Float32, left_motor_speed_read)
    right_motor_sub = rospy.Subscriber("right_motor_speed", Float32, right_motor_speed_read)
    # subscribing to button topics
    left_thumb_sub = rospy.Subscriber("left_thumb", Float32, left_thumb_read)
    right_thumb_sub = rospy.Subscriber("right_thumb", Float32, right_thumb_read)
    right_bumper_sub = rospy.Subscriber("right_bumper", Int16, right_bumper_read)
    a_button_sub = rospy.Subscriber("a_button", Int16, a_button_read)
    r = rospy.Rate(10)
    while not rospy.is_shutdown():
        t1 = t2
        t2 = time.time()
        time_diff = t2 - t1
        print "loop time: %r" % (1.0/time_diff)
        # getting the desired velocity
        desired_velocity_()
        # computing the closed-loop command
        closed_loop_command()
        # publishing the desired velocities
        left_motor_desired_pub.publish(left_motor_speed_desired)
        right_motor_desired_pub.publish(right_motor_speed_desired)
        # publishing the desired efforts
        left_motor_pub.publish(left_motor_effort)
        right_motor_pub.publish(right_motor_effort)
        print "is_blocked %r" % is_blocked
        blocked_pub.publish(is_blocked)
        r.sleep()


if __name__ == '__main__':
    start()

red_bot_v1.ino:

/*
 * red bot low level drivers
 */

#include <ros.h>
#include <Servo.h>
#include <std_msgs/Int16.h>
#include <std_msgs/UInt16.h>

void pan_servo_pos(const std_msgs::Int16& set_pan_servo);
void tilt_servo_pos(const std_msgs::Int16& set_tilt_servo);
void lm_effort(const std_msgs::Int16& left_motor_effort);   //sets left motor effort
void rm_effort(const std_msgs::Int16& right_motor_effort);  //sets right motor effort
void lm_increment(void);   //increments the left motor encoder value
void lm_start_read(void);  //attaches the interrupts and starts the left motor's encoder timer
void lm_end_read(void);    //detaches the interrupts and stops the left motor's encoder timer
void rm_increment(void);   //increments the right motor encoder value
void rm_start_read(void);  //attaches the interrupts and starts the right motor's encoder timer
void rm_end_read(void);    //detaches the interrupts and stops the right motor's encoder timer

//node handle for the sketch
ros::NodeHandle nh;

/**** VARIABLE FOR THE TIMER ****/
const int timer_pin = 2;

/**** VARIABLES FOR THE LEFT MOTOR ****/
/* PIN NUMBERS */
//enable pin for the left motor
const int lm_enable_pin = 53;
//IN A pin for the left motor
const int lm_ina_pin = 52;
//IN B pin for the left motor
const int lm_inb_pin = 51;
//PWM pin for the left motor
const int lm_pwm_pin = 12;
//CS pin for the left motor
const int lm_cs_pin = 12;
//pin for the left motor encoder A output
const int lm_enca_pin = 20;
//pin for the left motor encoder B output
const int lm_encb_pin = 21;


//interrupt number for the left motor encoder A output
const int lm_inta_pin = 3;
//interrupt number for the left motor encoder B output
const int lm_intb_pin = 2;
//encoder counts for the left motor doing full quadrature
volatile int lm_counts = 0;
//time for the left encoder sampling to start
int lm_start_time = 0;
//time for the left encoder sampling to end
int lm_end_time = 0;
//holds the new state of the left motor encoder
volatile int lm_new_state = 0;
//holds the old state of the left motor encoder
volatile int lm_old_state = 0;
//array to hold the value of the possible counts and a 0 if the transition is bad
volatile int lm_states[16] = {0,-1,1,0,1,0,0,-1,-1,0,0,1,0,1,-1,0};

/**** VARIABLES FOR THE RIGHT MOTOR ****/
/* PIN NUMBERS */
//enable pin for the right motor
const int rm_enable_pin = 50;
//IN A pin for the right motor
const int rm_ina_pin = 49;
//IN B pin for the right motor
const int rm_inb_pin = 48;
//PWM pin for the right motor
const int rm_pwm_pin = 11;
//CS pin for the right motor
const int rm_cs_pin = 11;
//pin for the right motor encoder A output
const int rm_enca_pin = 18;
//pin for the right motor encoder B output
const int rm_encb_pin = 19;
//interrupt number for the right motor encoder A output
const int rm_inta_pin = 5;
//interrupt number for the right motor encoder B output
const int rm_intb_pin = 4;
//encoder counts for the right motor doing full quadrature
volatile int rm_counts = 0;
//time for the right encoder sampling to start
int rm_start_time = 0;
//time for the right encoder sampling to end
int rm_end_time = 0;
//holds the new state of the right motor encoder
volatile int rm_new_state = 0;
//holds the old state of the right motor encoder

volatile int rm_old_state = 0;
//array to hold the value of the possible counts and a 0 if the transition is bad
volatile int rm_states[16] = {0,-1,1,0,1,0,0,-1,-1,0,0,1,0,1,-1,0};

/**** VARIABLES FOR THE FRONT IR SENSOR ****/
//analog pin for the front IR sensor
const int front_ir_pin = 15;
//holds the ir read value
unsigned int front_ir_read = 1000;
//setting up the message to hold the IR sensor distance value
std_msgs::UInt16 front_ir_msg;
ros::Publisher front_ir("front_ir", &front_ir_msg);

/**** VARIABLES FOR THE RIGHT IR SENSOR ****/
//analog pin for the right IR sensor
const int right_ir_pin = 13;
//holds the ir read value
unsigned int right_ir_read = 1000;
//setting up the message to hold the IR sensor distance value
std_msgs::UInt16 right_ir_msg;
ros::Publisher right_ir("right_ir", &right_ir_msg);

/**** VARIABLES FOR THE LEFT IR SENSOR ****/
//analog pin for the left IR sensor
const int left_ir_pin = 14;
//holds the ir read value
unsigned int left_ir_read = 1000;
//setting up the message to hold the IR sensor distance value
std_msgs::UInt16 left_ir_msg;
ros::Publisher left_ir("left_ir", &left_ir_msg);

/**** SETTING UP THE PAN SERVO ****/
//servo
Servo pan_servo;
//setting up the subscriber for the desired pan servo angle; it uses the function
//pan_servo_pos to change the servo position
ros::Subscriber<std_msgs::Int16> pan_servo_desired("pan_servo_desired", pan_servo_pos);
//setting up the message to hold the pan servo current angle
std_msgs::Int16 pan_servo_current_msg;
ros::Publisher pan_servo_current("pan_servo_current", &pan_servo_current_msg);

/**** SETTING UP THE TILT SERVO ****/
//servo
Servo tilt_servo;
//setting up the subscriber for the desired tilt servo angle; it uses the function
//tilt_servo_pos to change the servo position

ros::Subscriber<std_msgs::Int16> tilt_servo_desired("tilt_servo_desired", tilt_servo_pos);
//setting up the message to hold the tilt servo current angle
std_msgs::Int16 tilt_servo_current_msg;
ros::Publisher tilt_servo_current("tilt_servo_current", &tilt_servo_current_msg);

/**** SETTING UP THE LEFT MOTOR ****/
//setting up the subscriber for the desired left motor effort; it uses the function
//lm_effort to change the effort level
ros::Subscriber<std_msgs::Int16> left_motor_effort("left_motor_effort", lm_effort);
//setting up the message to hold the left motor encoder counts for a run
std_msgs::Int16 lm_encoder_count;
ros::Publisher left_encoder_count("left_encoder_count", &lm_encoder_count);
//setting up the message to hold the left motor encoder time value
std_msgs::Int16 lm_encoder_time;
ros::Publisher left_encoder_time("left_encoder_time", &lm_encoder_time);

/**** SETTING UP THE RIGHT MOTOR ****/
//setting up the subscriber for the desired right motor effort; it uses the function
//rm_effort to change the effort level
ros::Subscriber<std_msgs::Int16> right_motor_effort("right_motor_effort", rm_effort);
//setting up the message to hold the right motor encoder counts for a run
std_msgs::Int16 rm_encoder_count;
ros::Publisher right_encoder_count("right_encoder_count", &rm_encoder_count);
//setting up the message to hold the right motor encoder time value
std_msgs::Int16 rm_encoder_time;
ros::Publisher right_encoder_time("right_encoder_time", &rm_encoder_time);

/*******************************************************
function: lm_effort
purpose: sets the effort and direction of the left motor.
         accepts a number between -255 and 255
inputs: signed int
outputs: none
*******************************************************/
void lm_effort(const std_msgs::Int16& lm_effort_level)
{
  //getting the value to be written to the motor
  int effort = lm_effort_level.data;
  // a value greater than 255 means brake to vcc or gnd;
  // otherwise output the value
  if(effort > 255)
  {
    //brake to vcc if 256, otherwise brake to gnd
    if(effort == 256)


    {
      digitalWrite(lm_ina_pin, HIGH);
      digitalWrite(lm_inb_pin, HIGH);
    }
    else
    {
      digitalWrite(lm_ina_pin, LOW);
      digitalWrite(lm_inb_pin, LOW);
    }
  }
  else
  {
    //checking whether to spin the motor clockwise or counterclockwise
    if(effort > 0)
    {
      //setting A low and B high to get counterclockwise spin
      digitalWrite(lm_ina_pin, LOW);
      digitalWrite(lm_inb_pin, HIGH);
      //setting the effort level
      analogWrite(lm_pwm_pin, effort+10);
    }
    else if (effort < 0)
    {
      //making the negative value positive
      effort = abs(effort);
      //setting A high and B low to get clockwise spin
      digitalWrite(lm_ina_pin, HIGH);
      digitalWrite(lm_inb_pin, LOW);
      //setting the effort level
      analogWrite(lm_pwm_pin, effort+10);
    }
    else
    {
      digitalWrite(lm_ina_pin, HIGH);
      digitalWrite(lm_inb_pin, HIGH);
      analogWrite(lm_pwm_pin, 0);
    }
  }
}

/*******************************************************
function: rm_effort
purpose: sets the effort and direction of the right motor.
         accepts a number between -255 and 255
inputs: signed int
outputs: none


*******************************************************/
void rm_effort(const std_msgs::Int16& rm_effort_level)
{
  //getting the value to be written to the motor
  int effort = rm_effort_level.data;
  // a value greater than 255 means brake to vcc or gnd;
  // otherwise output the value
  if(effort > 255)
  {
    //brake to vcc if 256, otherwise brake to gnd
    if(effort == 256)
    {
      digitalWrite(rm_ina_pin, HIGH);
      digitalWrite(rm_inb_pin, HIGH);
    }
    else
    {
      digitalWrite(rm_ina_pin, LOW);
      digitalWrite(rm_inb_pin, LOW);
    }
  }
  else
  {
    //checking whether to spin the motor clockwise or counterclockwise
    if(effort > 0)
    {
      //setting A high and B low to get counterclockwise spin
      digitalWrite(rm_ina_pin, HIGH);
      digitalWrite(rm_inb_pin, LOW);
      //setting the effort level
      analogWrite(rm_pwm_pin, effort+10);
    }
    else if(effort < 0)
    {
      //making the negative value positive
      effort = abs(effort);
      //setting A low and B high to get clockwise spin
      digitalWrite(rm_ina_pin, LOW);
      digitalWrite(rm_inb_pin, HIGH);
      //setting the effort level
      analogWrite(rm_pwm_pin, effort+10);
    }
    else
    {
      digitalWrite(rm_ina_pin, HIGH);
      digitalWrite(rm_inb_pin, HIGH);


      analogWrite(rm_pwm_pin, 0);
    }
  }
}

/*******************************************************
function: lm_start_read
purpose: starts the left motor encoder interrupts and the left motor timer
inputs: none
outputs: none
*******************************************************/
void lm_start_read(void)
{
  //attaching the interrupts and having them call the increment function
  attachInterrupt(lm_inta_pin, lm_increment, CHANGE);
  attachInterrupt(lm_intb_pin, lm_increment, CHANGE);
  //starting the timer for the counts
  lm_start_time = micros();
  //reset the counts
  lm_counts = 0;
}

/*******************************************************
function: lm_end_read
purpose: ends the left motor encoder interrupts and the left motor timer
inputs: none
outputs: none
*******************************************************/
void lm_end_read(void)
{
  //detaching the interrupts
  detachInterrupt(lm_inta_pin);
  detachInterrupt(lm_intb_pin);
  //ending the timer for the counts
  lm_end_time = micros();
}

/*******************************************************
function: lm_increment
purpose: increments the counter for the left motor encoder.
inputs: none
outputs: none
*******************************************************/
void lm_increment()
{
  //setting the old state to the new state


  lm_old_state = lm_new_state;
  //getting the new state
  lm_new_state = digitalRead(lm_enca_pin)*2 + digitalRead(lm_encb_pin);
  //updating the count, subtracting instead of adding because the left motor is mounted backward
  lm_counts = lm_counts - lm_states[(4*lm_old_state) + lm_new_state];
}

/*******************************************************
function: rm_start_read
purpose: starts the right motor encoder interrupts and the right motor timer
inputs: none
outputs: none
*******************************************************/
void rm_start_read(void)
{
  //attaching the interrupts and having them call the increment function
  attachInterrupt(rm_inta_pin, rm_increment, CHANGE);
  attachInterrupt(rm_intb_pin, rm_increment, CHANGE);
  //starting the timer for the counts
  rm_start_time = micros();
  //reset the counts
  rm_counts = 0;
}

/*******************************************************
function: rm_end_read
purpose: ends the right motor encoder interrupts and the right motor timer
inputs: none
outputs: none
*******************************************************/
void rm_end_read(void)
{
  //detaching the interrupts
  detachInterrupt(rm_inta_pin);
  detachInterrupt(rm_intb_pin);
  //ending the timer for the counts
  rm_end_time = micros();
}

/*******************************************************
function: rm_increment
purpose: increments the counter for the right motor encoder.
inputs: none
outputs: none
*******************************************************/
void rm_increment()


{
  //setting the old state to the new state
  rm_old_state = rm_new_state;
  //getting the new state
  rm_new_state = digitalRead(rm_enca_pin)*2 + digitalRead(rm_encb_pin);
  //updating the count by adding
  rm_counts = rm_counts + rm_states[(4*rm_old_state) + rm_new_state];
}
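The 16-entry state table used by both increment routines can be checked off the board. This Python sketch (the helper name `decode` is illustrative) walks the table through one full quadrature cycle in each direction:

```python
# state table from the sketch: index = 4*old_state + new_state,
# where state = A*2 + B; invalid transitions map to 0
STATES = [0, -1, 1, 0, 1, 0, 0, -1, -1, 0, 0, 1, 0, 1, -1, 0]

def decode(transitions):
    """Accumulate encoder counts over a sequence of (A, B) pin readings."""
    counts = 0
    state = 0
    for a, b in transitions:
        old, state = state, a * 2 + b
        counts += STATES[4 * old + state]
    return counts

# one full quadrature cycle in each direction, starting from state 00
forward = decode([(0, 1), (1, 1), (1, 0), (0, 0)])   # counts -4
backward = decode([(1, 0), (1, 1), (0, 1), (0, 0)])  # counts +4
```

One rotation direction accumulates negative counts by construction, which is why `lm_increment` subtracts the table value: the left motor is mounted backward, so negating the delta makes both wheels report positive counts when driving forward.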

/*******************************************************
function: ir2mm
purpose: reads the ir rangefinder and returns the raw analog value
         (the conversion to a distance is handled on the host side)
inputs: none
outputs: unsigned int
*******************************************************/
unsigned int ir2mm(unsigned int desired_ir_ranger)
{
  //reading the IR
  unsigned int read_value = analogRead(desired_ir_ranger);
  //returning the raw read value
  return read_value;
}

/*******************************************************
function: pan_servo_pos
purpose: changes the current position of the servo based on the last value
         published on the topic it subscribes to, pan_servo_desired. A special
         number signifies that the area was not large enough and the servo
         should go back to center
inputs: unsigned int
outputs: none
*******************************************************/
void pan_servo_pos(const std_msgs::Int16& set_pan_servo)
{
  int desired = 100;
  if(set_pan_servo.data < 1000)
  {
    desired = pan_servo.read() + set_pan_servo.data;
    if(desired >= 145)
    {
      desired = 145;
    }
    else if(desired <= 55)
    {
      desired = 55;


    }
  }
  else
  {
    desired = 100;
  }
  //setting the servo position to the value in the data portion of the message
  pan_servo.write(desired);
}
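The relative-move-with-limits pattern used by both servo callbacks can be expressed as a one-line clamp; the helper name `clamp_servo` and its defaults (which match the pan servo's 55-145 degree range and the 1000 "recenter" sentinel) are illustrative:

```python
def clamp_servo(current, delta, lo=55, hi=145, center=100, sentinel=1000):
    """Apply a relative servo command with travel limits; any command at or
    above the sentinel value recenters the servo, mirroring pan_servo_pos."""
    if delta < sentinel:
        return max(lo, min(hi, current + delta))
    return center

# a +10 degree nudge from the current 100 degree position
angle = clamp_servo(100, 10)
```

Sending relative deltas rather than absolute angles lets the vision node publish raw PID corrections while the Arduino enforces the mechanical travel limits.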

/*******************************************************
function: tilt_servo_pos
purpose: changes the current position of the servo based on the last value
         published on the topic it subscribes to, tilt_servo_desired. A special
         number signifies that the area was not large enough and the servo
         should go back to center
inputs: unsigned int
outputs: none
*******************************************************/
void tilt_servo_pos(const std_msgs::Int16& set_tilt_servo)
{
  int desired = 100;
  if(set_tilt_servo.data < 1000)
  {
    desired = tilt_servo.read() + set_tilt_servo.data;
    if(desired >= 130)
    {
      desired = 130;
    }
    else if(desired <= 55)
    {
      desired = 55;
    }
  }
  else
  {
    desired = 100;
  }
  //setting the servo position to the value in the data portion of the message


  tilt_servo.write(desired);
}

void setup()
{
  // erasing the current prescaler value on timer counter 1
  int resetscalar = 7;
  TCCR1B &= ~resetscalar;
  //the new prescaler gives a PWM frequency of about 4000 Hz
  int prescaler = 2;
  TCCR1B |= prescaler;

  //motor enable pins
  pinMode(lm_enable_pin, OUTPUT);
  pinMode(rm_enable_pin, OUTPUT);
  //motor direction pins
  pinMode(lm_ina_pin, OUTPUT);
  pinMode(lm_inb_pin, OUTPUT);
  pinMode(rm_ina_pin, OUTPUT);
  pinMode(rm_inb_pin, OUTPUT);
  //motor PWM pins
  pinMode(lm_pwm_pin, OUTPUT);
  pinMode(rm_pwm_pin, OUTPUT);
  //enabling the motors
  digitalWrite(lm_enable_pin, HIGH);
  digitalWrite(rm_enable_pin, HIGH);
  //motor encoder pins
  pinMode(lm_enca_pin, INPUT);
  pinMode(lm_encb_pin, INPUT);
  pinMode(rm_enca_pin, INPUT);
  pinMode(rm_encb_pin, INPUT);

  //debugging loop led
  pinMode(timer_pin, OUTPUT);
  digitalWrite(timer_pin, LOW);

  /**** SETTING UP THE NODES ****/
  //initializing the node
  nh.initNode();
  //initializing the front ir topic
  nh.advertise(front_ir);
  //initializing the right ir topic
  nh.advertise(right_ir);
  //initializing the left ir topic
  nh.advertise(left_ir);
  //initializing the left motor effort topic


  nh.subscribe(left_motor_effort);
  //initializing the right motor effort topic
  nh.subscribe(right_motor_effort);
  //initializing the left motor encoder counter topic
  nh.advertise(left_encoder_count);
  //initializing the left motor encoder time topic
  nh.advertise(left_encoder_time);
  //initializing the right motor encoder counter topic
  nh.advertise(right_encoder_count);
  //initializing the right motor encoder time topic
  nh.advertise(right_encoder_time);
  //initializing the pan servo angle read
  nh.advertise(pan_servo_current);
  //initializing the pan servo topic
  nh.subscribe(pan_servo_desired);
  //initializing the tilt servo angle read
  nh.advertise(tilt_servo_current);
  //initializing the tilt servo topic
  nh.subscribe(tilt_servo_desired);

/**** INITIALIZING THE SERVOS ****/ //attaching the servo pan_servo.attach(9); //setting the servo to 100 degree initially offset is 10 degree pan_servo.write(100);

//attaching the servo tilt_servo.attach(10); //setting the servo to 100 degree initially offset is 10 degree tilt_servo.write(100);

} void loop() { //turning on the led at the start of the loop digitalWrite(timer_pin,HIGH); //starting the encoders lm_start_read(); rm_start_read(); delayMicroseconds(10000); //ending the encoders lm_end_read(); rm_end_read();

  //reading the ir values
  front_ir_read = ir2mm(front_ir_pin);
  left_ir_read = ir2mm(left_ir_pin);
  right_ir_read = ir2mm(right_ir_pin);

  //publishing the front, right, and left ir
  front_ir_msg.data = front_ir_read;
  front_ir.publish(&front_ir_msg);
  left_ir_msg.data = left_ir_read;
  left_ir.publish(&left_ir_msg);
  right_ir_msg.data = right_ir_read;
  right_ir.publish(&right_ir_msg);

  //publishing the servo angle for the pan servo
  pan_servo_current_msg.data = pan_servo.read();
  pan_servo_current.publish(&pan_servo_current_msg);

  //publishing the servo angle for the tilt servo
  tilt_servo_current_msg.data = tilt_servo.read();
  tilt_servo_current.publish(&tilt_servo_current_msg);

  //publishing the time for the cycle
  lm_encoder_time.data = lm_end_time - lm_start_time;
  left_encoder_time.publish(&lm_encoder_time);
  rm_encoder_time.data = rm_end_time - rm_start_time;
  right_encoder_time.publish(&rm_encoder_time);

  //publishing the counts for the cycle
  lm_encoder_count.data = lm_counts;
  left_encoder_count.publish(&lm_encoder_count);
  rm_encoder_count.data = rm_counts;
  right_encoder_count.publish(&rm_encoder_count);

  //delaying the loop to 10Hz; the first delay runs with the led on
  for(volatile uint8_t waiting_on = 0; waiting_on < 4; waiting_on++)
  {
    delayMicroseconds(10000);
  }
  //turning the debug led off
  digitalWrite(timer_pin,LOW);
  //the second delay waits to turn it back on at the start of the loop
  for(volatile uint8_t waiting_off = 0; waiting_off < 5; waiting_off++)
  {
    delayMicroseconds(10000);


  }

  nh.spinOnce();
}

blue_bot.launch:

tele_motor_bluebot_1.py:

#!/usr/bin/env python
import rospy
import time
import numpy as np
from random import random
from std_msgs.msg import Int16
from std_msgs.msg import UInt16
from std_msgs.msg import Float32

# This ROS node converts joystick inputs from the joy node into velocity
# commands for the motors. If the "A" button is pressed the robot runs
# autonomously; otherwise it uses the commands from the thumbsticks.

# declaring the global variables that hold values to be passed between functions
left_ir_error_cmd = 0
right_ir_error_cmd = 0
front_ir_error_cmd = 0
left_motor_speed_desired = 0
right_motor_speed_desired = 0
left_motor_speed_act = 0
right_motor_speed_act = 0
left_motor_effort = 0
right_motor_effort = 0
left_thumb_value = 0
right_thumb_value = 0
a_button_value = 0
left_bumper_value = 0
is_blocked = False
dice_roll = 0
lirs = False
rirs = False
firs = False
rturn = False
lturn = False
t1 = 0
t2 = 0
time_diff = 0

# creating some arrays for the error terms
error_left = np.zeros((1,2))
error_left_derivative = np.zeros((1,3))
error_left_integral = np.zeros((1,5))
error_right = np.zeros((1,2))
error_right_derivative = np.zeros((1,3))
error_right_integral = np.zeros((1,5))

# function: desired_velocity
# purpose:
# 1) using the values updated by the button topics and the IR topics, calculates
#    the desired velocity. Max velocity is set to 18 in/sec for now
def desired_velocity_():
    global left_ir_error_cmd
    global right_ir_error_cmd
    global front_ir_error_cmd

    global left_motor_speed_desired
    global right_motor_speed_desired

    global left_thumb_value
    global right_thumb_value
    global a_button_value
    global left_bumper_value

    global is_blocked
    global dice_roll
    global lirs
    global rirs
    global firs
    global rturn
    global lturn

    ### teleop and autonomous velocity gains ###
    # max desired velocity when using teleop mode in inches/sec
    v_teleop_max = 18
    # max desired velocity when in autonomous mode
    v_auto_max = v_teleop_max
    # the front velocity error gain in autonomous mode
    KF = v_auto_max
    # turn gain
    KTG = 0.6
    # the left and right velocity error gains in autonomous mode
    KL = v_auto_max * KTG
    KR = v_auto_max * KTG

    ### thumbstick values ###
    # the left thumb velocity is the up/down value read from the controller
    ltv = v_teleop_max * left_thumb_value
    # the right thumb velocity is the left/right value. Negative because the joy command negates it
    rtv = (v_teleop_max*KTG) * right_thumb_value
    # the A button value determines autonomous or teleop mode
    abv = a_button_value
    # the left bumper value tells whether it is desired to drive the bot around
    bv = left_bumper_value

    ### temporary values and minimum velocity values ###
    # minimum velocity in in/sec to turn on, otherwise send 0 in/sec
    mv = 2
    # left motor desired velocity temp
    lmv = 0
    # right motor desired velocity temp
    rmv = 0

    ### checking the six possible thumbstick orientations and if A is pressed or not ###
    # left thumb up
    ltu = ltv > mv
    # left thumb down
    ltd = ltv < -mv
    # right thumb right
    rtr = rtv > mv
    # right thumb left
    rtl = rtv < -mv
    # left thumb off
    lto = (not ltu) and (not ltd)
    # right thumb off
    rto = (not rtr) and (not rtl)
    # A button pressed
    abp = abv > 0
    # bumper pressed
    bp = bv > 0

    ### if any of the IR commands are above the threshold value, enter the blocked procedure ###
    if (abp) and (not is_blocked):
        print "entered not is_blocked"
        ### checking if the ir commands are almost saturated for the left, right, and front IR values ###
        # left ir saturated
        lirs = left_ir_error_cmd < -0.99
        # right ir saturated
        rirs = right_ir_error_cmd < -0.99

        # front ir saturated
        firs = front_ir_error_cmd < -0.99
        ### checking to see if any of the conditions for a problem are detected ###
        # checking if all three are saturated
        allsat = lirs and rirs and firs
        # checking if the front is saturated but the sides are not both saturated
        frsat = firs and not (lirs and rirs)
        # checking if the sides are saturated but not the front
        sisat = ( not firs ) and (lirs and rirs)
        # if any of the conditions are true, enter the blocked procedure and roll the dice to
        # determine the direction to turn out of the block
        if allsat or frsat or sisat:
            print "\tentered the is_blocked change spot"
            is_blocked = True
            # rolling the dice to determine which way to turn: if greater than 50 turn right,
            # otherwise turn left
            dice_roll = random() * 100
            print "\t\tdice %r" % dice_roll
    # not using an elif because this should happen immediately if the above is true.
    # if the A button is pressed and the robot is blocked, check whether either of the
    # two turn conditions is true. If neither is, set the blocked boolean back to False
    if abp and is_blocked:
        print "\t\t\tentered the next stage"
        ### checking if the ir commands are almost saturated for the left, right, and front IR values ###
        # left ir saturated
        lirs = left_ir_error_cmd < -0.75
        # right ir saturated
        rirs = right_ir_error_cmd < -0.75
        # front ir saturated
        firs = front_ir_error_cmd < -0.75

        # checking the two possible block conditions
        if (rirs or firs) and (dice_roll > 50):
            print "\t\t\t\tentered the right turn"
            rturn = True
            lturn = False
        elif (lirs or firs) and (dice_roll <= 50):
            print "\t\t\t\tentered the left turn"
            lturn = True
            rturn = False
        else:
            print "\t\t\t\tentered the make false"
            rturn = False
            lturn = False
            is_blocked = False

    # if the A button is pressed, go into autonomous mode to determine the desired
    # velocity; otherwise use teleop mode to determine the desired velocity
    if abp:
        # if the robot is blocked and a right turn was chosen, turn right at max turn;
        # if the robot is blocked and a left turn was chosen, turn left at max turn;
        # otherwise operate in normal autonomous mode
        if is_blocked and rturn:
            lmv = KL
            rmv = -1.0 * KR
        elif is_blocked and lturn:
            lmv = -1.0 * KL
            rmv = KR
        else:
            # calculate the desired velocity values
            lmv = v_auto_max - KL*left_ir_error_cmd + KR*right_ir_error_cmd + KF*front_ir_error_cmd
            rmv = v_auto_max + KL*left_ir_error_cmd - KR*right_ir_error_cmd + KF*front_ir_error_cmd

        if (abs(lmv) < mv) and (abs(rmv) < mv):
            lmv = 0
            rmv = 0
    elif bp:
        # disable the block when the A button is released
        is_blocked = False
        # go forward straight: increase both positively
        # go backward straight: increase both negatively
        if (ltu and rto) or (ltd and rto):
            lmv = ltv
            rmv = ltv
        # go forward and right: increase left motor positively and decrease right motor positively
        # go forward and left: increase right motor positively and decrease left motor positively
        elif (ltu and rtr) or (ltu and rtl):
            lmv = ltv + rtv
            rmv = ltv - rtv
        # go backward and left: increase right motor negatively and decrease left motor negatively
        # go backward and right: decrease right motor negatively and increase left motor negatively
        elif (ltd and rtl) or (ltd and rtr):
            lmv = ltv - rtv
            rmv = ltv + rtv
        # rotate right in place: increase left motor positively and increase right motor negatively
        # rotate left in place: increase left motor negatively and increase right motor positively
        elif (lto and rtr) or (lto and rtl):
            lmv = rtv
            rmv = -rtv
        else:
            lmv = 0
            rmv = 0
    else:
        # disable the block when the A button is released
        is_blocked = False
        lmv = 0
        rmv = 0
    left_motor_speed_desired = lmv
    right_motor_speed_desired = rmv
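The teleop branch above mixes the forward stick and the turn stick into left and right wheel velocities. The cases can be condensed into a minimal sketch, assuming normalized stick inputs in [-1, 1]; mix() is a hypothetical helper name, not part of the robot code:

```python
def mix(forward, turn, v_max=18.0, turn_gain=0.6):
    """Return (left, right) wheel velocities in in/sec from two stick inputs."""
    ltv = v_max * forward              # forward/backward component
    rtv = (v_max * turn_gain) * turn   # turning component
    # the forward stick sign decides whether the turn term adds to the left
    # or the right wheel, matching the forward/backward cases above
    if forward >= 0:
        return (ltv + rtv, ltv - rtv)
    return (ltv - rtv, ltv + rtv)
```

With `forward = 0` this reduces to the rotate-in-place case (`lmv = rtv`, `rmv = -rtv`).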

# function: closed_loop_command
# purpose:
# 1) The loop is closed using the calculated linear velocities published on the
#    topics "right_motor_speed" and "left_motor_speed".
# 2) The error signal passes through the controller, a PID controller with
#    gains KCP, KCD, and KCI
def closed_loop_command():
    # global variables
    global left_motor_speed_desired
    global right_motor_speed_desired
    global left_motor_speed_act
    global right_motor_speed_act
    global left_motor_effort
    global right_motor_effort

    global time_diff

    global error_left
    global error_left_derivative
    global error_left_integral
    global error_right
    global error_right_derivative
    global error_right_integral

    # max output to scale the command over
    max_output = 255

    # gains for hard surface
    #KCP = 0.012
    #KCD = 0.00008
    #KCI = 0
    # gains for carpet
    KCP = 0.015
    KCD = 0.0
    KCI = 0.0
    # getting the previous mean for the derivative calculation
    previous_mean_right = np.mean(error_right,axis=1)
    previous_mean_left = np.mean(error_left,axis=1)

    # temporary variables for the error
    left_motor_error = left_motor_speed_desired - left_motor_speed_act
    right_motor_error = right_motor_speed_desired - right_motor_speed_act

    # shifting all the arrays one value to the left
    for index_of in xrange(0,error_left.shape[1]-1):
        error_left[0,index_of] = error_left[0,index_of+1]
        error_right[0,index_of] = error_right[0,index_of+1]

    for index_of in xrange(0,error_left_derivative.shape[1]-1):
        error_left_derivative[0,index_of] = error_left_derivative[0,index_of+1]
        error_right_derivative[0,index_of] = error_right_derivative[0,index_of+1]

    for index_of in xrange(0,error_left_integral.shape[1]-1):
        error_left_integral[0,index_of] = error_left_integral[0,index_of+1]
        error_right_integral[0,index_of] = error_right_integral[0,index_of+1]

    # updating the last value for the error
    error_left[0,error_left.shape[1]-1] = left_motor_error
    error_right[0,error_right.shape[1]-1] = right_motor_error

    # updating the last value for the derivative
    error_left_derivative[0,error_left_derivative.shape[1]-1] = (np.mean(error_left,axis=1) - previous_mean_left)/time_diff
    error_right_derivative[0,error_right_derivative.shape[1]-1] = (np.mean(error_right,axis=1) - previous_mean_right)/time_diff

    # updating the last value for the integral
    error_left_integral[0,error_left_integral.shape[1]-1] = np.mean(error_left,axis=1)
    error_right_integral[0,error_right_integral.shape[1]-1] = np.mean(error_right,axis=1)

    # updating the desired left and right commands
    left_motor_effort = KCP*np.mean(error_left,axis=1) + KCD*np.mean(error_left_derivative,axis=1) + KCI*np.mean(np.trapz(error_left_integral,dx=time_diff,axis=1))

    right_motor_effort = KCP*np.mean(error_right,axis=1) + KCD*np.mean(error_right_derivative,axis=1) + KCI*np.mean(np.trapz(error_right_integral,dx=time_diff,axis=1))

    # scaling over the available output
    left_motor_effort = left_motor_effort * max_output
    right_motor_effort = right_motor_effort * max_output

    ### checking the saturation cases for the effort ###
    # left motor saturated positively
    lmsp = left_motor_effort > max_output
    # left motor saturated negatively
    lmsn = left_motor_effort < (-1 * max_output)
    # right motor saturated positively
    rmsp = right_motor_effort > max_output
    # right motor saturated negatively
    rmsn = right_motor_effort < (-1 * max_output)

    # checking if the left motor effort is saturated and coercing it if it is
    if lmsp:
        left_motor_effort = max_output
    elif lmsn:
        left_motor_effort = (-1 * max_output)

    # checking if the right motor effort is saturated and coercing it if it is
    if rmsp:
        right_motor_effort = max_output
    elif rmsn:
        right_motor_effort = (-1 * max_output)
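The controller above averages a short window of recent errors for the P term, finite-differences the window means for the D term, and integrates the window for the I term, then scales to the PWM range and clamps. A condensed sketch of that structure for a single motor, assuming one error window; pid_effort() is a hypothetical helper name, not part of the robot code, and the trapezoid rule is written out by hand:

```python
import numpy as np

def pid_effort(errors, derivs, dt, kp=0.015, kd=0.0, ki=0.0, max_output=255):
    """Map a window of velocity errors (in/sec) to a motor effort in [-255, 255]."""
    p = np.mean(errors)   # proportional term: mean of the recent errors
    d = np.mean(derivs)   # derivative term: mean of the recent error slopes
    # integral term: trapezoidal area of the error window
    i = dt * (np.sum(errors) - 0.5 * (errors[0] + errors[-1]))
    effort = (kp * p + kd * d + ki * i) * max_output
    # coerce to the available PWM range, like the saturation checks above
    return max(-max_output, min(max_output, effort))
```

Averaging over a window trades response speed for noise rejection, which matters here because the encoder-derived speeds are sampled over only 10 ms.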

# function: right_motor_speed_read
# purpose:
# 1) read "right_motor_speed" topic
def right_motor_speed_read(data):
    # reading and converting the value
    global right_motor_speed_act
    right_motor_speed_act = data.data

# function: left_motor_speed_read
# purpose:
# 1) read "left_motor_speed" topic
def left_motor_speed_read(data):
    # reading and converting the value
    global left_motor_speed_act
    left_motor_speed_act = data.data

# function: front_ir_read
# purpose:
# 1) read "front_ir" topic
def front_ir_read(data):
    # reading and converting the value
    global front_ir_error_cmd
    front_ir_error_cmd = data.data

# function: left_ir_read
# purpose:
# 1) read "left_ir" topic
def left_ir_read(data):
    # reading and converting the value
    global left_ir_error_cmd
    left_ir_error_cmd = data.data

# function: right_ir_read
# purpose:
# 1) read "right_ir" topic
def right_ir_read(data):
    # reading and converting the value
    global right_ir_error_cmd
    right_ir_error_cmd = data.data

# function: right_thumb_read
# purpose:
# 1) read "right_thumb" topic
def right_thumb_read(data):
    global right_thumb_value
    right_thumb_value = data.data

# function: left_thumb_read
# purpose:
# 1) read "left_thumb" topic
def left_thumb_read(data):
    global left_thumb_value
    left_thumb_value = data.data

# function: left_bumper_read
# purpose:
# 1) read "left_bumper" topic
def left_bumper_read(data):
    global left_bumper_value
    left_bumper_value = data.data

# function: a_button_read
# purpose:
# 1) read "a_button" topic
def a_button_read(data):
    global a_button_value
    a_button_value = data.data

# function: start
# purpose: initializes everything
def start():
    ### declaring all the sensor global variables ###
    global left_ir_error_cmd
    global right_ir_error_cmd
    global front_ir_error_cmd
    global left_motor_speed_desired
    global right_motor_speed_desired
    global left_motor_speed_act
    global right_motor_speed_act
    global left_motor_effort
    global right_motor_effort
    global left_thumb_value
    global right_thumb_value
    global a_button_value
    global left_bumper_value
    global is_blocked
    ### declaring all the publishers and subscribers ###
    global left_motor_pub
    global right_motor_pub

    global left_motor_desired_pub
    global right_motor_desired_pub

    global left_thumb_sub
    global right_thumb_sub
    global a_button_sub
    global left_bumper_sub

    global blocked_pub
    global left_ir_sub
    global right_ir_sub
    global front_ir_sub
    global left_motor_sub
    global right_motor_sub
    global time_diff
    global t1
    global t2

    ### initializing the publishers for the two motors ###
    left_motor_pub = rospy.Publisher('left_motor_effort', Int16, queue_size=1)
    right_motor_pub = rospy.Publisher('right_motor_effort', Int16, queue_size=1)
    left_motor_desired_pub = rospy.Publisher('left_motor_desired', Int16, queue_size=1)
    right_motor_desired_pub = rospy.Publisher('right_motor_desired', Int16, queue_size=1)
    blocked_pub = rospy.Publisher('blocked', Int16, queue_size=1)
    ### starting the node ###
    rospy.init_node('motor_commands')

    ### getting the values from the other programs. Start all the other topics before the ###
    ### joystick so it will use the most recent command to update the desired velocity ###
    # subscribing to the IR topics left_error, right_error, front_error
    left_ir_sub = rospy.Subscriber("left_error", Float32, left_ir_read)
    right_ir_sub = rospy.Subscriber("right_error", Float32, right_ir_read)
    front_ir_sub = rospy.Subscriber("front_error", Float32, front_ir_read)
    # subscribing to the motor speeds
    left_motor_sub = rospy.Subscriber("left_motor_speed", Float32, left_motor_speed_read)
    right_motor_sub = rospy.Subscriber("right_motor_speed", Float32, right_motor_speed_read)
    # subscribing to the button topics
    left_thumb_sub = rospy.Subscriber("left_thumb", Float32, left_thumb_read)
    right_thumb_sub = rospy.Subscriber("right_thumb", Float32, right_thumb_read)
    left_bumper_sub = rospy.Subscriber("left_bumper", Int16, left_bumper_read)
    a_button_sub = rospy.Subscriber("a_button", Int16, a_button_read)
    r = rospy.Rate(10)
    while not rospy.is_shutdown():
        t1 = t2
        t2 = time.time()
        time_diff = t2 - t1
        print "loop time: %r" % (1.0/time_diff)
        # getting the desired velocity
        desired_velocity_()
        # calculating the closed loop command
        closed_loop_command()
        # publishing the desired velocities
        left_motor_desired_pub.publish(left_motor_speed_desired)
        right_motor_desired_pub.publish(right_motor_speed_desired)
        # publishing the desired efforts
        left_motor_pub.publish(left_motor_effort)
        right_motor_pub.publish(right_motor_effort)
        print "is_blocked %r" % is_blocked
        blocked_pub.publish(is_blocked)
        r.sleep()

if __name__ == '__main__':
    start()

blue_bot_v1.ino:

/*
 * blue bot low level drivers
 */

#include <ros.h>
#include <std_msgs/Int16.h>
#include <std_msgs/UInt16.h>

void lm_effort(const std_msgs::Int16& left_motor_effort);  //sets left motor effort
void rm_effort(const std_msgs::Int16& right_motor_effort); //sets right motor effort
void lm_increment(void);   //increments the left motor encoder value
void lm_start_read(void);  //attaches the interrupts and starts the left motor encoder timer
void lm_end_read(void);    //detaches the interrupts and stops the left motor encoder timer
void rm_increment(void);   //increments the right motor encoder value
void rm_start_read(void);  //attaches the interrupts and starts the right motor encoder timer
void rm_end_read(void);    //detaches the interrupts and stops the right motor encoder timer

//node handle
ros::NodeHandle nh;

/**** VARIABLE FOR THE TIMER ****/
const int timer_pin = 2;

/**** VARIABLES FOR THE LEFT MOTOR ****/
/* PIN NUMBERS */
//enable pin for the left motor
const int lm_enable_pin = 53;
//IN A pin for the left motor
const int lm_ina_pin = 52;
//IN B pin for the left motor
const int lm_inb_pin = 51;
//PWM pin for the left motor
const int lm_pwm_pin = 12;
//CS pin for the left motor
const int lm_cs_pin = 12;
//pin for the left motor encoder A output
const int lm_enca_pin = 20;
//pin for the left motor encoder B output
const int lm_encb_pin = 21;
//interrupt number for the left motor encoder A output
const int lm_inta_pin = 3;

//interrupt number for the left motor encoder B output
const int lm_intb_pin = 2;
//encoder counts for the left motor doing full quadrature
volatile int lm_counts = 0;
//time for the left encoder sampling start
int lm_start_time = 0;
//time for the left encoder sampling to end
int lm_end_time = 0;
//holds the new state of the left motor encoder
volatile int lm_new_state = 0;
//holds the old state of the left motor encoder
volatile int lm_old_state = 0;
//array to hold the value of the possible counts and a 0 if the transition is invalid
volatile int lm_states[16] = {0,-1,1,0,1,0,0,-1,-1,0,0,1,0,1,-1,0};

/**** VARIABLES FOR THE RIGHT MOTOR ****/
/* PIN NUMBERS */
//enable pin for the right motor
const int rm_enable_pin = 50;
//IN A pin for the right motor
const int rm_ina_pin = 49;
//IN B pin for the right motor
const int rm_inb_pin = 48;
//PWM pin for the right motor
const int rm_pwm_pin = 11;
//CS pin for the right motor
const int rm_cs_pin = 11;
//pin for the right motor encoder A output
const int rm_enca_pin = 18;
//pin for the right motor encoder B output
const int rm_encb_pin = 19;
//interrupt number for the right motor encoder A output
const int rm_inta_pin = 5;
//interrupt number for the right motor encoder B output
const int rm_intb_pin = 4;
//encoder counts for the right motor doing full quadrature
volatile int rm_counts = 0;
//time for the right encoder sampling start
int rm_start_time = 0;
//time for the right encoder sampling to end
int rm_end_time = 0;
//holds the new state of the right motor encoder
volatile int rm_new_state = 0;
//holds the old state of the right motor encoder
volatile int rm_old_state = 0;
//array to hold the value of the possible counts and a 0 if the transition is invalid
volatile int rm_states[16] = {0,-1,1,0,1,0,0,-1,-1,0,0,1,0,1,-1,0};

/**** VARIABLES FOR THE FRONT IR SENSOR ****/
//analog pin for the front IR sensor
const int front_ir_pin = 15;
//holds the ir read value
unsigned int front_ir_read = 1000;
//message and publisher for the front IR sensor distance value
std_msgs::UInt16 front_ir_msg;
ros::Publisher front_ir("front_ir", &front_ir_msg);

/**** VARIABLES FOR THE RIGHT IR SENSOR ****/
//analog pin for the right IR sensor
const int right_ir_pin = 13;
//holds the ir read value
unsigned int right_ir_read = 1000;
//message and publisher for the right IR sensor distance value
std_msgs::UInt16 right_ir_msg;
ros::Publisher right_ir("right_ir", &right_ir_msg);

/**** VARIABLES FOR THE LEFT IR SENSOR ****/
//analog pin for the left IR sensor
const int left_ir_pin = 14;
//holds the ir read value
unsigned int left_ir_read = 1000;
//message and publisher for the left IR sensor distance value
std_msgs::UInt16 left_ir_msg;
ros::Publisher left_ir("left_ir", &left_ir_msg);

/**** SETTING UP THE LEFT MOTOR ****/
//subscriber for the desired left motor effort; uses lm_effort to change the effort level
ros::Subscriber<std_msgs::Int16> left_motor_effort("left_motor_effort", lm_effort);
//message and publisher for the left motor encoder counts for a run
std_msgs::Int16 lm_encoder_count;
ros::Publisher left_encoder_count("left_encoder_count", &lm_encoder_count);
//message and publisher for the left motor encoder time value
std_msgs::Int16 lm_encoder_time;
ros::Publisher left_encoder_time("left_encoder_time", &lm_encoder_time);

/**** SETTING UP THE RIGHT MOTOR ****/
//subscriber for the desired right motor effort; uses rm_effort to change the effort level
ros::Subscriber<std_msgs::Int16> right_motor_effort("right_motor_effort", rm_effort);
//message and publisher for the right motor encoder counts for a run
std_msgs::Int16 rm_encoder_count;
ros::Publisher right_encoder_count("right_encoder_count", &rm_encoder_count);
//message and publisher for the right motor encoder time value
std_msgs::Int16 rm_encoder_time;
ros::Publisher right_encoder_time("right_encoder_time", &rm_encoder_time);

/*******************************************************
function: lm_effort
purpose: sets the effort and direction of the left motor.
         accepts numbers between -255 and 255
inputs: signed int
outputs: none
*******************************************************/
void lm_effort(const std_msgs::Int16& lm_effort_level)
{
  //getting the value to be written to the motor
  int effort = lm_effort_level.data;
  //a value greater than 255 means to brake to vcc or gnd,
  //otherwise output the value
  if(effort > 255)
  {
    //brake to vcc if 256, otherwise brake to gnd
    if(effort == 256)
    {
      digitalWrite(lm_ina_pin, HIGH);
      digitalWrite(lm_inb_pin, HIGH);
    }
    else
    {
      digitalWrite(lm_ina_pin, LOW);
      digitalWrite(lm_inb_pin, LOW);
    }
  }
  else
  {
    //checking whether to spin the motor clockwise or counterclockwise
    if(effort > 0)
    {
      //setting A low and B high to get a counterclockwise spin
      digitalWrite(lm_ina_pin, LOW);
      digitalWrite(lm_inb_pin, HIGH);
      //setting the effort level
      analogWrite(lm_pwm_pin, effort+10);
    }
    else if(effort < 0)
    {
      //making the negative value positive
      effort = abs(effort);
      //setting A high and B low to get a clockwise spin
      digitalWrite(lm_ina_pin, HIGH);
      digitalWrite(lm_inb_pin, LOW);
      //setting the effort level
      analogWrite(lm_pwm_pin, effort+10);
    }
    else
    {
      digitalWrite(lm_ina_pin, HIGH);
      digitalWrite(lm_inb_pin, HIGH);
      analogWrite(lm_pwm_pin, 0);
    }
  }
}

/*******************************************************
function: rm_effort
purpose: sets the effort and direction of the right motor.
         accepts numbers between -255 and 255
inputs: signed int
outputs: none
*******************************************************/
void rm_effort(const std_msgs::Int16& rm_effort_level)
{
  //getting the value to be written to the motor
  int effort = rm_effort_level.data;
  //a value greater than 255 means to brake to vcc or gnd,
  //otherwise output the value
  if(effort > 255)
  {
    //brake to vcc if 256, otherwise brake to gnd
    if(effort == 256)
    {
      digitalWrite(rm_ina_pin, HIGH);
      digitalWrite(rm_inb_pin, HIGH);
    }
    else
    {
      digitalWrite(rm_ina_pin, LOW);
      digitalWrite(rm_inb_pin, LOW);
    }
  }
  else
  {
    //checking whether to spin the motor clockwise or counterclockwise
    if(effort > 0)
    {
      //setting A high and B low to get a counterclockwise spin
      digitalWrite(rm_ina_pin, HIGH);
      digitalWrite(rm_inb_pin, LOW);
      //setting the effort level
      analogWrite(rm_pwm_pin, effort+10);
    }
    else if(effort < 0)
    {
      //making the negative value positive
      effort = abs(effort);
      //setting A low and B high to get a clockwise spin
      digitalWrite(rm_ina_pin, LOW);
      digitalWrite(rm_inb_pin, HIGH);
      //setting the effort level
      analogWrite(rm_pwm_pin, effort+10);
    }
    else
    {
      digitalWrite(rm_ina_pin, HIGH);
      digitalWrite(rm_inb_pin, HIGH);
      analogWrite(rm_pwm_pin, 0);
    }
  }
}
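Both effort callbacks share the same command convention: the sign selects the spin direction, 256 brakes to vcc, any other value above 255 brakes to gnd, and zero brakes with the PWM off. That convention can be tabulated in a small sketch; decode_effort() is a hypothetical helper name, and the clamp to 255 is an added safety assumption (the Arduino code writes effort+10 directly):

```python
def decode_effort(effort):
    """Return a (mode, pwm) pair for a signed motor effort command."""
    if effort > 255:
        # out-of-range positive values are reserved as brake commands
        return ('brake_vcc' if effort == 256 else 'brake_gnd', 0)
    if effort > 0:
        return ('ccw', min(effort + 10, 255))        # +10 matches the offset above
    if effort < 0:
        return ('cw', min(abs(effort) + 10, 255))
    return ('brake_vcc', 0)                          # zero: both pins high, PWM off
```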

/*******************************************************
function: lm_start_read
purpose: starts the left motor encoder interrupts and starts the left motor timer
inputs: none
outputs: none
*******************************************************/
void lm_start_read(void)
{
  //attaching the interrupts and having them call the increment function
  attachInterrupt(lm_inta_pin,lm_increment,CHANGE);
  attachInterrupt(lm_intb_pin,lm_increment,CHANGE);
  //starting the timer for the counts
  lm_start_time = micros();
  //resetting the counts
  lm_counts = 0;
}

/*******************************************************
function: lm_end_read
purpose: ends the left motor encoder interrupts and stops the left motor timer
inputs: none
outputs: none
*******************************************************/
void lm_end_read(void)
{
  //detaching the interrupts
  detachInterrupt(lm_inta_pin);
  detachInterrupt(lm_intb_pin);
  //ending the timer for the counts
  lm_end_time = micros();
}

/*******************************************************
function: lm_increment
purpose: increments the counter for the left motor encoder
inputs: none
outputs: none
*******************************************************/
void lm_increment()
{
  //setting the old state to the new state
  lm_old_state = lm_new_state;
  //getting the new state
  lm_new_state = digitalRead(lm_enca_pin)*2 + digitalRead(lm_encb_pin);
  //updating the count, subtracting instead of adding because the left motor is mounted backward
  lm_counts = lm_counts - lm_states[(4*lm_old_state) + lm_new_state];
}

/*******************************************************
function: rm_start_read
purpose: starts the right motor encoder interrupts and starts the right motor timer
inputs: none
outputs: none
*******************************************************/
void rm_start_read(void)
{
  //attaching the interrupts and having them call the increment function
  attachInterrupt(rm_inta_pin,rm_increment,CHANGE);
  attachInterrupt(rm_intb_pin,rm_increment,CHANGE);
  //starting the timer for the counts
  rm_start_time = micros();
  //resetting the counts
  rm_counts = 0;
}

/*******************************************************
function: rm_end_read
purpose: ends the right motor encoder interrupts and stops the right motor timer
inputs: none
outputs: none
*******************************************************/
void rm_end_read(void)
{
  //detaching the interrupts
  detachInterrupt(rm_inta_pin);
  detachInterrupt(rm_intb_pin);
  //ending the timer for the counts
  rm_end_time = micros();
}

/*******************************************************
function: rm_increment
purpose: increments the counter for the right motor encoder
inputs: none
outputs: none
*******************************************************/
void rm_increment()
{
  //setting the old state to the new state
  rm_old_state = rm_new_state;
  //getting the new state
  rm_new_state = digitalRead(rm_enca_pin)*2 + digitalRead(rm_encb_pin);
  //updating the count by adding
  rm_counts = rm_counts + rm_states[(4*rm_old_state) + rm_new_state];
}
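The 16-entry state tables above encode every legal full-quadrature transition: the index is four times the old A/B state plus the new state (with A as the high bit), and invalid transitions map to zero so electrical glitches are ignored. The same decoding can be sketched for reference, assuming the same table; decode() is a hypothetical helper name, not part of the robot code:

```python
# transition table: row = old state, column = new state, flattened to 16 entries
STATES = [0, -1, 1, 0,  1, 0, 0, -1,  -1, 0, 0, 1,  0, 1, -1, 0]

def decode(readings):
    """Accumulate encoder counts from a sequence of (A, B) channel readings."""
    counts, old = 0, 0
    for a, b in readings:
        new = a * 2 + b                   # state packs A as the high bit, B as the low bit
        counts += STATES[4 * old + new]   # invalid transitions contribute 0
        old = new
    return counts
```

One full electrical cycle (00 -> 10 -> 11 -> 01 -> 00) yields four counts in one direction, which is why this scheme quadruples the encoder resolution.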

/*******************************************************
function: ir2mm
purpose: reads the ir rangefinder on the given analog pin and returns
         the read value
inputs: unsigned int
outputs: unsigned int
*******************************************************/
unsigned int ir2mm(unsigned int desired_ir_ranger)
{
  //reading the IR sensor
  unsigned int read_value = analogRead(desired_ir_ranger);
  //returning the raw analog value (no mm conversion is applied here)
  return read_value;
}

void setup()
{
  //setting the pwm frequency for the motor pins 11 and 12
  //erasing the current prescaler value on timer counter 1
  int resetscalar = 7;
  TCCR1B &= ~resetscalar;
  //new prescaler value gives a PWM frequency of about 4000Hz
  int prescalar = 2;
  TCCR1B |= prescalar;
  //motor enable pins
  pinMode(lm_enable_pin,OUTPUT);
  pinMode(rm_enable_pin,OUTPUT);
  //motor direction pins
  pinMode(lm_ina_pin,OUTPUT);
  pinMode(lm_inb_pin,OUTPUT);
  pinMode(rm_ina_pin,OUTPUT);
  pinMode(rm_inb_pin,OUTPUT);
  //motor PWM pins
  pinMode(lm_pwm_pin,OUTPUT);
  pinMode(rm_pwm_pin,OUTPUT);
  //enabling the motors
  digitalWrite(lm_enable_pin,HIGH);
  digitalWrite(rm_enable_pin,HIGH);
  //motor encoder pins
  pinMode(lm_enca_pin,INPUT);
  pinMode(lm_encb_pin,INPUT);
  pinMode(rm_enca_pin,INPUT);
  pinMode(rm_encb_pin,INPUT);

  //debugging loop led
  pinMode(timer_pin,OUTPUT);
  digitalWrite(timer_pin,LOW);

  /**** SETTING UP THE NODES ****/
  //initializing the node
  nh.initNode();
  //initializing the front ir topic
  nh.advertise(front_ir);
  //initializing the right ir topic
  nh.advertise(right_ir);
  //initializing the left ir topic
  nh.advertise(left_ir);
  //initializing the left motor effort topic
  nh.subscribe(left_motor_effort);
  //initializing the right motor effort topic
  nh.subscribe(right_motor_effort);
  //initializing the left motor encoder counter topic
  nh.advertise(left_encoder_count);
  //initializing the left motor encoder time topic
  nh.advertise(left_encoder_time);
  //initializing the right motor encoder counter topic
  nh.advertise(right_encoder_count);
  //initializing the right motor encoder time topic
  nh.advertise(right_encoder_time);

} void loop() { //turning on the led at the start of the loop digitalWrite(timer_pin,HIGH);

//starting the encoders lm_start_read(); rm_start_read(); delayMicroseconds(10000); //ending the encoders lm_end_read(); rm_end_read();

//reading the ir values front_ir_read = ir2mm(front_ir_pin); left_ir_read = ir2mm(left_ir_pin); right_ir_read = ir2mm(right_ir_pin);

//publishing the front, right, and left ir front_ir_msg.data = front_ir_read; front_ir.publish(&front_ir_msg);

left_ir_msg.data = left_ir_read; left_ir.publish(&left_ir_msg);

right_ir_msg.data = right_ir_read; right_ir.publish(&right_ir_msg);

//publishing the time for the cycle lm_encoder_time.data = lm_end_time - lm_start_time; left_encoder_time.publish(&lm_encoder_time); rm_encoder_time.data = rm_end_time - rm_start_time; right_encoder_time.publish(&rm_encoder_time);

95

//publishing the counts for the cycle lm_encoder_count.data = lm_counts; left_encoder_count.publish(&lm_encoder_count); rm_encoder_count.data = rm_counts; right_encoder_count.publish(&rm_encoder_count);

  //delaying the loop to 10Hz; the first delay is tuned to include the time for all the other code
  //the first delay runs before the led turns off
  for(volatile uint8_t waiting_on = 0; waiting_on < 4; waiting_on++) {
    delayMicroseconds(10000);
  }
  //turning the debug led off
  digitalWrite(timer_pin,LOW);
  //the second delay waits to turn it back on at the start of the loop
  for(volatile uint8_t waiting_off = 0; waiting_off < 5; waiting_off++) {
    delayMicroseconds(10000);
  }

  nh.spinOnce();
}

controller_splitter.py:

#!/usr/bin/env python
import rospy
from std_msgs.msg import Int16
from std_msgs.msg import Float32
from sensor_msgs.msg import Joy

# This ROS node converts joystick inputs from the joy node and publishes them on topics for red bot and blue bot

# declaring the global variables that hold values to be passed between functions
blue_left_thumb_value = 0
blue_right_thumb_value = 0
red_left_thumb_value = 0
red_right_thumb_value = 0
red_a_button_state = 0
blue_a_button_state = 0
blue_bumper_state = 0
red_bumper_state = 0

# function: controller_read
# purpose: Read in joystick values from the "joy" topic
def controller_read(data):
    ### declaring the global values for the button values to publish ###
    global blue_left_thumb_value
    global blue_right_thumb_value
    global red_left_thumb_value
    global red_right_thumb_value
    global red_a_button_state
    global blue_a_button_state
    global blue_bumper_state
    global red_bumper_state
    ### thumbstick values ###
    blue_left_thumb_value = data.axes[1]
    blue_right_thumb_value = -1.0*data.axes[2]
    red_left_thumb_value = blue_left_thumb_value
    red_right_thumb_value = blue_right_thumb_value
    ### button values ###
    red_a_button_state = data.buttons[0]
    blue_a_button_state = red_a_button_state
    blue_bumper_state = data.buttons[4]
    red_bumper_state = data.buttons[5]

# function: start
# purpose: Initializes everything and publishes the new controller values at a set rate
def start():


    ### declaring all the sensor global variables ###
    global blue_left_thumb_value
    global blue_right_thumb_value
    global red_left_thumb_value
    global red_right_thumb_value
    global red_a_button_state
    global blue_a_button_state
    global blue_bumper_state
    global red_bumper_state
    ### declaring all the publisher and subscriber global variables ###
    global blue_bumper_pub
    global red_bumper_pub
    global blue_a_pub
    global red_a_pub
    global blue_left_thumb_pub
    global red_left_thumb_pub
    global blue_right_thumb_pub
    global red_right_thumb_pub

    ### initializing the publishers for the two robots ###
    blue_bumper_pub = rospy.Publisher('/blue_bot/left_bumper', Int16, queue_size=1)
    red_bumper_pub = rospy.Publisher('/red_bot/right_bumper', Int16, queue_size=1)
    blue_a_pub = rospy.Publisher('/blue_bot/a_button', Int16, queue_size=1)
    red_a_pub = rospy.Publisher('/red_bot/a_button', Int16, queue_size=1)
    blue_left_thumb_pub = rospy.Publisher('/blue_bot/left_thumb', Float32, queue_size=1)
    blue_right_thumb_pub = rospy.Publisher('/blue_bot/right_thumb', Float32, queue_size=1)
    red_left_thumb_pub = rospy.Publisher('/red_bot/left_thumb', Float32, queue_size=1)
    red_right_thumb_pub = rospy.Publisher('/red_bot/right_thumb', Float32, queue_size=1)

    ### starting the node ###
    rospy.init_node('controller_node')

    ### getting the values from the other programs. Start all the other topics before the joystick ###
    ### so it will use the most recent command to update the desired velocity ###
    # subscribing to the joystick inputs on topic "joy" to get inputs from the controller
    joy_sub = rospy.Subscriber("joy", Joy, controller_read)
    r = rospy.Rate(20)
    while not rospy.is_shutdown():
        # publishing the button and thumbstick values
        blue_bumper_pub.publish(blue_bumper_state)
        blue_a_pub.publish(blue_a_button_state)
        blue_left_thumb_pub.publish(blue_left_thumb_value)
        blue_right_thumb_pub.publish(blue_right_thumb_value)


        red_bumper_pub.publish(red_bumper_state)
        red_a_pub.publish(red_a_button_state)
        red_left_thumb_pub.publish(red_left_thumb_value)
        red_right_thumb_pub.publish(red_right_thumb_value)

        r.sleep()

if __name__ == '__main__':
    start()

map_display_2.py:

#!/usr/bin/env python
# it's called pinta
import rospy
import sys
import cv2
import cv2.cv as cv
from sensor_msgs.msg import Image, CameraInfo
from cv_bridge import CvBridge, CvBridgeError
import numpy as np
import time
from numpy import nonzero
from math import degrees
from math import radians
from math import cos
from math import sin
from std_msgs.msg import Int16
from std_msgs.msg import Int32
from std_msgs.msg import Float32
import roslib
roslib.load_manifest('trying_again')

mapx = 0.0
mapy = 0.0
distance = 0.0
angle = 0.0
a_button_value = 0

# function: updateangle
# purpose: Read the pan angle from the "pan_servo_current" topic
def updateangle(angle_new):
    global angle
    # updating angle
    angle = angle_new.data

# function: updatemapvalues
# purpose: Read the measured distance from the "distance" topic
def updatemapvalues(distance_new):
    global distance
    # updating map distance
    distance = distance_new.data

# function: a_button_read
# purpose: Read the "a_button" topic
def a_button_read(data):
    global a_button_value
    a_button_value = data.data


def start():
    global mapx
    global mapy
    global distance
    global angle
    global a_button_value

    ### starting the node ###
    rospy.init_node('map_update')
    # subscribing to the distance, pan angle, and a button topics and setting the appropriate callbacks
    mapupdate_sub = rospy.Subscriber("distance", Float32, updatemapvalues, queue_size=1)
    panupdate_sub = rospy.Subscriber("pan_servo_current", Int16, updateangle, queue_size=1)
    a_button_sub = rospy.Subscriber("a_button", Int16, a_button_read, queue_size=1)
    # arrays for the running averages of the distance and angle values
    angle_mean = np.zeros((1,3))
    distance_mean = np.zeros((1,3))
    # kernel for the eroding and dilating routines
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE,(5,5))
    # loop rate
    r = rospy.Rate(60)
    # index for the number of runs
    index_runs = 0
    # multiplier for the intermittent cleanup
    n = 1
    while not rospy.is_shutdown():

        # loading the frame
        frame = cv2.cvtColor(cv2.imread('/home/zack/v1_robotws/src/trying_again/map_19.jpg'), cv2.COLOR_BGR2GRAY)
        # if the number of runs has gone beyond the threshold then do image processing to clean
        # up the map; cleans it every minute
        if (index_runs > n*3600):
            print "cleaning"
            # applying a threshold to the image to remove noise and clean it up
            ret,thresh = cv2.threshold(frame,30,255,cv2.THRESH_BINARY)
            # closing to fill in gaps faster
            mask = cv2.morphologyEx(thresh, cv2.MORPH_CLOSE, kernel)
            # bitwise-AND to apply the mask, dropping the pixels the mask removed
            filtered_image = cv2.bitwise_and(frame, frame, mask=mask)
            # updating the frame to be the filtered image


            frame = filtered_image
            # updating the iteration
            n = n + 1
        print index_runs
        # writing dots to show locations on a temporary image
        with_dots = cv2.cvtColor(frame, cv2.COLOR_GRAY2BGR)

        # getting the new averages
        for index_of in xrange(0, distance_mean.shape[1]-1):
            angle_mean[0,index_of] = angle_mean[0,index_of+1]
            distance_mean[0,index_of] = distance_mean[0,index_of+1]
        angle_mean[0,angle_mean.shape[1]-1] = angle
        distance_mean[0,distance_mean.shape[1]-1] = distance

        # each pixel is an inch
        mapx = np.mean(distance_mean,axis=1)/25.4 * cos(radians(float(145 - np.mean(angle_mean,axis=1))))
        mapy = np.mean(distance_mean,axis=1)/25.4 * sin(radians(float(145 - np.mean(angle_mean,axis=1))))

        # converting to integer pixel indices
        mapx = int(mapx)
        mapy = int(mapy)

        # clamping the coordinates to the image bounds
        if mapx > (frame.shape[1]-1):
            mapx = frame.shape[1]-1
        elif mapx < 0:
            mapx = 0
        if mapy > (frame.shape[0]-1):
            mapy = frame.shape[0]-1
        elif mapy < 0:
            mapy = 0

        # correcting the image to draw from the bottom right
        mapx = frame.shape[1]-1 - mapx
        mapy = frame.shape[0]-1 - mapy

        # print "new_depth:%r" % new_depth
        print "\nmapy,mapx: %r,\t%r" % (mapy,mapx)
        previous_depth = frame.item(mapy,mapx)
        # print "previous_depth: %r" % previous_depth
        new_depth = previous_depth + 25

        # if the new value is greater than 255 then set it to 255
        if new_depth >= 255:
            new_depth = 255


        # writing the values if a is pressed
        if (a_button_value > 0):
            frame.itemset((mapy,mapx), new_depth)
            print "\twrote cx,cy = %r,%r " % (mapx,mapy)
        # updating the index
        index_runs = index_runs + 1

        # print "new frame value %r" % frame[mapx,mapy]
        cv2.circle(with_dots,(mapx,mapy),2,(255,0,0),-1)
        cv2.circle(with_dots,(frame.shape[1]-1,frame.shape[0]-1),6,(0,0,255),-1)
        # saving the image with dots
        cv2.imwrite('/home/zack/v1_robotws/src/trying_again/temp_map.jpg', with_dots)
        # saving the map image
        cv2.imwrite('/home/zack/v1_robotws/src/trying_again/map_19.jpg', frame)
        r.sleep()

if __name__ == '__main__':
    start()

map_viewer_1.py:

import cv2
import numpy as np
import cv
import time

cv2.namedWindow('map')
t1 = 0
t2 = 0
while(1):

    # reading in the image
    read = cv2.imread('/home/zack/v1_robotws/src/trying_again/temp_map.jpg')
    # computing the frames per second
    t1 = t2
    t2 = time.time()
    time_diff = t2 - t1
    print "fps: %r" % (1.0/time_diff)

    if read is not None:
        # making the picture bigger
        bigger = cv2.resize(read,(1280,720))
        # displaying the map image
        cv2.imshow('map', bigger)

    # exit on the escape key
    if cv2.waitKey(150) & 0xFF == 27:
        break

cv2.destroyAllWindows()
