
ECE 4006

CONVOYbots Project Proposal

Anees Elhammali, Michael Malluck, John Parsons, Namrata Sopory

September 22nd 2003

Georgia Institute of Technology
College of Engineering
School of Electrical and Computer Engineering

Executive Summary

In the military as well as in scientific expeditions, environments are often deemed too hazardous for human presence. This project seeks to develop a remotely monitored, unmanned convoy of robots that could potentially be used for transporting materials over long distances in such circumstances. A base station will be used to remotely control and guide the lead robot for the convoy. Visual feedback obtained from the robot will ease this process. The lead robot will be made to communicate its path to slave robots over an 802.11b wireless connection. The slave robots will then follow, forming an unmanned convoy. Unique to this project is the use of Arcom's Olympus development board with Windows Embedded XP, which will function as the robot controller and interface to the wireless link.

Table of Contents

1. Introduction
2. Project Technical Details
3. Tasks and Schedule
4. Proposed Demonstration
5. Marketing and Cost Analysis
6. Bibliography

Introduction

Today, numerous development boards with an array of embedded software capabilities are available on the market. These boards can be interfaced with hardware to achieve a wide range of functionality at varying costs. The aim of this project is to use three Arcom Olympus boards (running the Windows Embedded XP operating system), each mounted atop an Amigobot robot, to a.) control the movement of the robot, b.) establish wireless links with other boards using a wireless game adapter featuring the 802.11b protocol, and c.) transmit the path traversed by a lead robot to slave robots, causing them to follow, thus forming an unmanned convoy. Convoys of this nature may find use in military or scientific expedition scenarios where environments are deemed too harsh or dangerous for human presence. The Olympus boards will thus be used as robot controllers.

This project will be divided into different phases. In each phase, the tasks to be accomplished will be modularized and assigned to one or more team members.

The first phase of the project is research and study. Substantial research and study will be undertaken by all team members a.) to determine the feasibility of each goal, b.) to identify technology, code, and documentation that can be leveraged to accomplish project goals, and c.) to allow team members to familiarize themselves with the language that will be used to code the project.

The second crucial phase of the project involves the development of code. This phase will be modularized to address different aspects of the project. The first among these is the development of a robot control module that will run on the Olympus board. The module will be used to interface the board with the robot and control its movement in a given direction. To ensure that correct signaling protocols are used, the Amigobot manual [1] and code written in previous design projects will be studied. The second important task is the development of a standard module to enable wireless communication between two Olympus boards over the 802.11b protocol using the wireless game adapter. To this end, the features of the game adapter and the Ethernet protocol requirements will be analyzed. The third task in this phase involves determining an algorithm to guide the lead robot. A control station (laptop) will be used to remotely control the lead robot based on this algorithm. This calls for the development of a graphical user interface that will run on the laptop. A fourth aspect of this phase is to get visual data from a CMUcam mounted on the first robot and transmit it to the control station.
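As a rough illustration of the intended modularity, the robot control module might expose a small interface like the sketch below. The interface and method names are our own assumptions, not the actual Amigobot command set, which will be taken from the manual [1].

    // Hypothetical interface for the robot control module on the Olympus board.
    // A concrete implementation would translate these calls into the serial
    // commands documented in the Amigobot manual [1].
    public interface RobotController {
        void drive(int distanceMm);   // move forward (or backward) a given distance
        void turn(int degrees);       // rotate in place by a given angle
        void stop();                  // halt the robot immediately
    }

The wireless communication and camera modules would be specified in the same way, so that each can be developed and tested independently.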

The third phase of the project will involve testing and debugging individual modules of code. This will be followed by the fourth phase, in which the working modules will be integrated, tested, and revised if the need arises. Some time will also be spent earlier on developing hardware to mount the various components (board, network adapter, and voltage regulators) on top of the Amigobots.

If time permits, we will work on our secondary goal, which is implementing an obstacle-detection algorithm for the robots in the convoy.

Project Technical Details

A block diagram of the convoy system is shown below in Figure 1. The system will be composed of four physical parts. The user will operate the convoy with a laptop and a wireless hub. Three robots, called Bot1, Bot2, and Bot3, will make up the convoy. Bot1 will be the first robot in the convoy, and Bots 2 and 3 will follow.

When the system is turned on, the laptop will establish communication with Bot1 through the hub. On the user's command, Bot1 will traverse a path while simultaneously storing details of its movement. The camera on Bot1 will be used to snap pictures of the environment, which will be sent back to the laptop at regular intervals. This visual feedback will help the user guide Bot1 further.
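One simple way Bot1 could store its movement details is as an ordered list of path segments. The sketch below is only an assumption about the eventual data format; the class and field names are illustrative.

    // Hypothetical record of one movement step, logged by Bot1 as it drives.
    // Serializable so it can later be sent to the followers over the network.
    public class PathSegment implements java.io.Serializable {
        public final int headingDeg;   // heading change made before driving
        public final int distanceMm;   // distance driven along that heading

        public PathSegment(int headingDeg, int distanceMm) {
            this.headingDeg = headingDeg;
            this.distanceMm = distanceMm;
        }
    }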

To set the convoy underway, the laptop will "wake up" and establish communication with Bots 2 and 3. Bot1 will do the same. Bot1 will then transmit its path and movement details to Bot2 over the 802.11b wireless Ethernet link. Should Bot2 catch up to Bot1, it will stop and wait for further instructions from Bot1, which will arrive once Bot1 moves again and gets back ahead of Bot2.
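The path could be carried over an ordinary TCP socket on the 802.11b link. The following sketch of a sender running on Bot1 is only an assumption about the eventual protocol; the port number, serialization format, and class names are ours.

    import java.io.ObjectOutputStream;
    import java.net.Socket;
    import java.util.List;

    // Hypothetical sender on Bot1: pushes logged path segments to a follower
    // over the wireless Ethernet link. Host and port are placeholder values.
    public class PathSender {
        public static void send(String followerHost, List<PathSegment> path) throws Exception {
            Socket socket = new Socket(followerHost, 5000);
            try {
                ObjectOutputStream out = new ObjectOutputStream(socket.getOutputStream());
                out.writeObject(path);   // serialize the segments to the follower
                out.flush();
            } finally {
                socket.close();
            }
        }
    }

A matching receiver on Bot2 would read the segments with an ObjectInputStream and replay them through its robot control module.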

Once Bot2 is under way, Bot3 can be activated at any point by the user. It will begin to operate in the same manner as Bot2 with the exception that it will trail Bot2 rather than Bot1. It will receive path information from Bot1 only as Bot1 receives confirmation from Bot2 that a segment of the path has been completed. This will keep the convoy in order and prevent the robots from getting in each other’s way.
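To preserve this ordering, Bot1 needs some simple bookkeeping: a segment is released to Bot3 only after Bot2 confirms completing it. A minimal sketch of that logic, with illustrative names of our own, might be:

    import java.util.LinkedList;
    import java.util.Queue;

    // Hypothetical bookkeeping on Bot1: segments sent to Bot2 are queued, and a
    // segment is released to Bot3 only once Bot2 has confirmed completing it.
    public class ConvoyCoordinator {
        private final Queue<PathSegment> awaitingBot2 = new LinkedList<PathSegment>();

        public void segmentSentToBot2(PathSegment segment) {
            awaitingBot2.add(segment);    // Bot2 is now working on this segment
        }

        public PathSegment bot2ConfirmedSegment() {
            return awaitingBot2.poll();   // this segment may now be sent to Bot3
        }
    }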

The user front end will consist of the GUI depicted in Figure 2. Each of the 'Start Bot' buttons will start the corresponding robot, which will then behave in the manner described previously. When one of the 'Stop Bot' buttons is selected, the corresponding robot will be stopped immediately. Any robots behind it will move normally until they catch up to the stopped Bot, at which point they too will stop. Any Bots ahead of the stopped Bot will not be affected. The 'Stop All' button will stop all three Bots immediately. The pictures received from Bot1 will be displayed in the visual feedback window. Finally, indicators on the right side of the GUI will turn red whenever a Bot is stopped because of a detected obstacle.
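A minimal Swing sketch of this front end, based only on the layout described above, could look like the following. The button labels and window layout are assumptions drawn from Figure 2, and the handlers that actually command the Bots are omitted.

    import javax.swing.*;
    import java.awt.*;

    // Minimal sketch of the control-station GUI: start/stop buttons on the left,
    // a visual feedback area in the center. Command handlers are left out.
    public class ConvoyGui {
        public static void main(String[] args) {
            JFrame frame = new JFrame("CONVOYbots Control Station");
            JPanel buttons = new JPanel(new GridLayout(0, 2));
            for (int i = 1; i <= 3; i++) {
                buttons.add(new JButton("Start Bot" + i));
                buttons.add(new JButton("Stop Bot" + i));
            }
            buttons.add(new JButton("Stop All"));
            JLabel feedback = new JLabel("Visual feedback window", SwingConstants.CENTER);
            frame.getContentPane().add(buttons, BorderLayout.WEST);
            frame.getContentPane().add(feedback, BorderLayout.CENTER);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setSize(640, 480);
            frame.setVisible(true);
        }
    }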

If this functionality is completed with time to spare, work will begin on an algorithm allowing the Bots to navigate around an obstacle and get back on the desired path. This would allow the convoy to reach its final destination unassisted even if unexpected obstacles are encountered.

The anticipated hardware requirements for this project are listed in Table 1.

Table 1. Anticipated Project Hardware Requirements

Part                                       Part Details                                    Source             Qty
CMU Cams                                   --                                              --                 3
Amigobots                                  --                                              --                 3
Olympus Dev. Kit                           OLYMPUS SBC-GX1 (Pegasus) with Windows CE (.NET) Arcom             1
Olympus Boards                             OLYMPUS M0-16 (with 16 Mb onboard flash)        Arcom              2
Network Adapter (wireless game adapter)    WGA11B                                          Linksys (Amazon)   3
Wireless Router (802.11b)                  BEFW11S4                                        Linksys (Amazon)   1
Voltage Regulators                         LSN-5/10-D12                                    Datel              3

It should be noted that Java will be used to program the Amigobots. One point of concern is the speed at which Java will execute. Java, being an interpreted language, tends to be slower than lower-level languages such as C. This cut in performance has been weighed against the ease of coding in a higher-level language, and it was decided that more could be accomplished by working in Java.

Tasks & Schedule

Time management will play an important role in the development of this project. To complete the project successfully, it will be broken up into smaller tasks. These tasks, as well as the people responsible for them, can be seen in the Gantt chart below.

Figure 3. Gantt Chart Showing Projected Timeline.

Due to the modularity of the project, tasks such as coding the robot driver, camera driver, and Ethernet communication driver may be done in parallel. Part of this is because Java is a high-level language and allows a greater degree of independence between the hardware and software aspects of the design. Also, the object-oriented nature of Java makes it easy to break large tasks down into a series of smaller ones.

One of the key factors to successfully completing this project is organization and documentation. Substantial amounts of code can be written effectively only if dependent functions being written in parallel operate as specified. Before actually coding the various modules, it will be necessary for each method's function, inputs, outputs, and global variables to be well documented. This way, anyone depending on that code will know how it is expected to function and can begin coding even if the other group is behind schedule.
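For example, a module boundary agreed on before coding starts might be documented as a Javadoc-commented interface. The camera driver below is purely illustrative; the names are not part of the actual design.

    /**
     * Hypothetical camera-driver contract, documented up front so the team
     * writing the control-station code can work against it while the driver
     * itself is still being written.
     */
    public interface CameraDriver {
        /**
         * Captures one frame from the CMUcam.
         *
         * @return the raw image bytes, ready to be sent to the control station
         * @throws java.io.IOException if the camera cannot be read
         */
        byte[] captureFrame() throws java.io.IOException;
    }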

It should be noted that there are some tasks which need to be performed in sequential order. These are potential stumbling blocks that can hold up the project. The first of these is the coding of the individual hardware drivers. These drivers are the core of what will be used to control the robot; without them, coding the main algorithm is very difficult. Another point of concern is the second half (back end) of the user interface, which is in turn dependent on the completion of the main algorithm. After the first robot is completed, the user interface can be integrated into the design.

It should be noted that the timeline depicted above is a proposed timeline and is therefore subject to change. Also, depending on the progress each team member makes on his or her assigned task, support will be provided to other team members on an as-needed basis.

Proposed Demonstration

The robot convoy project will be tested in an empty classroom in Van Leer. The environment will contain large obstacles such as boxes and chairs. The laptop control station will be set up in the room. The wireless hub will also be in place to allow the establishment of a wireless link to the master and slave robots. Using the project's user interface on the laptop, the lead robot (Bot1) will be guided around the room along a random path. Bots 2 and 3 will be expected to follow the same path with a reasonable time delay. The convoy will also be expected to stop correctly when directed to do so by the control station.

Marketing and Cost Analysis

Unmanned vehicles have a wide range of applications that drive a large sector of today's technology market. The largest portion of these applications deals with reducing the risk to humans driving vehicles in dangerous situations and reducing the cost associated with having humans actually operate these vehicles. Our product will have a competitive edge in this market because it could serve as a starting tool for any technology that aims to achieve such goals.

The primary target for our product is the military. The Department of Defense has requested over a billion dollars from the US Senate to fund the military transformation project, which contains a variety of projects that use unmanned vehicles [2]. Our product could be marketed as the starting point for unbounded research in such projects. The approximate cost of our project, as shown in Table 2, is $9600. The cost could be lower if components were purchased in larger quantities. We believe the cost of our product is affordable compared to the time and effort it saves anyone who wants to incorporate such technology into larger projects. Also, the long lifetime of such a product would enable recovery of the initial product cost.

Table 2. The Estimated Cost of the Robot Convoy

Component                                   Cost (in dollars per unit)
Amigobot Robot                              2000
Arcom Olympus board                         1000
CMU Camera                                  150
Linksys WGA11B wireless game adapter        50
Linksys BEFW11S4 Wireless Router            70

One component cost (one complete robot)  = $3200
Intermediate cost (three robots)         = $9600
TOTAL COST                               ~ $9670

Although substantial research has been done in the field of sensing and communication for robot convoy navigation [3], this may be the first such convoy using the Amigobot interfaced with an Olympus board running the Windows Embedded XP operating system. As such, there is no existing product like this. According to research done by a UCLA team of engineers, much effort has gone into getting robots to collaborate on tasks in uncertain environments over the past years, without much success [4]. One reason for this may have been the tools used to transmit and interpret data. With this project, we aim to introduce the Olympus board, Windows Embedded XP, the Amigobot, and the 802.11b standard as a platform for achieving results in endeavors involving collaborative robots. We aim to provide a solid starting-point product for all projects that need to deploy unmanned robots in uncertain environments.

Bibliography

[1] ActivMedia Robotics, "AmigoBOT Technical Manual," 2000. Available HTTP: http://www.ece.gatech.edu/research/labs/diglab/downloads/AmigoTech.pdf

[2] Department of Defense Transformation Project, Available HTTP: http://www.defenselink.mil/specials/transform/

[3] G. Dudek, M. Jenkin, E. Milios, and D. Wilkes, "Experiments in sensing and communication for robot convoy," International Conference on Intelligent Robots and Systems (IROS), pp. 268-273, August 1995.

[4] M.J. Mataric, G.S. Sukhatme, and E.H. Ostergaard, "Multi-robot task allocation in uncertain environments," Autonomous Robots, vol. 14, no. 2-3, pp. 255-263, May 2003.
