The Utility of Robot Sensory Devices in a Collaborative Autonomic Environment

Catherine Saunders, Roy Sterritt, George Wilkie
School of Computing and Mathematics
University of Ulster
Northern Ireland
[email protected], [email protected], [email protected]

Abstract - This paper proposes an Autonomic architecture that will enable mobile robots to self-manage and collaborate by using control loops to monitor their internal state and external environment. The Autonomic Computing MAPE-K control loop is used to design a Robot Autonomic Element and a Robot Mapping Autonomic Element; each can exchange data and collaborate to find an object located within a room. A review of the sensor capabilities of the X80-H mobile robot platform is undertaken, with emphasis on how useful each sensor will be to the proposed research. A literature review of other projects that feature robot collaboration is also included.

Keywords - autonomic computing; mobile robot; Dr Robot X80-H; collaboration; MAPE-K

I. INTRODUCTION

In 2001 IBM announced its Autonomic Computing Initiative [1] as a solution to the ever-growing complexity inherent in modern computer systems [2][3]. Autonomic Computing seeks to solve the problems that occur when a system becomes too large to manage. System self-management is essential in order to reduce the amount of human involvement required [4].

The purpose of this research is to investigate how mobile robots could use Autonomic Computing concepts to collaborate and achieve a common goal. To collaborate effectively, robots need to be internally self-aware and aware of their external environment [5]. A swarm of mobile robots, with each robot representing an Autonomic Element (AE), makes up a larger Autonomic System. The relationship between these Autonomic Elements, and how they interact and share information in order to collaborate, is the focus of this research.

The research project will involve the design of an Autonomic system with multiple self-managing entities capable of exchanging data and collaborating with each other [5]. The aim is to have each robot operating within an enclosed environment and have them collaborate to find an object. During the search for the object, they must map their environment and relay meaningful information to each other.

In this paper we discuss the sensor capabilities of the Dr Robot X80-H platform that will be used for the research. A design of the proposed system is included and explained, along with a brief literature review of robotic collaborative systems. The Research Background section looks at why self-managing software systems are needed.

II. ROBOT SENSORY DEVICES

This section gives an overview of the sensors that the Dr Robot X80-H is equipped with, assessing how reliable they are and how useful they will be for our research. We will be using four Dr Robot X80-H mobile robots to demonstrate the autonomic architecture and software that will be developed for the research. The X80-H platform is a modified version of the X80 mobile robot; it features an X80 base with an animated head. The head section has two eyes, with a camera integrated into the right eye. The robot is a differential drive design and has two 7-inch wheels controlled by separate 12V motors. Its maximum speed is 1 m per second, but tests have shown that moving at this speed and then stopping abruptly can cause the robot to topple over.

The robot has a wireless antenna and is operated by sending instructions from a PC that is connected to a Dr Robot wireless router. Software cannot be downloaded directly to the robot; this is quite a typical setup for a mobile robot. The PC is able to send instructions to the robot and receive data back from it.

To develop an application, the WiRobot ActiveX component must be added to Visual Studio. By creating an instance of this COM object on a Windows Form, it is then possible to use the Dr Robot SDK to instruct the robot. To send the information to the X80-H via the router, the WiRobot Gateway must also be used; it is a utility program that runs on the host PC and connects the PC to the X80-H using the IP address of the robot.

A. Ultrasonic

The X80-H's three ultrasonic sensors are positioned on the front of the base unit and can detect an object within the range of 4-255 cm. If an object is closer than 4 cm, the distance is reported as 4 cm [6]. The ultrasonic sensor works by emitting a sound wave and measuring how long it takes for the echo to return after bouncing off an object. The sensor consists of two round components: a speaker that sends the wave and a microphone that receives the echo. We have found the sensors to be very accurate and have used them for basic collision detection.
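As an illustration of the time-of-flight principle described above, the minimal sketch below converts a round-trip echo time into a distance clamped to the 4-255 cm band reported by the X80-H. It is illustrative only: the constant and function names are our own, and the actual conversion is performed by the sensor hardware and the Dr Robot SDK, not by application code.

```python
# Illustrative time-of-flight calculation for an ultrasonic range sensor.
# All names here are hypothetical; they are not part of the Dr Robot SDK.

SPEED_OF_SOUND_CM_PER_S = 34_300     # approx. speed of sound in air at 20 C
MIN_RANGE_CM, MAX_RANGE_CM = 4, 255  # X80-H ultrasonic detection range [6]

def echo_time_to_distance_cm(echo_time_s: float) -> int:
    """Convert a round-trip echo time into a clamped distance in cm.

    The wave travels to the object and back, so the one-way distance is
    half of speed * time. Readings under 4 cm are reported as 4 cm,
    mirroring the behaviour described for the X80-H.
    """
    distance = (SPEED_OF_SOUND_CM_PER_S * echo_time_s) / 2
    return int(min(max(distance, MIN_RANGE_CM), MAX_RANGE_CM))

# Example: a 2 ms round trip corresponds to roughly 34 cm.
print(echo_time_to_distance_cm(0.002))  # -> 34
```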
B. Infrared

The X80-H is equipped with eight Infrared (IR) sensors that are capable of detecting the distance to an object within the range of 8-80 cm. To test the sensors, we created a sensor event method that receives continuous data from the sensors and displays it on a Windows Form. The IR sensor works by continuously emitting a beam of light, which is reflected back when it encounters an object. The reflected light is picked up by the IR sensor's detector, forming a triangle between the emitter, the object and the detector. The angle of this triangle is then measured and used to calculate the distance to the object [7].

The Dr Robot manual states that if the data returned from an IR sensor is >=3446 then the object is 0-8 cm away; if it returns a value between 595 and 3446 then the object is 8-80 cm away; and if it is <=595 then the object is outside the range of detection. The data that is returned is nonlinear; it is therefore necessary to calculate the actual distance in centimetres using an Analogue to Digital conversion method created by Dr Robot. Displaying the IR data as a distance instead of the raw value is more user friendly and easily understood.

We tested the sensors and noticed that the accuracy varied. For example, an object was placed approximately 20 cm from the robot's IR7 sensor, which is located on the left side of the robot. The value returned was 3545, which was then converted to the more meaningful value of 9 cm. An 11 cm difference between the actual distance of an object and the distance reported could cause problems if the robots were operating in close proximity to one another. The IR sensors have proven to be less accurate than the ultrasonic sensors, which tend to give consistent readings. Further testing is required to ascertain whether different lighting conditions and reflective objects give more or less accurate results.
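To make the manual's threshold figures concrete, the sketch below classifies a raw IR reading and applies a stand-in conversion for the 8-80 cm band. The names are hypothetical, and the linear map is only a placeholder for demonstration; the real response curve is nonlinear, and actual distances come from Dr Robot's Analogue to Digital conversion method.

```python
# Illustrative interpretation of a raw IR reading, based on the threshold
# values quoted from the Dr Robot manual. The linear interpolation below
# is a stand-in only; the real raw-to-cm conversion is nonlinear and is
# supplied by Dr Robot's own conversion method.

IR_NEAR_RAW = 3446  # raw values at or above this: object within 8 cm
IR_FAR_RAW = 595    # raw values at or below this: out of range (> 80 cm)

def approximate_distance_cm(raw: int) -> float:
    # Placeholder linear map from raw (595..3446) to distance (80..8 cm).
    # It only shows the direction of the relationship: a higher raw
    # value means a closer object.
    span = IR_NEAR_RAW - IR_FAR_RAW
    return round(80 - (raw - IR_FAR_RAW) / span * (80 - 8), 1)

def classify_ir_reading(raw: int) -> str:
    if raw >= IR_NEAR_RAW:
        return "0-8 cm (too close to resolve)"
    if raw <= IR_FAR_RAW:
        return "out of range (> 80 cm)"
    return f"approx. {approximate_distance_cm(raw)} cm"

print(classify_ir_reading(3545))  # -> "0-8 cm (too close to resolve)"
print(classify_ir_reading(2000))  # somewhere in the 8-80 cm band
```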
C. Human

The X80-H has two human sensors [8] that are situated on the base unit, facing upwards. The sensors can detect humans at a distance of 5 m and human movement to a distance of 1.5 m [9]. The human sensors are passive infrared (PIR) sensors that detect levels of infrared radiation emitted by humans; unlike the eight IR sensors on the X80-H, they do not emit an infrared beam. The return value for the sensors is between 0 and 4095. When there is no human present, the left and right sensor data fluctuates between approximately 2000 and 2040. When a human is present the data drops to between 1700 and 2000; the data is constantly changing, and it is difficult to obtain a figure that stays the same when a human is not moving.

The Human Motion sensor detects movement. Like the Human Alarm, it returns a figure of approximately 2000-2040 when no human is detected and then fluctuates rapidly between 0 and 4095 when there is movement. Our testing produced readings of 1700-2400 when a person moved in front of the sensors. The purpose of the human motion sensors is to track which direction a human is moving. One way of doing this would be to compare the before and after readings from both sensors against a threshold range representing 'no movement'; it would then be possible to determine whether movement had occurred in front of either sensor by comparing the new value with the old value. If someone were to walk in front of both sensors, from the left side of the robot to the right side, the left human motion sensor should detect this first, making it possible to deduce the intended direction (see the sketch below).

To detect whether a human is present, the data from the Human Alarm sensors should be compared to the threshold range. To test the sensors, we created a test that checked whether the left human alarm sensor data was within the range of 1900-2200; if it was, a label displayed "No Humans", and if the sensor value dropped below or rose above this range, the label changed to "Humans Detected". The sensor data did change as expected when a human was in front of the sensor. The sensors are limited in that they cannot differentiate between people, and cannot tell whether there is more than one person or how close a person is to the sensor.

D. GPS

Our version of the X80-H robot is equipped with an indoor GPS sensor that uses landmarks placed on the ceiling to calculate the position of the robot. The GPS sensor in the
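As a concrete illustration of the human-sensor logic described in Section II-C, the sketch below applies the 1900-2200 idle band from our Human Alarm test and infers the direction of travel from which motion sensor deviates from that band first. The function names and the sampling scheme are illustrative assumptions, not part of the Dr Robot SDK.

```python
# Illustrative presence-detection and direction-inference logic for the
# X80-H human sensors. Threshold values follow the idle band used in our
# testing; all names here are hypothetical.

NO_MOVEMENT_LOW, NO_MOVEMENT_HIGH = 1900, 2200  # idle band from our test

def human_present(alarm_value: int) -> bool:
    """A Human Alarm reading outside the idle band suggests a person."""
    return not (NO_MOVEMENT_LOW <= alarm_value <= NO_MOVEMENT_HIGH)

def movement_direction(left_values: list[int], right_values: list[int]) -> str:
    """Guess direction of travel by which motion sensor deviates first.

    left_values/right_values are time-ordered samples from the left and
    right human motion sensors. A person crossing left-to-right should
    disturb the left sensor before the right one.
    """
    def first_disturbance(samples):
        for i, value in enumerate(samples):
            if not (NO_MOVEMENT_LOW <= value <= NO_MOVEMENT_HIGH):
                return i
        return None

    left_i = first_disturbance(left_values)
    right_i = first_disturbance(right_values)
    if left_i is None and right_i is None:
        return "no movement"
    if right_i is None or (left_i is not None and left_i < right_i):
        return "left to right"
    return "right to left"

print(human_present(1750))  # -> True (below the idle band)
print(movement_direction([2010, 1700, 900], [2020, 2015, 1600]))
# -> "left to right" (left sensor deviates one sample earlier)
```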