Optical Guidance Method for Robots Capable of Vision and Communication

Igor Paromtchik, Member, IEEE
The Institute of Physical and Chemical Research (RIKEN)
2-1 Hirosawa, Wako-shi, Saitama 351-0198, Japan
Fax: +81 48-467-7248, Email: [email protected]
(Regular Paper)

Abstract— The optical guidance of robots spans the research topics of robotics, computer vision, communication and real-time control. The proposed method aims to improve the accuracy of robot guidance along a desired route in an environment that is unknown to the robot. The key idea is to indicate the target position by projecting a laser light onto the ground. In contrast with other guidance methods, which communicate the numerical coordinates of the target position, using optical commands avoids the need for a coordinate transformation between the robot's coordinate system and that of the environmental model ("world" reference coordinates). The visual feedback and communication ensure that the robot accurately follows the route indicated by laser beacons, and self-localization becomes less relevant for guidance. The experimental results have proved the effectiveness of this method.

Index Terms— Mobile robot, optical guidance, laser.

I. INTRODUCTION

Guidance means to show the way while in motion, and localization is to confine within a particular area [1]. Guidance is passive if no localization of the guided objects is performed (e.g. a lighthouse guiding ships), and is active when it involves communication with the guided objects. This paper introduces an optical guidance method which represents an active guidance concept. The method has been developed to guide mobile and humanoid robots in an environment that is unknown to the robots.

The motivation of this work originates in our experiments on teleoperation of wheeled robots, where the robot pose (position and orientation) is obtained from dead-reckoning and is transmitted to the teleoperation system in order to update the robot pose in the environmental model. The human operator uses this model to remotely control the robot. However, the accumulation of positional and orientational errors, caused by the wheels sliding on the ground and by inaccurate modeling, results in a discrepancy between the actual robot pose in the environment and its estimate in the model. The accumulation of this discrepancy over time makes teleoperation impossible from some instant because of the danger of collision with the environment.

The proposed guidance method aims to eliminate this discrepancy from the process of guidance [2], [3]. The novelty of the work described in this paper is supported by two patents [2], [4]. The key idea of the method is to show the target position instead of commanding it numerically. This is achieved by means of projecting a laser light onto the ground, as sketched in Fig. 1 in section III. The optical guidance system operates with the environmental model and comprises a computer-controlled laser pointer which directs a laser beam onto desired positions. The guidance system communicates with the robot when indicating its target position and subsequently checks whether the robot has attained this position. The robot's vision system processes the color images in order to detect the laser beacon on the ground and evaluate its relative coordinates. The robot's controller drives the vehicle toward the beacon. The guidance system controls the orientation and lighting of the laser in order to indicate target positions, one at each instant along the planned route. When the robot reaches the proximity of the beacon, the guidance system shows the next target position, and the robot continuously follows the path. The visual feedback ensures that the robot accurately follows the indicated positions.

The main advantage of this method is the improved accuracy of guidance. It also allows implicit localization of the robot within the environment: when the robot has reached its indicated target position and has confirmed this to the guidance system, an adequate estimate of its coordinates in the environmental model is known. Since the control system of the robot operates with the relative coordinates of target positions obtained from image processing, the transformation between the coordinate system of the environmental model ("world" reference coordinates) and that of the robot, as well as self-localization, become less relevant for guidance.
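The robot-side loop described above (detecting the laser beacon in a color image, recovering its coordinates relative to the vehicle, and steering toward it) can be sketched roughly as follows, assuming Python with NumPy. The color thresholds, the flat-ground camera projection and the proportional controller are illustrative assumptions for a forward-looking camera at a known height and tilt; they are not taken from the implementation reported in the paper.

```python
import numpy as np

def detect_beacon(rgb_image, red_min=200, dominance=60):
    """Return the (row, col) centroid of pixels that look like a red laser spot,
    or None if no such pixels are found."""
    r = rgb_image[:, :, 0].astype(int)
    g = rgb_image[:, :, 1].astype(int)
    b = rgb_image[:, :, 2].astype(int)
    mask = (r > red_min) & (r - g > dominance) & (r - b > dominance)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def pixel_to_ground(row, col, cam):
    """Project an image point onto the floor (flat-ground assumption) for a pinhole
    camera tilted downward. Returns (x, y) in the robot frame, in meters:
    x forward, y to the left."""
    u = (col - cam["cx"]) / cam["fx"]      # normalized horizontal image coordinate
    v = (row - cam["cy"]) / cam["fy"]      # normalized vertical image coordinate
    tilt = cam["tilt"]                     # downward tilt of the optical axis [rad]
    d_forward = np.cos(tilt) - v * np.sin(tilt)
    d_down = np.sin(tilt) + v * np.cos(tilt)
    t = cam["height"] / d_down             # ray scale at which the floor is hit
    return t * d_forward, -t * u

def drive_toward(x, y, v_max=0.3, k_turn=1.5, reach_radius=0.10):
    """Proportional heading controller: return (linear, angular) velocities that
    steer the robot toward the beacon and stop inside the proximity radius."""
    distance = float(np.hypot(x, y))
    if distance < reach_radius:
        return 0.0, 0.0                    # beacon reached: confirm and wait for the next one
    heading_error = float(np.arctan2(y, x))
    linear = v_max * max(0.0, float(np.cos(heading_error)))
    angular = k_turn * heading_error
    return linear, angular

# Example with a synthetic image containing a single bright red pixel.
camera = {"fx": 500.0, "fy": 500.0, "cx": 320.0, "cy": 240.0,
          "height": 0.6, "tilt": np.radians(35.0)}
image = np.zeros((480, 640, 3), dtype=np.uint8)
image[400, 360] = (255, 20, 20)
spot = detect_beacon(image)
if spot is not None:
    bx, by = pixel_to_ground(*spot, camera)
    v, w = drive_toward(bx, by)
```

The proximity test in drive_toward marks the moment at which the robot would confirm arrival to the guidance system so that the next beacon can be shown.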
The communication ability and the updating of the environmental model by the guidance system allow us to use this system as a mediator for cooperative multiple robots [5]. For instance, the sensor data gathered by the robots and stored in the environmental model is available to all robots in the fleet, i.e. cooperative knowledge acquisition and sharing can be achieved. The distribution of tasks and their allocation to the robots is performed with the use of the environmental model as a part of the guidance system. One robot can request the system to guide another robot to a specified destination.

This paper focuses on the optical guidance method and its feasibility, demonstrated on the example of laser guidance of a mobile robot. The paper is organized as follows. The related works on guidance of robots are discussed in section II. The concept of optical guidance and the kinematic models are presented in section III. The operation of the developed guidance system and the communication diagrams are considered in section IV. The path computation by the guidance system and the motion generation by the robot are explained in section V. The implementation and experimental results are described in section VI. The conclusions are given in section VII. The guidance of cooperative multiple robots, mapping of the environment, tracking control and self-localization by the robot are intentionally left beyond the scope of this paper.

II. RELATED WORKS

Robot guidance involves various tasks such as teleoperation and communication, environment modeling and motion planning, image processing and fusion of sensor data, and real-time control for path tracking and navigation. This section deals with the methods which are closely related to optical guidance, while reviews of other navigation methods and sensors can be found in [6] and [7].

Simultaneous localization and mapping (SLAM) improves the performance of a mobile robot while navigating in an uncertain or unknown environment [8], [9], [10], [11]. The robot explores the environment, builds the map and, concurrently, navigates without external guidance. The target position is commanded explicitly (in most cases, numerically) in a reference coordinate system. SLAM serves to maintain an accurate transformation between the robot's coordinate system and the reference one, which is required in order to enable the robot to attain the given target position. In contrast with SLAM, the proposed optical guidance method communicates the target position implicitly (optically), which allows us to avoid the transformation between these coordinate systems and the need for self-localization by the robot.

Visual servoing makes use of image processing to control the robot relative to a landmark [12], [13]. The target position is specified by the target image captured from this position. The objective is to achieve convergence of the current image taken by the robot toward the target one by means of controlling the robot motion. As a result, the robot attains its target position. The optical guidance method differs from visual servoing because it produces the laser beacons which indicate a feasible route for the robot in the environment. Besides, communication between the robot and the guidance system ensures that the next target position is commanded when the robot has reached the proximity of its current one.

In another approach, visual sensors distributed in the environment monitor humans and robots and guide them in a networked environment. Each mobile robot is identified by a color bar code stored in the database. The robot can be localized by one of the visual sensors by means of measuring the distance between the robot and the sensor. According to the desired path and the estimated pose, the control command is transmitted to the robot via a wireless LAN. Guidance based on localization and communication is achieved.

A system with a laser attached to one of two Canon VC-C1 communication cameras placed on a robot manipulator is described in [16]. The laser is precisely aligned with the optical axis of its camera and is centered over the top of the camera lens. This system makes it possible to measure the distance to the object being viewed. The reported accuracy of the distance calculation between the camera and the laser spot on the surface is roughly to the nearest tenth of an inch.
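To make the geometry of such a parallel laser-and-camera arrangement concrete, a similar-triangles relation links the image offset of the laser spot to the distance of the viewed surface. The sketch below is a generic triangulation example with made-up numbers; it is not the calibration or algorithm of the system reported in [16].

```python
def range_from_parallel_laser(pixel_offset, focal_length_px, baseline_m):
    """Distance to the surface hit by a laser beam mounted parallel to the camera's
    optical axis at a small, known separation (the baseline).

    pixel_offset:    offset of the detected spot from the principal point [pixels]
    focal_length_px: focal length expressed in pixels
    baseline_m:      separation between the laser axis and the optical axis [m]
    """
    if pixel_offset <= 0:
        raise ValueError("the spot must be offset from the principal point")
    # Similar triangles: pixel_offset / focal_length == baseline / range
    return focal_length_px * baseline_m / pixel_offset

# Example: a 3 cm baseline, an 800-pixel focal length and a 12-pixel offset
# correspond to a range of 2.0 m.
print(range_from_parallel_laser(pixel_offset=12.0, focal_length_px=800.0, baseline_m=0.03))
```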
A computer-servoed laser pointer projects the spot of light at user-specified locations on the surface of interest for camera-space manipulation in [17]. The light spots serve as common camera-space points which are necessary to establish the mapping between the cameras viewing the workspace of the manipulator. High precision of positioning and orienting the end effector of the manipulator is achieved when the method of camera-space manipulation is used.

Numerous publications address the tracking of a reference path by a mobile robot. The reference path is specified by a guidance line or, for example, magnetic markers in the ground. An on-board displacement sensor measures the tracking error, and the controller issues the adequate steering commands [18]. In contrast, the proposed method focuses on guidance, i.e. providing a feasible reference path for the robot. Tracking the laser beacons in the optical guidance method relies on communication between the guidance system and the robot, which ensures that the next target position is indicated when the robot has reached the proximity of its current target position and has confirmed this to the guidance system. As a result, the guidance accuracy is improved. Besides, the method allows us to flexibly modify the reference path.

The advantage of the proposed optical guidance method is due to the following features: the target positions are indicated precisely in the environment by means of computer-servoing the laser pointer or pointers situated in the environment.
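As a rough illustration of the indicate-and-confirm exchange discussed in the paragraph on beacon tracking above, the following guidance-side sketch advances along the route only after the robot has confirmed reaching the current beacon. The LaserPointer and RobotLink interfaces are hypothetical placeholders introduced for this example, not the software described in the paper.

```python
from typing import Iterable, Protocol, Tuple

class LaserPointer(Protocol):
    def aim_at(self, x: float, y: float) -> None: ...            # steer the beam to a ground position
    def switch(self, on: bool) -> None: ...                      # turn the beam on or off

class RobotLink(Protocol):
    def notify_target_indicated(self) -> None: ...               # tell the robot that a new beacon is shown
    def wait_until_reached(self, timeout_s: float) -> bool: ...  # block until the robot confirms arrival

def guide_along(route: Iterable[Tuple[float, float]],
                laser: LaserPointer,
                robot: RobotLink,
                timeout_s: float = 30.0) -> bool:
    """Indicate the planned route one beacon at a time; show the next target position
    only after the robot has confirmed reaching the proximity of the current one."""
    for x, y in route:
        laser.aim_at(x, y)             # project the beacon onto the ground
        laser.switch(True)
        robot.notify_target_indicated()
        if not robot.wait_until_reached(timeout_s):
            laser.switch(False)        # no confirmation within the timeout: stop guidance
            return False
    laser.switch(False)
    return True
```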
