A Visual-Servoing Scheme for Semi-Autonomous Operation of an Underwater Robotic Vehicle Using an IMU and a Laser Vision System

2010 IEEE International Conference on Robotics and Automation, Anchorage Convention District, May 3-8, 2010, Anchorage, Alaska, USA

George C. Karras, Savvas G. Loizou and Kostas J. Kyriakopoulos

G.C. Karras and K.J. Kyriakopoulos are with the Control Systems Lab, School of Mechanical Engineering, National Technical University of Athens, karrasg, [email protected]. S.G. Loizou is with the School of Mechanical Engineering, Frederick University, Cyprus, [email protected].

Abstract— This paper presents a visual servoing control scheme that is applied to an underwater robotic vehicle. The objective of the proposed control methodology is to provide a human operator the capability to move the vehicle without losing the target from the vision system's field of view. On-line estimation of the vehicle states is achieved by fusing data from a Laser Vision System (LVS) and an Inertial Measurement Unit (IMU) using an asynchronous Unscented Kalman Filter (UKF). A controller designed at the kinematic level is backstepped into the dynamics of the system, maintaining its analytical stability guarantees. It is shown that the under-actuated degree of freedom is input-to-state stable, and an energy-based shaping of the user input with stability guarantees is implemented. The resulting control scheme has analytically guaranteed stability and convergence properties, while its applicability and performance are experimentally verified using a small Remotely Operated Vehicle (ROV) in a test tank.

I. INTRODUCTION

Underwater vehicles usually operate in circumstances demanding dexterous operations and delicate motions, such as the inspection of ship hulls, propulsion systems or other underwater structures. In most of these cases human intervention is essential for the mission success and the safety of the vehicle. Thus, semi-autonomous operation is the control mode of choice whenever an operation entails challenging inspection and survey tasks. Depending on the mission's requirements, different sets of sensor suites are utilized for the robot's state estimation and environmental perception.

The vehicle's on-board camera and the Inertial Measurement Unit (IMU) stand out as particularly useful sensors for underwater inspection tasks. On the one hand, the camera provides information about the vehicle's surrounding workspace, while on the other hand the IMU provides 3D linear accelerations and angular velocities.

In a typical direct teleoperation scenario, the operator, based on the camera video feed, navigates the robot towards the inspection area and stabilizes or hovers the vehicle around a target of interest - usually a damaged area. Keeping this target inside the field of view is an essential but rather tricky undertaking. The operator must perform delicate moves and accurate manoeuvres while dealing with strong currents and waves, and must also compensate for the ROV's tether.

Additionally, teleoperation becomes even more difficult considering that most underwater vehicles (especially some small-class ROVs) suffer from kinematic constraints due to under-actuation along their sway axis. Hovering around a target can be accomplished only by fast and complex combinations of linear and angular velocity command inputs. This kind of teleoperation causes the target to oscillate inside the image frame, while in many cases the pilot fails to keep the target inside the camera's optical field. The result is a poor-quality inspection video, and the mission has to be repeated several times to come up with a satisfactory result. A solution to this problem can be provided by implementing a semi-autonomous control scheme on the underwater robot.

The problem of keeping the target inside the field of view has been examined in the past for robotic manipulators [1], cartesian robots [2], differential drive mobile robots [3], and underwater vehicles [4]. Some interesting work has also been done in [5], but all the above cases are mainly based on kinematic control laws or path planning techniques which do not incorporate the (nonlinear) dynamics of the system and their effect on the camera field of view.

In this paper a new switching visual servo control scheme is designed for semi-autonomous operation of an underwater vehicle that is under-actuated along the sway axis. The proposed controller imposes a bounded trajectory around the center of a target, while guaranteeing that the target remains inside the camera's optical field. The design of the controller is based on feed-forwarding the dynamics of the system and back-stepping a Lyapunov-based kinematic controller [6]. The complete state vector of the vehicle is obtained by asynchronously fusing data from a Laser Vision System (LVS) and an Inertial Measurement Unit (IMU) using an Unscented Kalman Filter (UKF). The human operator performs hovering tasks by simply providing high-level commands by means of joystick lateral inputs; the difficult part of performing the necessary manoeuvres is left to the controller. In addition to the provided analytical guarantees, the methodology is assessed by a number of experiments that were carried out using an under-actuated 3 DOF ROV.
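As a rough illustration of the closed-loop architecture just described (asynchronous LVS/IMU fusion feeding a backstepped visual-servoing controller driven by shaped joystick input), the following Python skeleton shows one possible way such a loop could be organised. All names and interfaces here are assumptions made for illustration only; they are not the authors' implementation.

```python
# Hypothetical outline of the semi-autonomous loop; every interface below
# (estimator, controllers, sensor handles) is assumed for illustration only.
def control_loop(estimator, kinematic_ctrl, backstepping_ctrl, thrusters,
                 imu, lvs, joystick, dt):
    while True:
        # The IMU arrives at a high fixed rate; the LVS only delivers a
        # measurement when a camera frame with a tracked target is available,
        # hence the asynchronous fusion.
        estimator.predict(dt)
        estimator.update_imu(imu.read())
        frame = lvs.poll()
        if frame is not None:
            estimator.update_lvs(frame)

        state = estimator.state()                       # pose + body velocities
        v_ref = kinematic_ctrl(state, joystick.read())  # shaped operator input
        tau = backstepping_ctrl(state, v_ref)           # surge, heave, yaw commands
        thrusters.apply(tau)
```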
The rest of this paper is organized as follows: Section II gives an overview of the robot's kinematic and dynamic equations, Section III describes the state estimation using the LVS and the IMU, Section IV describes the control methodology, and Section V illustrates the efficiency of our approach through an experimental procedure. Finally, Section VI concludes the paper.

II. PRELIMINARIES

Generally, an underwater vehicle is considered as a 6 DOF free body with position and Euler angle vector \eta = [x \; y \; z \; \phi \; \theta \; \psi]^T. The body velocity vector is defined as v = [u \; \upsilon \; w \; p \; q \; r]^T, where the components are named according to SNAME as surge, sway, heave, roll, pitch and yaw respectively. The vector of forces and moments acting on the body-fixed frame is defined as \tau = [X \; Y \; Z \; K \; M \; N]^T. The general form of the dynamics of an underwater vehicle, expressed in the body-fixed frame, is given in matrix form by the equations below [7]:

    M\dot{v} + C(v)v + D(v)v + g(\eta) = \tau
    \dot{\eta} = J(\eta)v                                            (1)

where M = M_{RB} + M_A is the inertia matrix for rigid body and added mass respectively, C(v) = C_{RB}(v) + C_A(v) is the Coriolis and centripetal matrix for rigid body and added mass respectively, D(v) = D_{quad}(v) + D_{lin}(v) is the quadratic and linear drag matrix respectively, g(\eta) is the hydrostatic restoring force vector, \tau is the thruster input vector and J(\eta) is the Jacobian matrix transforming the velocities from the body-fixed to the earth-fixed frame.

The vehicle used in this work is a 3 DOF VideoRay Pro ROV. It is equipped with three thrusters, which are effective only in surge, heave and yaw motion, meaning that the vehicle is under-actuated along the sway axis. The angles \phi, \theta and the angular velocities p and q are negligible and we can consider them to be equal to zero. The ROV is symmetric about the x-z plane and close to symmetric about the y-z plane. Therefore, we can safely assume that motions in heave, roll and pitch are decoupled [7]. However, in this paper we will be considering the coupling between surge, sway and yaw, which affects the surge and sway motions and is important for our task. Although the vehicle is not symmetric about the x-y plane, heave motion can be considered decoupled from surge and sway because the vehicle operates at relatively low speeds, where coupling effects are considered negligible. Due to the above assumptions, the kinematic and dynamic model of the vehicle is given by the equations below:

    \dot{\bar{\eta}} = \bar{J}(\psi)\,\bar{v}                        (2)

    m_{11}\dot{u} = -m_{22}\upsilon r + X_u u + X_{u|u|}u|u| + X
    m_{22}\dot{\upsilon} = m_{11}u r + Y_{\upsilon}\upsilon + Y_{\upsilon|\upsilon|}\upsilon|\upsilon|
    m_{33}\dot{w} = Z_w w + Z_{w|w|}w|w| + Z
    J\dot{r} = N_r r + N_{r|r|}r|r| + N                              (3)
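To make the reduced model (2)-(3) concrete, the sketch below integrates it in Python. The hydrodynamic coefficients are placeholder values chosen only for illustration (they are not the identified parameters of the VideoRay Pro), and the state ordering and function names are assumptions.

```python
import numpy as np

# Placeholder hydrodynamic parameters (illustrative only, not the identified
# values of the VideoRay Pro).
m11, m22, m33, Jz = 5.0, 6.0, 5.5, 0.1   # inertia incl. added mass (Jz plays the role of J in eq. (3))
Xu, Xuu = -1.0, -2.5                     # surge linear / quadratic drag
Yv, Yvv = -1.5, -3.0                     # sway drag
Zw, Zww = -1.2, -2.8                     # heave drag
Nr, Nrr = -0.05, -0.10                   # yaw drag

def body_dynamics(nu, tau):
    """Right-hand side of eq. (3); nu = [u, v, w, r], tau = [X, Z, N] (no sway actuation)."""
    u, v, w, r = nu
    X, Z, N = tau
    du = (-m22 * v * r + Xu * u + Xuu * u * abs(u) + X) / m11
    dv = ( m11 * u * r + Yv * v + Yvv * v * abs(v)) / m22
    dw = ( Zw * w + Zww * w * abs(w) + Z) / m33
    dr = ( Nr * r + Nrr * r * abs(r) + N) / Jz
    return np.array([du, dv, dw, dr])

def kinematics(eta_bar, nu):
    """Eq. (2): earth-fixed rates of eta_bar = [x, y, z, psi] for zero roll/pitch."""
    u, v, w, r = nu
    psi = eta_bar[3]
    dx = u * np.cos(psi) - v * np.sin(psi)
    dy = u * np.sin(psi) + v * np.cos(psi)
    return np.array([dx, dy, w, r])
```

A forward-Euler step such as `nu += dt * body_dynamics(nu, tau)` is enough for a quick simulation of the hovering behaviour; the paper itself does not rely on any particular discretization.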
III. STATE ESTIMATION - UKF

As mentioned before, the complete state vector of the vehicle is estimated by fusing data from the LVS and an IMU using a UKF. The LVS consists of a CCD camera and two laser pointers which are parallel to the camera axis. The LVS calculates the pose vector of the vehicle with respect to the center of a target which lies on the image plane. The target center and borderline are tracked using the Active Contours (Snakes) computer vision algorithm [8], which is implemented in the system software. Note that the center of the Snake in the image space (u_{tc}, v_{tc}) coincides with the center of the target (see figure 1).

Fig. 1. Active Contours application.

The sensor model for the LVS is of the form:

    [u_{tc} \; v_{tc} \; L_1 \; L_2]^T = h^{\alpha}(\bar{\eta}, w^{\alpha})        (4)

where L_1, L_2 are the ranges of each laser pointer from the surface on which the target is located, and w^{\alpha} is zero mean white noise with covariance matrix R^{\alpha}. The LVS has been successfully used in previous works [4], [9], [6]. The analytical expression of eq. (4) and a more detailed description of the LVS can be found in [9].

The IMU used in this system (XSENS MTi) provides 3D linear accelerations, 3D rate of turn and 3D orientation (Euler angles). The IMU weighs only 50 gr and is placed at the mass center of the vehicle, aligned with its axes. The sensor model for the IMU is of the form:

    [\hat{\psi} \; \hat{r} \; \hat{a}_x \; \hat{a}_y \; \hat{a}_z]^T = h^{\beta}([\psi \; r \; a_x \; a_y \; a_z]^T, w^{\beta})        (5)

where w^{\beta} is zero mean white noise with covariance matrix R^{\beta}.
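The following minimal Python sketch shows how an unscented measurement update can be applied asynchronously, with each sensor triggering its own correction when data arrives. The measurement functions below are crude stand-ins: the true h^\alpha is the camera/laser projection model derived in [9], and h^\beta includes the accelerometer channels of eq. (5). The state ordering, the focal length and the simplified geometry are all assumptions made for illustration, and the process-model prediction step between updates is omitted.

```python
import numpy as np

def sigma_points(x, P, kappa=1.0):
    """Unscented sigma points and weights for mean x and covariance P."""
    n = len(x)
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def ukf_update(x, P, z, h, R, kappa=1.0):
    """Generic UKF measurement update for a sensor model z = h(x) + noise."""
    pts, w = sigma_points(x, P, kappa)
    Z = np.array([h(p) for p in pts])
    z_hat = w @ Z
    dZ = Z - z_hat
    dX = pts - x
    S = dZ.T @ (w[:, None] * dZ) + R      # innovation covariance
    C = dX.T @ (w[:, None] * dZ)          # state-measurement cross covariance
    K = C @ np.linalg.inv(S)              # Kalman gain
    return x + K @ (z - z_hat), P - K @ S @ K.T

# Assumed state ordering for this sketch: [x, y, z, psi, u, v, w, r].
def h_imu(s):
    # Simplified stand-in for h^beta of eq. (5): yaw and yaw rate only.
    return np.array([s[3], s[7]])

def h_lvs(s):
    # Crude stand-in for h^alpha of eq. (4); the actual projection and
    # laser-range model is given in [9].
    f = 800.0                              # assumed focal length in pixels
    rng = max(s[0], 1e-3)                  # assumed range to the target plane
    return np.array([f * s[1] / rng, f * s[2] / rng, rng, rng])

# Asynchronous fusion: whichever sensor reports next performs its own update.
def on_imu(x, P, z_imu, R_imu):
    return ukf_update(x, P, z_imu, h_imu, R_imu)

def on_lvs(x, P, z_lvs, R_lvs):
    return ukf_update(x, P, z_lvs, h_lvs, R_lvs)
```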
