Development of an Underwater Vision Sensor for 3D Reef Mapping

Andrew Hogue and Michael Jenkin
Department of Computer Science and Engineering and the Centre for Vision Research
York University, Toronto, Ontario, Canada
{hogue, jenkin}@cse.yorku.ca

Abstract— Coral reef health is an indicator of global climate change, and coral reefs themselves are important for sheltering fish and other aquatic life. Monitoring reefs is a time-consuming and potentially dangerous task, and as a consequence autonomous robotic mapping and surveillance is desired. This paper describes an underwater vision-based sensor to aid in this task. Underwater environments present many challenges for vision-based sensors and robotic vehicles. Lighting is highly variable, optical snow/particulate matter can confound traditional noise models, the environment lacks visual structure, and limited communication between autonomous agents including divers and surface support exacerbates the potentially dangerous environment. We describe experiments with our multi-camera stereo reconstruction algorithm geared towards coral reef monitoring. The sensor is used to estimate volumetric scene structure while simultaneously estimating sensor ego-motion. Preliminary field trials indicate the utility of the sensor for 3D reef monitoring, and results of a land-based evaluation of the sensor are shown to assess the accuracy of the system.

I. INTRODUCTION

Small changes in climate can produce devastating results for the world's coral reef population¹. For example, an increase in ocean temperatures of only a few degrees destroyed most of the coral in Okinawa in 1998 [1]. Coral reefs are thus an excellent indicator of global climate change. Monitoring reefs, however, can be a very labor-intensive task. An international organization called Reef Check² was established in 1997 to frequently and accurately monitor coral reef environments. The methods used by Reef Check rely on hundreds of volunteer divers to identify aquatic species over 100m transects. This task is time-consuming, error-prone, and potentially dangerous for the divers.

Advances in computer vision and robotic technology can be used to aid divers in monitoring tasks performed by organizations such as Reef Check. The development of an autonomous robot to survey a particular reef area or transect would be of enormous value. After a reef section has been selected for monitoring, the robot could be placed near the reef and could then travel autonomously along the designated transect, collecting data for later analysis. Algorithms such as [2] or [3] could then be used for the automatic classification of coral reefs. The raw data stream could be stored for later viewing by a human operator to identify and categorize aquatic life. Alternatively, automatic algorithms could be developed for fish classification in a similar vein to the coral reef classifiers noted above. After the data has been analyzed, the robot could be re-deployed over the same site to enable long-term monitoring of a particular coral reef. An autonomous tool such as this would facilitate a Reef Check paradigm that is much safer to deploy and would provide a bias-free and less dangerous long-term monitoring solution.

To this end, this paper describes an underwater sensor capable of generating 3D models of underwater coral reefs. Our sensor is designed to be deployed and operated by a single diver, and ongoing research is investigating full integration of the sensor with AQUA [4], an amphibious underwater robot (see Figure 1). The sensor is capable of collecting high-resolution video data, generating accurate three-dimensional models of the environment, and estimating the trajectory of the sensor as it travels. The algorithm developed is primarily used offline to extract highly accurate 3D models; however, if low-resolution images are used, the system operates at 7 fps on off-the-shelf hardware.

The underwater environment presents numerous challenges for the design of robotic systems and vision-based algorithms. Yet it is these constraints and challenges that make this environment almost ideal for the development and evaluation of robotic and sensing technologies. Vehicles operating in this environment must cope with the potentially unpredictable six-degree-of-freedom (6DOF) motion of the vehicle. Currents, surf and swell can produce unwanted and unexpected vehicle motion.

A natural choice of sensing for an aquatic vehicle is to use cameras and computer vision techniques to aid in navigation and trajectory reconstruction. Unfortunately, a host of problems plague underwater computer vision techniques. For example, the turbidity of the water caused by floating sedimentation ("aquatic snow") and other floating debris can cause problems for standard techniques for visual understanding of the world. The optical transmittance of water varies with wavelength, and dynamic lighting conditions can make it difficult to reliably track visual features over time. Dynamic objects such as fish and fan coral can create spurious measurements influencing pose estimation between frames. These challenges have prompted recent research in robotic vehicle design, sensing, localization and mapping for underwater vehicles (cf. [5], [6], [7], [8]).

In the terrestrial domain, sensing technologies such as stereo vision coupled with good vehicle odometry have been used to construct 3D environmental models. Obtaining such odometry information can be difficult without merging sensor data over the vehicle's trajectory, and as a result there is a long history of research in simultaneous localization and mapping (SLAM) (cf. [9], [10], [11], [12]) for robotic vehicles. Terrestrial SLAM algorithms often assume a predictable vehicle odometry model to assist in the probabilistic association of sensor information with environment features. The lack of such predictable vehicle odometry underwater necessitates solutions that are more dependent upon sensor information than is traditional in the terrestrial domain. The algorithm presented here utilizes passive stereo vision to obtain local depth estimates and uses temporal feature tracking information to estimate the vehicle trajectory. Experimental results obtained with the algorithm during recent sea trials illustrate the effectiveness of the approach but are difficult to evaluate objectively. Land-based reconstructions conducted using the same algorithm and hardware are therefore presented to evaluate the accuracy of the system against ground-truth trajectory data.

¹ http://www.marinebiology.org/coralbleaching.htm
² http://www.reefcheck.org

II. THE AQUA ROBOT

The algorithm described here is designed to operate under manual operation and also as a component of the AQUA robot (see [4], [13], [14], [15] and Figure 1). AQUA is a six-legged amphibious robot based on a terrestrial hexapod named RHex [16]. RHex was the product of a research collaboration between the Ambulatory Robotics Lab at McGill, the University of Michigan, the University of California at Berkeley and Carnegie Mellon University. AQUA differs from RHex in its ability to swim. AQUA has been designed specifically for amphibious operation and is capable of walking on land, swimming in open water, station keeping at depths to 15m, and crawling along the bottom of the ocean. The vehicle deviates from traditional ROVs in that it uses legs for locomotion. When swimming, the legs act as paddles to displace the water, and on land the legs allow AQUA to walk on grass, gravel, sand, and snow. The paddle configuration gives the robot direct control over five of the six degrees of freedom: surge, heave, pitch, roll and yaw. An inclinometer, forward- and rear-facing video cameras, and an on-board compass are used to assist in the control of the robot's motion.

Fig. 1. The AQUA Robot (a) swimming (b) walking with amphibious legs.
Fig. 2. Trinocular stereo rig and example reference images with corresponding dense disparity.

A. The AQUASENSOR

We have developed three versions of a standalone sensor package (see Figures 2 and 3) that is to be integrated into the AQUA vehicle in the future. The configuration and limitations of each version of the sensor are discussed briefly here; the reader is referred to [17] for further details. AQUASENSOR V1.0 consists of three fully calibrated Firewire (IEEE1394a) cameras in an L-shaped trinocular configuration used to extract dense depth information from the underwater environment. It also integrates a 6DOF inertial measurement unit (Crossbow IMU-300CC) to maintain the relative orientation and position of the unit as it moves, and the necessary electronics to drive the system. The cameras and inertial unit are tethered to a base computer system that is operated on land or on a boat via a custom-built 62m fibre-optic cable. The cameras capture 640x480 resolution gray-scale images at 30 frames per second, and all three video streams are saved for later analysis. Prior to operation, the intrinsic and extrinsic camera parameters are estimated using Zhang's camera calibration algorithm [18] and an underwater calibration target. AQUASENSOR V1.0 was field tested in Holetown, Barbados, and hundreds of gigabytes of trinocular and synchronized IMU data have been captured using the device.

Cable management and mobility were issues in the field with this version of the sensor. This initial version required significant surface support personnel to manage the long tether, two divers were needed to orient and move the sensor housing, and weather conditions were problematic for the base unit and operator.

Building on our previous design with new mobility constraints in mind, we re-designed the entire sensor to accommodate a more mobile solution that a single diver could deploy and operate. For AQUASENSOR V2.0 and V2.1 we adopted a Point Grey³ Bumblebee™ to capture colour stereo images, which had the added benefit of a smaller footprint than our previous camera system. Due to the smaller size of the Bumblebee and the adoption of a smaller inertial unit (a 3DOF InertiaCube3™ from Intersense⁴), we were able to re-design the

³ http://www.ptgrey.com
⁴ http://www.isense.com
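Zhang's calibration procedure referenced above recovers each camera's intrinsic matrix K and the extrinsic pose [R|t] of the rig. As a minimal sketch of what those parameters provide downstream (the focal length and image size below are hypothetical values, not the actual underwater calibration), a pinhole model projects a 3D point into pixel coordinates as x ~ K(RX + t):

```python
import numpy as np

# Hypothetical intrinsics for a 640x480 camera: 800 px focal length,
# principal point at the image centre.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(K, R, t, X):
    """Project 3D points X (3xN, metres) to pixels via x ~ K (R X + t)."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]          # perspective divide

X = np.array([[0.0], [0.0], [2.0]])              # point 2 m straight ahead
uv = project(K, np.eye(3), np.zeros((3, 1)), X)  # identity pose
print(uv.ravel())                                # prints [320. 240.] — the principal point
```

A point on the optical axis lands on the principal point regardless of depth, which is a convenient sanity check on a freshly estimated calibration.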
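The dense disparity images shown in Fig. 2 convert to metric depth once the rig is calibrated: for a rectified pair with focal length f (pixels) and baseline B (metres), depth is Z = fB/d. The sketch below is illustrative only (the minimum-disparity threshold and the numbers are assumptions, not the authors' implementation):

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m, min_disp=0.5):
    """Convert a disparity map (pixels) to metric depth for a rectified
    stereo pair: Z = f * B / d. Disparities below min_disp (e.g. failed
    matches) are marked invalid with NaN."""
    depth = np.full(disparity.shape, np.nan)
    valid = disparity >= min_disp
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Hypothetical rig: f = 800 px, 12 cm baseline; a tiny 2x2 disparity map.
d = np.array([[8.0, 4.0],
              [0.0, 16.0]])     # 0.0 models a pixel where matching failed
print(disparity_to_depth(d, 800.0, 0.12))
```

Note the inverse relationship: small disparities map to large, noisy depths, which is one reason turbid water and low-contrast reef texture degrade reconstruction quality.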
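The trajectory-estimation step pairs stereo depth with temporal feature tracks. One standard way to recover the inter-frame rigid motion from matched 3D points is least-squares alignment via SVD (the Kabsch/Horn method); the sketch below is a generic illustration, not the paper's algorithm, and it assumes outlier-free correspondences, which fish and drifting particulate matter would violate in practice without a robust wrapper such as RANSAC:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares R, t such that Q ≈ R @ P + t, for matched 3xN point
    sets, using the Kabsch/SVD method."""
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic check: rotate 30 degrees about z and translate.
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([[0.5], [-0.2], [1.0]])
P = np.random.default_rng(0).random((3, 20))
Q = R_true @ P + t_true
R, t = rigid_transform(P, Q)
```

Chaining these per-frame transforms yields the sensor trajectory; in practice each estimate carries error, so drift accumulates unless it is corrected by a SLAM back-end of the kind cited above.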
