
Robotic Platforms for Rapid Prototyping and Deployment of Sensor Systems in Support of Orbital Surveys

A. Elfes1, D. Clouse1, M. Powell1, E. A. Kulczycki1, J. L. Hall1, G. Podnar2, J. M. Dolan2

1 Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA, USA (elfes@jpl.nasa.gov / Fax: +1-818-393-5007)
2 Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213

Abstract — Earth science research uses data obtained from space, the atmosphere, the oceans and the Earth's land surfaces to improve understanding of our planet and its natural processes. Orbital assets have so far been the greatest source of the data that goes into atmospheric and oceanic models, as well as into snow or vegetation cover analyses and land utilization studies, to name just a few applications. At the same time, they are limited in terms of temporal/geographical coverage, resolution, cloud cover, and the physical properties measured by the onboard instruments. Crewed survey systems, including aircraft and ocean vessels, are expensive to deploy and have either limited mission times (aircraft) or low survey speeds and restricted sensorial extent (boats), while sensor networks are limited in their sensor reach. Mobile robotic sensing systems are emerging as new platforms for Earth science research. These platforms can be used in two different modes: 1) to complement existing orbital and stationary sensing assets, or 2) to substitute for orbital or stationary assets. As mobile in situ platforms, they can provide data at spatiotemporal resolutions and geographical coverage that complement what can be collected from orbiters or stationary sensor networks. In this paper we review our work on the development of mobile platforms for sensing a changing world, and their potential use to complement or provide an alternative to orbital assets. We discuss the use of autonomous airships for aerial surveys, and the use of autonomous ocean surface boats for in situ surveys triggered by orbital observations. We also discuss the development of software architectures for coordination and control of multiple sensor platforms, and show results obtained from field tests of aerial and ocean vehicles. We conclude with an outlook on the challenges that have to be addressed to fully insert robotic vehicles in an Earth observing system.

TABLE OF CONTENTS

1. INTRODUCTION
2. AUTONOMOUS AEROBOTS
3. THE POTENTIAL OF AIRSHIPS FOR EARTH OBSERVATION
4. ROBOTIC OCEAN BOATS
5. CONCLUSIONS
6. ACKNOWLEDGMENTS
REFERENCES

1. INTRODUCTION

Earth science research uses data obtained from space, the atmosphere, the oceans and the Earth's land surfaces to improve understanding of our planet and its natural processes. Orbital assets have so far been the greatest source of the data that goes into atmospheric and oceanic models, as well as into snow or vegetation cover analyses and land utilization studies, to name just a few applications. However, orbital assets are limited by cloud cover, temporal/geographical coverage and sampling rates, resolution, and the physical properties measured by onboard sensors. Crewed survey systems, including aircraft and ocean vessels, are expensive to deploy and have either limited mission times (aircraft) or low survey speeds and restricted sensorial extent (boats). Weather balloons or ocean buoys are relatively cheap, but cannot actively control their motion to reach specific areas of interest. Sensor networks with geographically stationary nodes can provide persistent (long-term) observation of areas of interest with modest power and bandwidth requirements, at low to moderate complexity and cost. The drawback is that the sensor payload available at each node of a stationary network is limited and configured prior to installation. Furthermore, these networks cannot be easily moved to another area if an event at a different geographical location is to be observed.

Mobile robotic sensing systems are emerging as new platforms for Earth science research [1]. They include unmanned aerial vehicles (UAVs), autonomous water surface vehicles (ASVs), and autonomous underwater vehicles (AUVs). These platforms can be used in two different modes: 1) to complement existing orbital and stationary sensing assets, or 2) to substitute for orbital or stationary assets. In a complementary mode, robotic vehicles are deployed to an area of interest and instrumented with a sensor payload configured for the specific natural and environmental processes to be investigated. As mobile in situ platforms, they can provide data at spatiotemporal resolutions and geographical coverage that complement what can be collected from orbiters or stationary sensor networks.

In a substitutive mode, robotic vehicles (particularly UAVs) can be used to provide an alternative to orbital systems. The desire for an alternative can have several reasons, including: a) an orbital asset has become unavailable (such as the demise of the Orbiting Carbon Observatory); b) a sensor system has to be calibrated on Earth using an aerial vehicle before being sent to space (mimicry); c) the need for alternatives that are cheaper and/or have faster development and deployment times than orbiters. While robotic systems have provided very successful platforms for scientific research elsewhere in the Solar System (such as the Mars Exploration Rovers), it is noteworthy that the use of robot explorers on Earth is still in its infancy.

In this paper we review our work on the development of mobile platforms for sensing a changing world, and their potential use to complement or provide an alternative to orbital assets. We discuss the use of autonomous airships for aerial surveys, and the use of autonomous ocean surface robot boats for in situ surveys triggered by orbital observations. We also discuss the development of software architectures for coordination and control of multiple sensor platforms, and show results obtained from field tests of aerial and ocean vehicles. We conclude with an outlook on the challenges that have to be addressed to fully insert robotic vehicles in an Earth observing system.

978-1-4244-3888-4/10/$25.00 ©2010 IEEE

2. AUTONOMOUS AEROBOTS

Robotic Lighter-Than-Air Vehicles

In addition to Earth, seven other bodies in the Solar System have enough atmosphere to allow aerial exploration: Venus, Mars, Jupiter, Saturn, Uranus, Neptune, and Saturn's moon Titan. NASA has identified aerial vehicles as a strategic new technology for Solar System exploration [2, 3, 4], and has emphasized the development of advanced autonomy technologies as a high-priority area for the operation of aerial probes. The dense atmospheres of Titan and Venus enable the use of buoyant robotic vehicles (aerobots) that can be either self-propelled (airships) or wind-driven (balloons). These vehicles can provide extensive, low-altitude geographical coverage over multi-month time scales with minimal consumption of scarce onboard electrical power [5, 6]. Airships have the scientific advantage of being able to fly to specific locations, while balloons are simpler in their design but limited in their "go-to" capability. Advantages and disadvantages of different aerial vehicle designs for planetary exploration are assessed in [7].

Since 2003 we have been developing the autonomy technologies required for robotic lighter-than-air vehicles (aerobots) [8, 9]. While the initial exploration targets were defined as Titan and Venus, it is clear that aerobots, particularly autonomous airships, can also play a significant role in science data gathering on Earth. This will be exemplified below.

The current prototype JPL aerobot testbed (Fig. 1) has a length of 11 m, a diameter of 2.5 m, a total volume of 34 m3, two 2.3 kW (3 hp) 23 cm3 (1.4 cu in) fuel engines, a maximum speed of 13 m/s (25 kts), a maximum ceiling of 1000 m, an average mission endurance of 60 minutes, a static lift payload of 12 kg ASL, and a dynamic lift payload of up to 16 kg ASL. The avionics and communication systems are installed in the gondola.

Figure 1: The JPL aerobot during autonomous flight tests, conducted at the El Mirage dry lake in the Mojave desert.

The aerobot avionics system is built around a PC-104+ computer architecture. The processor stack has a serial board interfacing to the navigation sensors, a PWM board for reading pulse-width-modulated signals from the human safety pilot and for generating PWM signals based upon control surface commands from the avionics software, and an IEEE 1394 board for sending commands to, and reading image data from, the navigation and science cameras. Wireless serial modems provide data/control telemetry links between the aerobot and the ground station, and additional video transmitters on the aerobot provide downlinks of video imagery to the ground station. The safety pilot can always reassert "pilot override" control over the aerobot.

The navigation sensors currently consist of an IMU (angular rates, linear accelerations), a compass and inclinometer (yaw, roll and pitch angles), and a DGPS (for absolute 3D position). The vision sensors include two down-looking navigation cameras, one with a 360° x 180° field of view and another with a narrower FOV. Additionally, we have a laser altimeter (surface-relative altitude), a barometric altimeter (absolute altitude ASL), and an ultrasonic anemometer (for relative wind field measurements). The ground station is composed of a laptop, a graphical user interface to the vehicle, wireless data and video links, video monitors and VCRs, and a differential GPS (DGPS) base station that provides 3D vehicle position estimates with an accuracy on the order of centimeters. Field tests of the JPL aerobot are conducted in the Mojave desert.

Figure 2. Aerobot autonomy architecture with major subsystems.

Aerobot Autonomy

Some of the core autonomy capabilities required for an aerobot include: vehicle safing, so that the integrity of the aerobot can be ensured over the full duration of the mission and during extended communication blackouts; accurate and robust autonomous flight control, including deployment, long traverses, hovering/station-keeping, and touch-and-go surface sampling; spatial mapping and self-localization; and advanced perceptual hazard and target recognition, tracking and visual servoing, allowing the aerobot to detect and avoid atmospheric and topographic hazards, and also to identify, home in on, and keep station over pre-defined science targets or terrain features.

We are developing an aerobot autonomy architecture that integrates accurate and robust vehicle and flight trajectory control, perception-based state estimation, hazard detection and avoidance, vehicle health monitoring and reflexive safing actions, multi-modal vehicle localization and mapping, autonomous science, and long-range mission planning and monitoring (Fig. 2).

Lower-level functions in the autonomy architecture include sensor and actuator control, vehicle state estimation, flight mode control, supervisory flight control, and flight profile execution. Intermediate-level functions include vehicle health monitoring; failure detection, identification and recovery (FDIR); flight trajectory and profile planning; and image-based navigation. The latter provides GPS-independent localization, as well as local and regional mapping. Higher-level functions include mission planning, resource management, and mission execution and monitoring [9].

The autonomous flight system has a layered supervisory control structure [9]. It oversees the main flight trajectory modes (cruise, hover and loiter), which use ascent, descent, turn and altitude controllers. These in turn command the vehicle attitude and thrust controllers. The first version of the onboard flight autonomy system has been implemented using PID controllers for pitch, yaw, altitude and throttle control. Throttle and thrust vector controllers for hovering operations, to be scheduled in at low airspeeds, are currently under development. We have also initiated work on using the full aerodynamic model and the associated simulation tools to develop a next-generation set of robust controllers, which we expect will provide tighter vehicle control and more accurate trajectory tracking under wind disturbances.

Autonomous Flight Control

At the flight trajectory execution control level of the supervisory control architecture (Fig. 2), we have implemented both waypoint navigation and trajectory following.

Waypoints are specified by the operator, who also sets the satisfying conditions that define when a waypoint is considered reached. The approach used for waypoint navigation is called "orienteering": the control objective is defined in terms of reaching the waypoint, rather than in terms of the deviation from a given trajectory. Fig. 3 shows autonomous waypoint flight control tests conducted at El Mirage. Fig. 3 (top) shows waypoint flight control for a sequence of three waypoints; a waypoint is deemed reached when the aerobot is within a Euclidean distance of 25 m of the GPS waypoint coordinates. Atmospheric conditions were moderate, with wind speeds below 5 knots. Fig. 3 (bottom) shows an autonomous flight test under substantial wind conditions, with gusts up to 10 knots. The aerobot is blown off the direct waypoint route on several occasions when it is in a beam reach (orthogonal) condition relative to the wind direction, and it is occasionally yawed off course when heading into or crossing the wind. Nevertheless, even under these severe wind disturbances, the autonomous flight control system was able to visit all waypoints.

Trajectory following is implemented by minimizing the distance (error) between the aerobot and the given trajectory. A "push-broom" approach is used, in which the aerobot is controlled to follow a goal point that is moved ahead of the vehicle along the trajectory. Fig. 4 shows results from a field test at the Southern California Logistics Airport (SCLA), where the aerobot performed trajectory following of a predefined 3D tilted rectangle.
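The two guidance modes described above can be sketched as follows. This is an illustrative reconstruction, not the flight code: the 25 m acceptance radius comes from the text, while the function names, the lookahead distance, and the polyline representation of the reference trajectory are assumptions.

```python
import math

def waypoint_reached(pos, waypoint, threshold_m=25.0):
    """'Orienteering' check: a waypoint counts as reached when the vehicle
    is within a Euclidean distance threshold of its GPS coordinates."""
    return math.dist(pos, waypoint) <= threshold_m

def pushbroom_goal(pos, trajectory, lookahead_m=20.0):
    """'Push-broom' trajectory following: keep the goal point moving ahead
    of the vehicle along the reference trajectory (a list of points).
    Finds the trajectory vertex nearest the vehicle, then walks a fixed
    arc length forward along the polyline to place the goal."""
    # Index of the trajectory point nearest to the vehicle.
    nearest = min(range(len(trajectory)),
                  key=lambda i: math.dist(pos, trajectory[i]))
    travelled, goal = 0.0, trajectory[-1]  # default: end of trajectory
    for i in range(nearest, len(trajectory) - 1):
        seg = math.dist(trajectory[i], trajectory[i + 1])
        if travelled + seg >= lookahead_m:
            goal = trajectory[i + 1]
            break
        travelled += seg
    return goal
```

Note the difference in control objective: `waypoint_reached` only tests arrival, leaving the path free, while `pushbroom_goal` continually regenerates a target that drags the vehicle along the reference path, which is what penalizes cross-track error.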

Figure 4 – Aerobot trajectory following. The pre-defined trajectory is shown in green, while the flight path followed by the aerobot is shown in dark blue. The average deviation from the reference trajectory was on the order of 6 m, or approximately half the vehicle length.

Another autonomy capability that has been implemented is the ability to identify visual targets for which an appropriate "signature" or filter has been pre-defined, and to switch the aerobot to a loitering flight mode around the target. The loitering control is done using visual servoing: the position of the target is computed in image coordinates, the position of the aerobot is also mapped to image coordinates using IMU data and camera calibration data, and the flight controller sets a reference altitude and performs yaw control in image coordinates. This approach is exemplified in Fig. 5 (top), where the "science target" was a blue tarp; the visually controlled loitering trajectory is shown in Fig. 5 (bottom). It should be noted that the advantage of this approach is its independence from GPS.

Figure 3 – Waypoint flight control. Top: three waypoints have been reached, with an operator-defined threshold of 25 m to the GPS coordinates of the target. Bottom: waypoint flight control under severe wind disturbances; all waypoints were reached. A trajectory segment in red indicates that the vehicle is flying autonomously, while green indicates that the aerobot is under pilot control.

Figure 5 – Aerobot visual flight control around a science target. The top image shows the visual target (a blue tarp) as seen from the wide-field science camera on the aerobot. The bottom image shows the loitering trajectory executed by the aerobot around the target, using visual servoing.

3. THE POTENTIAL OF AIRSHIPS FOR EARTH OBSERVATION

In light of the recent loss of the Orbiting Carbon Observatory (OCO) mission, we performed a feasibility analysis of the use of an airship either to replace a portion of the science objectives of the mission, or perhaps to augment the measurements collected by a refly of the OCO mission. The results of this analysis are instructive in understanding the capabilities and limitations of an airship as a carrier of science instruments.

The Orbiting Carbon Observatory was a satellite whose principal mission was to collect periodic measurements of atmospheric carbon dioxide (CO2) levels at all points on the globe over a period of 2 years [10]. Such a set of measurements would allow the identification of carbon sources and sinks, and allow better modeling of the carbon cycle. It was designed to fly in the A-Train, a line-up of science satellites that includes Aqua, Aura, CloudSat, and CALIPSO. The A-Train flies in a sun-synchronous polar orbit that allows constant afternoon observation, and it returns to the same point above the Earth's surface every 16 days, allowing systematic measurements to be recorded. The principal instrument on OCO was a cryogenically cooled spectrograph, weighing about 150 kilograms, designed to measure two CO2 bands and one O2 band. The instrument cost about $50 million; the cost of the mission was about $380 million. The satellite was lost at launch on 24 February 2009.

An unmanned airship able to carry such an instrument, along with supporting cryogenic equipment and avionics, would be fairly large. Our design indicates that it would be about 50 meters long and 12.5 meters in diameter, with a gross volume of about 4500 m3 of helium. Such an airship would have a payload of about 300 kg, a pressure height of 3050 meters, a maximum flight time of 12 hours, and a cruise speed of 65 km/hour.

Many of the challenges of a mission concept based on such an airship platform are caused by the size of the airship. A hangar able to handle such a craft would need to be taller than a 747 hangar. It is unlikely that such a hangar would be available near remote field locations, so the airship would need to be deflated if high wind conditions were predicted. We calculate it would cost about $50 thousand to fill the airship with helium, so recapturing the helium with a portable helium pressurization system would be necessary. Ground operations (mooring, take-off, landing) would require several football fields of flat space and a 15-meter-tall portable mast for safe mooring, and a ground crew of at least 6 people would be required during take-off and landing operations. The airship industry provides solutions to most of these problems; in comparison to the Goodyear blimp (60 meters in length), our airship is not large.

How would the cost of such a mission using an airship compare with a similar mission using a fixed-wing aircraft? Our analysis indicated that all the scenarios are inexpensive compared to the cost of the OCO spectrograph ($50 million). The airship scenario has high start-up costs that include both the cost of the airship and the cost of the infrastructure necessary to operate it. However, the fuel and maintenance costs, as well as the per-flight-hour costs, of the airship are quite low compared to those of fixed-wing aircraft, either civilian or (even more so) military. Over time, the airship scenario therefore becomes more competitive, eventually breaking even with the cost of the other scenarios.

4. ROBOTIC OCEAN BOATS

A second project that addresses the information gap between orbital assets and manned vehicles is the NOAA OASIS (Ocean-Atmosphere Sensor Integration System) project. The OASIS vessels are long-duration solar-powered autonomous surface vehicles (ASVs), designed for open-ocean operations (Fig. 6).

Figure 6 – OASIS robot boat with its primary sensor systems.

The OASIS platform is approximately 18 feet long and weighs just over 3000 lbs. The vehicle has a payload capacity of 500 lbs, and is designed to be self-righting to ensure survivability in heavy seas. It supports a wide range of communication links, including spread spectrum radio, a cellular phone link, and an Iridium satellite link. Sensors onboard the OASIS platform enable the collection of water salinity and conductivity data, sea surface temperature, and chlorophyll measurements. The forward payload bay provides space for the installation of additional sensors, and includes a water flow-through system with manifolds and a de-bubbling system that simplifies the installation of new sensors. A mast-mounted meteorological station allows the acquisition of atmospheric measurements, including barometric pressure, air temperature, relative humidity, wind speed, and wind direction. OASIS is also equipped with a forward-looking digital imaging system that provides remotely located scientists with images of atmospheric and sea state conditions.

Under separate NASA ESTO (Earth Science Technology Office) funding, we have developed a multi-robot science exploration software architecture and system called the Telesupervised Adaptive Ocean Sensor Fleet (TAOSF). TAOSF supervises and coordinates a group of robotic boats, such as the OASIS platforms, to enable in-situ study of phenomena at the ocean/atmosphere interface, as well as on the ocean surface and sub-surface. One of the key objectives of the TAOSF effort is to enhance the science value of multiple robotic sensing assets by coordinating their operation, adapting their activities in response to sensor observations, and allowing a human operator to oversee multiple assets with minimal effort [11, 12, 13].

The TAOSF architecture (Fig. 7) provides an integrated approach to multi-robot coordination and sliding robot-human autonomy. It allows multiple mobile sensing assets to function in a cooperative fashion, and the operating mode of different vehicles to vary from autonomous control to teleoperated control.

Figure 7 – The TAOSF architecture.

TAOSF supports the following features:

• Sliding autonomy, allowing an operator to control the vehicles by setting high-level goals, such as specifying an area to monitor, by taking direct control of the vehicles via teleoperation, or at other autonomy levels in between.

• Adaptive replanning of the activities of the OASIS vessels based on sensor inputs (dynamic sensing) and coordination between multiple assets, thereby increasing data-gathering effectiveness while reducing the effort required for tasking, control, and monitoring of the vehicles.

• Web-based communications, permitting control and communications over long distances and the sharing of data with remote experts.

• Autonomous hazard and assistance detection, allowing automatic identification of hazards that require human intervention to ensure the safety and integrity of the robotic vehicles, or of science data that require human interpretation and response.

• Onboard science analysis of the acquired data, in order to perform an initial assessment of the presence of specific science signatures of immediate interest.

The initial field application chosen for TAOSF is the characterization of Harmful Algal Blooms (HABs). Since HABs cannot be produced at will, field tests were conducted in the Chincoteague Bay under controlled conditions, using rhodamine dye as a HAB simulant.

The first integrated in-water tests of the TAOSF architecture were conducted in June 2007. The results shown below are from the 21 August test, in which a single OASIS boat was used, an aerial observation platform was deployed to provide an overall view of the test, and rhodamine dye tracks were laid by a manned chase boat. The aerostat platform was also tethered to the chase boat (Fig. 8).

Figure 8 – TAOSF test area viewed from the aerostat observation platform. The chase boat (and aerostat tether) is in the upper part of the image, the OASIS boat in the lower part. The rhodamine dye tracks used as HAB simulants and laid by the chase boat are evident in the right part of the image.

The TAOSF interface displays the boat trajectory, telemetry and position within a satellite map (Figs. 9, 10). Additionally, it allows the overlay of remote sensing satellite imagery on the OASIS field test area (Fig. 9) to provide additional information to human operators. In future work, we plan to use this satellite imagery to provide target areas for subsequent surveys by the OASIS boats.
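The adaptive replanning behavior, in which sensor observations are turned into new survey targets, can be illustrated with a minimal sketch. This is not the TAOSF implementation: the threshold value, the data layout, and the function names are assumptions made for the example.

```python
# Minimal sketch of adaptive sensing: samples whose dye concentration
# exceeds a threshold become survey targets, and available boats are
# assigned to the strongest detections first. All names and values here
# are illustrative, not taken from the TAOSF code.

RHODAMINE_THRESHOLD_PPB = 10.0  # assumed detection threshold

def detections(samples, threshold=RHODAMINE_THRESHOLD_PPB):
    """Each sample is (lat, lon, concentration_ppb); return the positions
    of samples whose concentration exceeds the threshold."""
    return [(lat, lon) for lat, lon, c in samples if c > threshold]

def replan(boats, samples, threshold=RHODAMINE_THRESHOLD_PPB):
    """Assign each available boat to one of the strongest detections."""
    targets = sorted(samples, key=lambda s: s[2], reverse=True)
    plan = {}
    for boat, (lat, lon, c) in zip(boats, targets):
        if c > threshold:
            plan[boat] = (lat, lon)
    return plan
```

A human supervisor would sit above such a loop, approving or overriding the generated plan, which is the "sliding autonomy" aspect of the architecture.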

A spiral search pattern was planned and executed by TAOSF (Figs. 10 and 11). The trajectories of both the OASIS boat and the aerostat observation platform are shown in Fig. 10, and the measured rhodamine dye concentrations along the search path in Fig. 11.
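A spiral search of this kind can be generated as a sequence of waypoints. The following is a generic Archimedean-spiral sketch, not the TAOSF planner; the loop spacing and sampling density are assumed parameters.

```python
import math

def spiral_waypoints(center_x, center_y, spacing_m=10.0,
                     turns=4, pts_per_turn=36):
    """Generate waypoints on an Archimedean spiral r = spacing * theta / (2*pi),
    so that successive loops are 'spacing_m' apart, giving roughly uniform
    coverage of the area around the center point."""
    waypoints = []
    for k in range(turns * pts_per_turn + 1):
        theta = 2 * math.pi * k / pts_per_turn
        r = spacing_m * theta / (2 * math.pi)
        waypoints.append((center_x + r * math.cos(theta),
                          center_y + r * math.sin(theta)))
    return waypoints
```

Choosing the loop spacing on the order of the sensor's cross-track footprint ensures that consecutive loops of the spiral do not leave unsampled gaps.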

Figure 11 – Rhodamine dye concentrations measured by the OASIS boat during the August 21 field test.

Because of their in-situ observation capabilities and resolution, as well as their adaptivity, telesupervised autonomous surface vehicles are crucial to the sensor web for Earth science. The telesupervision architecture underlying TAOSF is broadly applicable to a wide variety of domains beyond HABs, including ecological forecasting, water management, carbon management, disaster management, coastal management, homeland security, and planetary exploration.

Figure 9 – TAOSF operator interface showing the field test area with a satellite overlay.

5. CONCLUSIONS

Mobile in situ platforms can provide data at spatiotemporal resolutions and geographical coverage that complement what can be collected from orbiters or stationary sensor networks. In this paper we reviewed our work on the development of mobile platforms for sensing a changing world, discussing the use of autonomous airships for aerial surveys and the use of autonomous ocean surface robot boats for in situ surveys. We also discussed the development of software architectures for coordination and control of multiple sensor platforms, and showed results obtained from field tests of aerial and ocean vehicles.

6. ACKNOWLEDGMENTS

This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

Figure 10 – TAOSF operator interface showing the GPS-measured track of the OASIS platform, overlaid on a satellite image of the August 21 field test area.

REFERENCES

[1] A. Elfes, G. W. Podnar, R. F. Tavares Filho, and A. Pavani Filho. “Inference Grids for Environmental Mapping and Mission Planning of Autonomous Mobile Environmental ”. In Proceedings of the Workshop Sensing a Changing World 2008, L. Kooistra and A. Ligtenberg (eds.), Wageningen University, Wageningen, The Netherlands, November 2008.

[2] NASA. “Solar System Exploration”. Technical Report, NASA, Washington, DC, 2003.

[3] NASA. “2006 Solar System Exploration Strategic Roadmap”. Technical Report, NASA, Washington, DC, 2006.

[4] K. Reh et al. “Titan Saturn System Mission Study”. NASA, 2009. Available at http://opfm.jpl.nasa.gov/titansaturnsystemmissiontssm.

[5] J. Cutts, P. Beauchamp, A. Elfes, J. L. Hall, T. Johnson, J. Jones, V. Kerzhanovich, A. Yavrouian, and W. Zimmerman. “Scientific Ballooning at the Planets”. In Proceedings of the 2004 COSPAR Conference, Paris, France, July 2004.

[6] J. L. Hall, V. V. Kerzhanovich, A. H. Yavrouian, J. A. Jones, C. V. White, B. A. Dudik, G. A. Plett, J. Mennella, and A. Elfes. “An Aerobot for Global In Situ Exploration of Titan”. 35th COSPAR Scientific Assembly, Paris, France, July 20-24, 2004.

[7] A. Elfes et al. “Robotic Airships for Exploration of Planetary Bodies with an Atmosphere: Autonomy Challenges”. Journal of Autonomous Robots, v. 14, n. 2/3, Kluwer Academic Publishers, The Netherlands, 2003.

[8] A. Elfes, J. L. Hall, J. F. Montgomery, C. F. Bergh, and B. A. Dudik. “Towards a Substantially Autonomous Aerobot for Exploration of Titan”. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2004), Las Vegas, May 2004.

[9] A. Elfes, J. L. Hall, E. A. Kulczycki, D. S. Clouse, A. C. Morfopoulos, J. F. Montgomery, J. M. Cameron, A. Ansar, and R. J. Machuzak. “An Autonomy Architecture for Aerobot Exploration of the Saturnian Moon Titan”. In Proceedings of the 2008 IEEE Aerospace Conference, Big Sky, MT, March 2008.

[10] T. R. Livermore and D. Crisp. “The NASA Orbiting Carbon Observatory (OCO) Mission”. In Proceedings of the 2008 IEEE Aerospace Conference, Big Sky, MT, 2008.

[11] A. Elfes, G. W. Podnar, J. M. Dolan, S. Stancliff, E. Lin, J. C. Hosler, T. J. Ames, J. Higinbotham, J. R. Moisan, T. A. Moisan, and E. A. Kulczycki. “The Telesupervised Adaptive Ocean Sensor Fleet (TAOSF) Architecture: Coordination of Multiple Oceanic Robot Boats”. In Proceedings of the 2008 IEEE Aerospace Conference, Big Sky, MT, March 2008.

[12] G. W. Podnar, J. M. Dolan, A. Elfes, S. Stancliff, E. Lin, J. C. Hosler, T. J. Ames, J. Moisan, T. A. Moisan, J. Higinbotham, and E. A. Kulczycki. “Operation of Robotic Science Boats Using the Telesupervised Adaptive Ocean Sensor Fleet System”. In Proceedings of the 2008 IEEE International Conference on Robotics and Automation (ICRA 2008), Pasadena, CA, May 2008.

[13] K. H. Low, G. Podnar, S. Stancliff, J. M. Dolan, and A. Elfes. “Robot Boats as a Mobile Aquatic Sensor Network”. In Proceedings of the Workshop on Sensor Networks for Earth and Space Science Applications (ESSA 2009), San Francisco, CA, April 2009.