<<

AUTONOMOUS AMPHIBIOUS ROBOT NAVIGATION THROUGH THE LITTORAL ZONE

by

Mark Borg

A thesis submitted to the School of Graduate and Postdoctoral Studies in partial fulfillment of the requirements for the degree of

PhD in Mechanical Engineering

The Faculty of Engineering and Applied Science Mechanical Engineering

University of Ontario Institute of Technology (Ontario Tech University)

Oshawa, Ontario, Canada March 2021

© Mark Borg, 2021

THESIS EXAMINATION INFORMATION

Submitted by: Mark Borg

PhD in Mechanical Engineering

Thesis title: AUTONOMOUS AMPHIBIOUS ROBOT NAVIGATION THROUGH THE LITTORAL ZONE

An oral defense of this thesis took place on March 4, 2021 in front of the following examining committee:

Examining Committee:

Chair of Examining Committee Dr. Amirkianoosh Kiani

Research Supervisor Dr. Scott Nokleby

Examining Committee Member Dr. Remon Pop-Iliev

Examining Committee Member Dr. Haoxiang Lang

University Examiner Dr. Jing Ren

External Examiner Dr. Brad Buckham, University of Victoria

The above committee determined that the thesis is acceptable in form and content and that a satisfactory knowledge of the field covered by the thesis was demonstrated by the candidate during an oral examination. A signed copy of the Certificate of Approval is available from the School of Graduate and Postdoctoral Studies.


ABSTRACT

The majority of autonomous robotic research is performed on aerial and land-based robots.

Robots that operate above or below the water are less prevalent in the research. This is due to the complexity that water introduces to a robot's environment. An outdoor, natural setting that includes water will present multiple, fluctuating variables to the robot. These variables can include, but are not limited to, temperature, height, location, amount of flotation, causticity, and clarity. An amphibious robot must have a sensor suite that is able to give an understanding of the robot's surroundings to allow the navigation algorithm to autonomously plan an obstacle-free route on land and through the water while adapting to the fluctuation of the above-mentioned variables.

The area close to and including the shoreline is called the littoral zone and this area is notoriously complex to navigate. An approach for generating an obstacle-free path through the littoral zone is presented. Utilizing the Robot Operating System (ROS) framework, an algorithm to achieve autonomous navigation through the littoral zone is developed. The algorithm produces an obstacle-free path for an amphibious robot to navigate through the littoral zone, one of the most difficult environments for a robot. The Autonomous Amphibious Robot (AAR) was designed and built to test the proposed algorithm. Test results for the AAR on both land and in the littoral zone verify the methodology. The proposed algorithm could be used on a variety of amphibious robots.

Keywords: autonomous; amphibious; littoral; robot; algorithm


AUTHOR’S DECLARATION

I hereby declare that this thesis consists of original work which I have authored. This is a true copy of the thesis, including any required final revisions, as accepted by my examiners.

I authorize the University of Ontario Institute of Technology (Ontario Tech University) to lend this thesis to other institutions or individuals for the purpose of scholarly research. I further authorize the University of Ontario Institute of Technology (Ontario Tech University) to reproduce this thesis by photocopying or by other means, in total or in part, at the request of other institutions or individuals for the purpose of scholarly research. I understand that my thesis will be made electronically available to the public.

Mark Borg


STATEMENT OF CONTRIBUTIONS

Part of the work described in Chapters 2, 3, and 4 has been published as:

Lau, B., Puccini, L., Dobrescu, O., and Mezil, A., 2014/2015 UOIT Capstone Design Project Report, 2015.

Borg, M., Nokleby, S. B., Puccini, L., Mezil, A., Lau, B., Dobrescu, O., Idris, J., Chan, A., Cain, C., Long, K., Thaker, K., and Lam, K., 2016, “Design, Development, and Preliminary Testing of an Autonomous Amphibious Robot,” in Proceedings of the 2016 CSME International Congress, June 26-29, Kelowna, Canada, 6 pages.

Borg, M. and Nokleby, S. B., 2019, “Proposed Algorithm for Autonomous Navigation in the Littoral Zone for Amphibious Robots,” in Proceedings of the 2019 CCToMM Symposium on Mechanisms, Machines, and Mechatronics, May 16-17, Montreal, Canada, 11 pages.

I performed the majority of the work and writing of the two conference manuscripts.


ACKNOWLEDGEMENTS

I would like to thank my supervisor Dr. Scott Nokleby, for his ongoing support and guidance in the completion of this thesis.

Thanks to my wife for her support over the 26 years it took me to reach this goal.

Thanks to my children Marin and Reid for their support.

Thanks to my mother, who gave me her interest in science, and to my late father, who taught me to use logic to problem solve.

Thanks to my MARS lab mates, especially Florentin, for his ability to teach an old dog new tricks.

And thanks to General Motors for the financial support to complete this educational journey.


Table of Contents

Thesis Examination Information ii

Abstract iii

Author’s Declaration iv

Statement of Contributions v

Acknowledgements vi

Table of Contents vii

List of Figures x

Glossary xiii

1 Introduction 1

1.1 Contributions ...... 2

1.2 Outline ...... 3

2 Literature Review 4

2.1 Modes of Locomotion ...... 8

2.2 Navigation of Amphibious Robots ...... 13

2.3 Sensing Systems ...... 19

2.4 Autonomous Underwater Robots ...... 22

2.5 Communication and Guidance ...... 25

2.6 Marine Surface Robots ...... 29


2.7 Autonomous Operation ...... 35

2.8 Multi-Axle Suspension ...... 40

2.9 Literature Review Findings ...... 45

3 Littoral Zone Path Planning Algorithm 47

3.1 Preliminary Testing ...... 55

3.2 Image Testing ...... 58

3.3 Summary ...... 61

4 Prototype System 63

4.1 Functional Requirements ...... 64

4.2 Physical Requirements ...... 65

4.3 First Concept ...... 66

4.4 Second Concept ...... 67

4.5 Third Concept ...... 69

4.6 Fourth Concept ...... 70

4.7 Fifth Concept ...... 71

4.8 Sixth Concept ...... 72

4.9 Seventh Concept ...... 73

4.10 Eighth Concept ...... 75

4.11 Ninth Concept ...... 76

4.12 Tenth Concept ...... 77

4.13 Eleventh Concept ...... 77


4.14 Proof of Concept Testing ...... 78

4.15 Concept Selection Criteria and Selected Concept ...... 80

4.16 AAR Prototype ...... 81

4.17 Summary ...... 83

5 Experimental Testing and Results 88

5.1 Functional Requirements ...... 88

5.2 Physical Requirements ...... 93

6 Conclusions and Recommendations for Future Work 101

6.1 Conclusions ...... 101

6.2 Recommendations for Future Work ...... 102

A Raw Data 114


List of Figures

1.1 Path planning for an autonomous, amphibious robot to navigate through the littoral zone ...... 2

2.1 Boeing's Extra Large Unmanned Underwater Vehicle (XLUUV) competition platform Echo Voyager [7] ...... 8

2.2 Boeing's 15.5 m XLUUV Orca [8] ...... 8

2.3 Autonomous Underwater Robot (AUR) Jubilee with separate station keeping controls [9] ...... 11

2.4 Omni-Paddle 2 [10] ...... 12

2.5 Swimming Humanoid Robot [11] ...... 13

2.6 Whegs/wheel prototype [13] ...... 14

2.7 McGill University's robot AQUA [29] ...... 21

2.8 Shoreline Detection Autonomous Surface Vessel (ASV) [31] ...... 22

2.9 Watertight compartments for navigation equipment [31] ...... 23

2.10 Unmanned Underwater Vehicle (UUV) Kingfish [34] ...... 24

2.11 Teledyne's Littoral Battlespace Sensing-Glider (LBS-G) uses changes in buoyancy to autonomously propel itself through the ocean [34] ...... 25

2.12 Unmanned Aerial Vehicle (UAV) charging off of the back of an Unmanned Ground Vehicle (UGV) [40] ...... 28

2.13 Autonomous boat with Synthetic Aperture Sonar [41] ...... 29

2.14 Omni-directional camera used to detect shoreline. The green radius marks distance to shore [41] ...... 30

2.15 Automated Ships Ltd and Kongsberg design of the unmanned fully-automated offshore vessel Hronn [42] ...... 31

2.16 VAIMOS autonomous robotic sailboat [43] ...... 32

2.17 ADA UBC autonomous robotic sailboat [48] ...... 33

2.18 Johns Hopkins Applied Physics Laboratory's Unmanned Aerial-Aquatic Vehicle (UAAV) Flying Fish [49] ...... 34

2.19 Clearpath's marine autonomy research platform [50] ...... 35

2.20 Clearpath's UGV amphibious research platform Warthog [51] ...... 36

2.21 Undulating Propulsion System (UPS) Velox [53] ...... 37

2.22 Autonomous Underwater Vehicle (AUV) ODIN [54] ...... 38

2.23 Mars rover Spirit [64] ...... 41

2.24 Bogie suspension system for the Mars rover design [67] ...... 42

2.25 Rocker bogie assembly designed by Don Bickler at the Jet Propulsion Laboratory (JPL) [64] ...... 44

3.1 Schematic of the littoral zone and the position of the staging and goal waypoints ...... 48

3.2 Algorithm flow chart ...... 49

3.3 Augmented Reality (AR) code used to identify the pose of the robot ...... 50

3.4 AR code shown on top of the Autonomous Amphibious Robot (AAR). The code also functions as a launching and landing pad for the quad-copter ...... 51

3.5 Pioneer 3-DX robot with AR code mounted on top ...... 52

3.6 Keypoints node identification and size classification ...... 52

3.7 Identification and location of the obstacles by the algorithm ...... 53

3.8 Inflation zones in dark gray shown around the obstacles identified in blue and green. The goal pose is shown with the green arrow and the calculated path through the obstacle field is shown in purple ...... 54

3.9 Pioneer 3-DX robot with SICK Light Detection and Ranging (LiDAR) unit mounted on top ...... 56

3.10 AAR mini-ITX computer [68] ...... 57

3.11 Image utilizing known position of Pioneer 3-DX robot in office with open door ...... 58

3.12 Mapping of an office environment with the Pioneer 3-DX shown and inflation zones around the perceived obstacles ...... 59

3.13 An image of the obstacle field in front of the Pioneer 3-DX. Only the obstacle on the left can be imaged with the LiDAR. The map made from the obstacles identified and line traced are shown on the right ...... 60

3.14 Pioneer 3-DX robot navigating through pylons in the Mechatronics and Robotics Systems (MARS) Laboratory ...... 61

3.15 Littoral zone quad-copter image and the associated algorithm line tracing from Wawa, Ontario, Canada. Image by Shutterstock.com ...... 62

3.16 Littoral zone quad-copter image and the associated algorithm line tracing over Lake Retba, Senegal. Image by Shutterstock.com ...... 62

3.17 A grading of 5555 ...... 62

4.1 Voith-Schneider propeller [70] ...... 67

4.2 Prototype Borg-Voith One with twin Voith-Schneider propellers ...... 68

4.3 Prototype Borg-Voith Two with higher gunnel ...... 69

4.4 Designed and built by the 2013/2014 Ontario Tech University (Ontario Tech) Capstone Design Team [71] ...... 70

4.5 Pivoting suspension system for adjustable centre of buoyancy ...... 71

4.6 Amphibious robot design with deployable rigid floatation ...... 72

4.7 Amphibious robot design with deployable torpedo shaped floatation ...... 72

4.8 Lego(c) bogie suspension test model built by Dr. Scott Nokleby [73] ...... 73

4.9 Rocker bogie suspension with torpedo shaped floatation ...... 74

4.10 Weighted rear wheel and limited rotation of rear bogie concept ...... 75

4.11 Flotation and water propulsion added to bogie design ...... 76

4.12 Ontario Tech Capstone Design Team final design [74] ...... 77

4.13 Eleventh concept with key components itemized: 1) wheels, 2) rocker bogie legs, 3) flotation hull, 4) GPS, 5) hydrogen tank, 6) axles, 7) torsion bar, 8-15) drive control, 16) jet drive motor, 17) jet drive ...... 78

4.14 Half-scale prototype built by the 2014/2015 Ontario Tech Capstone Design Team ...... 79

4.15 Full-scale model built by the 2014/2015 Ontario Tech Capstone Design Team [74] ...... 81

4.16 Component layout of the AAR ...... 82

4.17 AAR climbing obstacles ...... 83

4.18 AAR climbing an incline ...... 84

4.19 AAR traversing through the littoral zone ...... 85

4.20 AAR switching from land drive to jet drive ...... 85

4.21 AAR under dual jet drive power ...... 86

4.22 Single jet drive steering testing ...... 86

4.23 Freezing temperature land drive systems testing ...... 87

5.1 AAR operating in snow ...... 89

5.2 AAR navigating around an obstacle identified by both the LiDAR and the cost map overlay ...... 90

5.3 Result of operating the AAR with an obstacle below the LiDAR line of sight without cost map overlay ...... 91

5.4 Canadian National Research Council Drone restriction zone map ...... 92

5.5 AAR at the goal position at the completion of the second land test ...... 93

5.6 Third land test overhead image (obstacles added to the environment are indicated) ...... 94

5.7 Third land test with obstacles and navigational path shown in RVIZ (objects in the RVIZ screenshot are indicated) ...... 95

5.8 AAR at the goal position at the completion of the third land test ...... 96

5.9 Layout of obstacles for the water test ...... 97

5.10 Algorithm output of the littoral zone with obstacles identified ...... 98

5.11 Robot Visualization (RVIZ) output showing overhead obstacles identified with LiDAR map overlay ...... 98

5.12 Damage sustained to the starboard centre axle ...... 99

5.13 Damage sustained to the port suspension locking angle ...... 99

5.14 AAR entering Lake Ontario with jet drives engaging ...... 100

A.1 Littoral zones from around Canada grading ...... 114

A.2 Littoral zones from around the world excluding Canada grading ...... 115

A.3 Licence from Nav Canada to fly in drone restricted area ...... 116


Glossary

AAR Autonomous Amphibious Robot.
ACV Amphibious Combat Vehicle.
AI Artificial Intelligence.
AR Augmented Reality.
ASV Autonomous Surface Vessel.
AUR Autonomous Underwater Robot.
AUV Autonomous Underwater Vehicle.
CAD Computer Aided Design.
CML Concurrent Mapping and Localization.
DARPA Defense Advanced Research Projects Agency.
EKF Extended Kalman Filter.
GPS Global Positioning System.
IFREMER Institut Français de Recherche pour l'Exploitation de la Mer.
INS Inertial Navigation System.
IRID International Research Institute for Nuclear Decommissioning.
JPL Jet Propulsion Laboratory.
LBS-G Littoral Battlespace Sensing-Glider.
LiDAR Light Detection and Ranging.
LNS Littoral Navigation System.
MARS Mechatronics and Robotics Systems.
MIT Massachusetts Institute of Technology.
NASA National Aeronautics and Space Administration.
NHTSA National Highway Traffic Safety Administration.
Ontario Tech Ontario Tech University.
PID Proportional Integral Derivative.
ROS Robot Operating System.
ROV Remotely Operated Vehicle.
RVIZ Robot Visualization.
SAS Synthetic Aperture Sonar.
SASR Site Acquisition and Scene Inspection.
SLAM Simultaneous Localization and Mapping.
SMC Sliding Mode Control.
UAAV Unmanned Aerial-Aquatic Vehicle.
UAV Unmanned Aerial Vehicle.
UBC University of British Columbia.
UGV Unmanned Ground Vehicle.
UPS Undulating Propulsion System.
USNS United States Navy Ship.
UUV Unmanned Underwater Vehicle.
XLUUV Extra Large Unmanned Underwater Vehicle.


Chapter 1

Introduction

The field of amphibious, autonomous robots is an area of increasing development. Amphibious robots with the ability to handle rough terrain with obstacles are currently under-represented in this field. The 2011 Fukushima disaster offered an unequalled opportunity for autonomous robots. Unfortunately, the state-of-the-art at the time did not fulfill the need, leaving researchers scrambling to fill the gap. On March 18, 2015, a robot designed by the International Research Institute for Nuclear Decommissioning (IRID) and Hitachi was sent into the Fukushima Daiichi Nuclear Power Plant and became stuck in a floor grating after travelling only 14 m while searching for the melted fuel rods [1]. The requirement for the robot was to travel more than 40 m in an environment that included water and land-based obstacles to document the damage to the reactor core, the location of the fuel rods, and the state of the area in preparation for sending in a future amphibious robot. The robot failed before completing any of these goals. The purpose of this dissertation is to expand the knowledge of amphibious, autonomous robots.

The author's intimate understanding of the water comes from serving in the Canadian Coast Guard Auxiliary, 35 years of sailing experience, and two years and 15,000 km of ocean experience sailing a 12 m boat from Oshawa to South America. These experiences have given the author a unique understanding of the water and the technical requirements of operating a mechanical system successfully through a wet, caustic environment. Amphibious robots that can move through heavily littered littoral zones have not been addressed in the current literature. The littoral zone is the area close to and including the shoreline. It is the transition zone from land to water. Further work in this field will bring a new, much-needed understanding of the limitations, difficulties, and critical knowledge needed for an amphibious robot to autonomously navigate a safe path into, through, and out of the water. The research conducted in this dissertation focuses on the development of an algorithm that allows autonomous, amphibious robots to navigate through the littoral zone as shown in Figure 1.1.

Figure 1.1: Path planning for an autonomous, amphibious robot to navigate through the littoral zone.

The objective of this dissertation is to add a novel contribution to the scientific robotic knowledge base: a Littoral Navigation System (LNS) that can navigate through the littoral zone over land as well as water. Success will be measured by the attainment of the following goal: the proposed LNS will be implemented on an amphibious robot and will be pre-loaded with waypoints located in the littoral zone. The system will be expected to navigate to each waypoint autonomously while avoiding obstacles. The robot will select the route that presents the least difficulty for navigating through the littoral zone.
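The route-selection behaviour described above can be sketched as a small Python example. This is an illustrative sketch only, not the thesis implementation: the `Segment` class, the difficulty values, and the `route_cost`/`select_route` helpers are all hypothetical names standing in for the cost information the real LNS would derive from its sensors and cost maps.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Segment:
    """One leg of a candidate route through the littoral zone (hypothetical)."""
    name: str
    difficulty: float      # e.g., a combined slope/substrate/obstacle penalty
    blocked: bool = False  # True if an obstacle makes this leg impassable

def route_cost(route: List[Segment]) -> float:
    """Total difficulty of a route; routes with a blocked leg cost infinity."""
    if any(seg.blocked for seg in route):
        return float("inf")
    return sum(seg.difficulty for seg in route)

def select_route(candidates: List[List[Segment]]) -> List[Segment]:
    """Pick the obstacle-free candidate route with the least total difficulty."""
    best = min(candidates, key=route_cost)
    if route_cost(best) == float("inf"):
        raise RuntimeError("no obstacle-free route through the littoral zone")
    return best

# Example: the shorter direct route is blocked by debris, so the robot
# should select the longer but passable detour.
direct = [Segment("beach", 2.0), Segment("shallow water", 3.0, blocked=True)]
detour = [Segment("dune", 4.0), Segment("clear channel", 3.5)]
chosen = select_route([direct, detour])  # selects the detour (cost 7.5)
```

The key design point mirrored here is that impassable routes are priced at infinity rather than removed from the candidate list, so "least difficulty among obstacle-free routes" falls out of a single minimization.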

1.1 Contributions

The contributions of the research contained in this dissertation add to the knowledge of autonomous navigation of an amphibious robot through the littoral zone. This dissertation builds on the research documented in the literature review contained in Chapter 2. The published work regarding amphibious robotics is limited and does not contain many examples of real-life testing. The development of an algorithm to allow amphibious robots to navigate autonomously through the challenging littoral zone environment represents the main contribution of this dissertation. The testing performed in this dissertation verifies the goal of developing a novel algorithm for use on amphibious, autonomous robots to traverse the littoral zone. This algorithm is designed, and has been tested, to be adaptable to multiple robotic platforms. The contributions presented in this dissertation add to the body of knowledge regarding autonomous, amphibious robots navigating through the littoral zone.

The development of the AAR was also a contribution of this dissertation. To properly test the littoral zone path planning algorithm developed, an amphibious robot platform needed to be designed and built. Chapter 4 discusses the many iterations of design that were required to finally build a successful robotic platform to test the algorithm. The design of the AAR gives the robot considerable suspension and obstacle surmounting ability, separate water and land drive systems, redundant drives for both water and land drive systems, a landing pad for the quad-copter that incorporates a QR code, as well as a sensor package including GPS, LiDAR, wheel encoders, a water submersion sensor, and an accelerometer.

1.2 Outline

The following chapters of this dissertation discuss the development of the algorithm for autonomous navigation through the littoral zone for an amphibious robot. Chapter 2 presents a literature review of the current research regarding autonomous navigation and amphibious robotic systems. Chapter 3 gives an overview of the algorithm, the research conducted, and the testing of the imaging algorithm. Chapter 4 describes the development of an amphibious robot platform to test the algorithm. Chapter 5 discusses the experimental testing and presents the results. Finally, Chapter 6 presents the conclusions and recommendations for future work.

Chapter 2

Literature Review

Autonomous, amphibious robots perform navigation, positioning, and other functions on land and above and/or below water. These functions are also performed in combinations of these environments, such as swamps and bogs. Due to large funding grants from sources such as the Defense Advanced Research Projects Agency (DARPA), a good part of the current research is focused on navigating a robotic amphibious platform, normally human guided, through complex environments such as natural and man-made disasters. An environment exposed to a natural disaster adds the complexity of natural and man-made storm debris and complex energy environments, including nuclear radiation. A notoriously hard area for an amphibious craft to navigate is the littoral zone. This literature review comprises eight sections that review the following aspects of autonomous amphibious robotics: different modes of locomotion, navigation of amphibious robots, sensing systems, autonomous underwater robots, communication and guidance, marine surface robots, autonomous operation, and multi-axle suspension.

Major issues to overcome in an autonomous, amphibious robotic system are: communication, avoidance of water impingement, localization on land as well as above and below water, multidimensional Cartesian orientation, propulsion over varied terrain, extreme temperature swings, buoyancy, power consumption/creation, and longevity. Operating in and under the water requires the most robust of systems and necessitates strong solutions to locomotion, sensing, communication, and navigation. Although these concerns are not exclusive to amphibious robotic systems, the blending of these requirements in a hostile environment does make for a multi-faceted design. This literature review examines the topic of amphibious robot systems and reviews current solutions in the amphibious robot field. It was used to develop an understanding of current designs in the field in order to assist in the design of an amphibious robot system that can autonomously navigate through the littoral zone. This LNS will be tested navigating littoral zones through terrain consistent with that of Ontario, Canada, which will include traversing obstacles as large as 25 cm, substrates such as sand, mud, foliage, and gravel, and temperatures that fluctuate between 30°C and -20°C.

In October 2014, the United States Office of Naval Research released information regarding research it had been conducting on multiple autonomous surface vessels that can work together to offensively swarm an adversary. The goal of the project was to provide escort or protection to larger vessels while in tight maneuvering areas such as harbours or rivers. This research was done in response to the attack on the USS Cole in October 2000. This autonomous system allows multiple surface vessels to sense and communicate with each other in order to work in a cohesive fashion and swarm a vessel that has been deemed a threat. Unfortunately, details regarding how the system operates in terms of communication, sensors, and processor particulars are not available due to military secrecy [2].

The military spends considerable funds each year to understand and control the littoral zone. For the 2014 fiscal year, the Amphibious Combat Vehicle (ACV) program of the United States military set aside $137 million in research and development [3]. In 2020 the spending for this program more than tripled to $447 million [4]. This type of vehicle is also being studied for the addition of autonomous functionality that could make an amphibious vehicle a desirable unmanned supply delivery system for dangerous or difficult to access areas.

In 2016 the British Royal Navy conducted trials of its autonomous, unmanned surface vessel Blade Runner. The testing of this vessel exhibited the level the technology is currently able to achieve and the level of commitment and confidence the Royal Navy has in this technology [5]. In September 2016, Lockheed Martin, as part of a DARPA sponsored initiative, conducted its first underwater, unmanned aircraft launch from an unmanned underwater vehicle. The AUV named Marlin launched the UAV named Vector Hawk. This testing was conducted to expand the capabilities of autonomous robots in the air and sea as well as to expand the collaboration of robots towards the common goal of military information gathering [6].

In January 2017 the American Navy held a design competition for an Extra Large Unmanned Underwater Vehicle (XLUUV). The winner of that competition was Boeing with their Echo Voyager [7], as shown in Figure 2.1. Two years later Boeing was awarded a $43 million contract to build the first production model XLUUV, named Orca, as shown in Figure 2.2. In May 2020 the contract was expanded to purchase four more systems [8]. The mission goal of Orca is to reduce the tedious and dangerous responsibilities that manned submarines are currently performing, such as anti-submarine warfare, mine warfare, and intelligence collection. This conversion from the current completely manned Navy to one that is a blend of manned and unmanned will continue into 2045. The American Navy is also developing a series of unmanned surface vessels that will increase the offensive number of vessels for less money, while increasing the number of targets other militaries would have to locate during an engagement.

Figure 2.1: Boeing's XLUUV competition platform Echo Voyager [7].

Figure 2.2: Boeing's 15.5 m XLUUV Orca [8].

Amphibious robot design has come to the forefront of the public's attention due to a number of natural disaster events that required specialized robots to keep humans out of harm's way. One such event was the 2011 tsunami in Japan, which required a varied response including victim recovery and dealing with nuclear reactor cooling failure. The requirement that robots be able to traverse a multitude of mediums, such as water and broken ground, has introduced a surge in research on the topic. This has led to an unprecedented financial investment, such as the DARPA challenge, and an increase in knowledge in the field of AARs.

2.1 Modes of Locomotion

An amphibious robot must be capable of navigating on ground as well as on top of or under water. This blending of capabilities has historically meant that the amphibious robot is less capable at either form of locomotion when compared to a robot that has only one form of locomotion over one type of medium. A means of propulsion must be chosen so that the two modes complement and balance each other, are both efficient, and do not interfere with one another. Examples would be snake-like crawling and swimming movements or a turtle's walking/swimming locomotion. Propulsion above the water can be achieved by means such as flotation, propellers, jet drive, paddle wheels, swimming, and sails. Propulsion underwater can be achieved through buoyancy control, propellers, flippers, swimming, or water jet thrust. Although underwater propulsion was considered for this work, it was not utilized due to the complexity that is associated with it. On land, robots maneuver utilizing wheels, treads, legs, crawling, and flippers. All of these robot propulsion solutions tend to be powered by electric motors. The electric motors are usually powered by a cable tether, battery, solar, fuel cell, or an internal combustion engine using propane, gasoline, or diesel.

As important as locomotion is, it is just as important that the robot be proficient at station keeping. This is because most robots do not simply travel between waypoints but must also perform location-specific work. Transitioning between propulsion and station keeping efficiently is important since it may take place many times during a mission. Once at a waypoint, a robot may be tasked to hold that position in order to perform work such as taking samples. On land this is not a requirement that necessitates specialized equipment, due to friction and gravity. Station keeping in the water poses multiple challenges such as negotiating current, wind, waves, drift, and overshoot. The propulsion for station keeping is not always as optimal as the propulsion required to traverse to a waypoint, which is why some robots have two separate systems. A way to optimize the efficiency of transitioning between propulsion and station keeping is to keep the two systems physically separated. It is also advantageous to keep the propulsion and station keeping control systems separated, to enable each system to be individually calibrated for precision. In this way the equipment is task specific and optimized to perform its task. An example of this optimization would be to allow the propulsion system to focus on speed and power while the station keeping system is optimized for precision position control. Keeping the water propulsion system separate from the water station keeping system also gives redundancy if one system fails and the robot needs to navigate to shore.

Mobile amphibious platforms that utilize separate propulsion units for water and land navigation also gain redundancy that assists the robot in navigating through a mixed, obstacle-rich environment if one system should fail. An additional benefit of this redundancy is being able to use both types of propulsion in littoral areas. Using both propulsion systems increases the odds of success in transitioning from water to ground or ground to water. Moving in a mixture of environments is also a difficulty since each environment has its own nuances. Mediums such as swamps, bogs, tidal plains, shorelines, storm damage, or natural disaster areas can be better navigated with a robot that is able to utilize both types of propulsion at the same time. An example of this would be a walking or rolling robot combined with a water propulsion system such as a propeller or water jet system. In a mixed terrain environment the legs/wheels could be utilized to maneuver the robot over floating obstacles while the water-based thrust maintains forward motion through the water.

Research into the Jubilee AUR demonstrated station keeping utilizing two forms of propulsion. One form of thrust is used for forward propulsion and the other for station keeping. Research with the Jubilee, as shown in Figure 2.3, found that adding independent thrusters for station keeping not only kept power usage down, since the main propulsion unit used more power than the station keeping thrusters, but that the additional thrusters could also assist in low power attitude adjustment while underway.

Figure 2.3: AUR Jubilee with separate station keeping controls [9].

The thrusters on the Jubilee were sized to adjust to underwater currents. In this way the thrusters were quite small and unobtrusive. This allowed an under-actuated robot to become fully actuated [9]. After the Japanese tsunami in 2011 there was a concerted effort to address the shortfalls of rescuing injured people after a natural disaster.

Figure 2.4: Omni-Paddle 2 [10].

A result of this focus of research was the development of a new mode of locomotion, which was named the Omni-Paddle 2, as shown in Figure 2.4. The goal of this design was to traverse the difficult terrain caused when debris is mixed with water after a disaster such as a tsunami [10]. This system was designed to operate in the littoral zone without the robot becoming fouled in debris. To achieve this, the robot has four spherical floats with grooves turned into them that spin to give propulsion. The spheres rotate in both directions, which provides thrust in any chosen direction. The driving of these spheres allowed the test vehicle to move over liquid as well as ground. Although the prototype was successful, it was highly inefficient. The next step by the authors of that work is to scale up the model to test if scale will affect performance.

In an attempt to replace an underwater diver, Li, Shimizu, and Ito from the University of Tokyo proposed a swimming humanoid robot as shown in Figure 2.5. The robot prototype wears a watertight suit over its robotic form and is designed to maintain neutral buoyancy [11]. Although they achieved limited success in generating forward thrust as well as turning and upward moments, the robot has the disadvantage that the thrust created by the flutter kicks of the two legs interfered with each other. Further work by that team will be to scale the prototype up and attempt to reduce the interference issues.

Figure 2.5: Swimming Humanoid Robot [11].

Bishop from the US Naval Academy is conducting research on Sea-Dragon, an amphibious, six-legged, Whegs (wheel-legs) design. Whegs is a biologically inspired form of locomotion that mimics the manner in which a cockroach walks. The Sea-Dragon is a remotely operated, weaponized platform designed to support military forces in the littoral zone. Due to the secrecy shrouding the research, no images were found and limited information is available regarding its functioning details. The paper did describe the vehicle as a hybrid Whegs/walking/track-like drive. Two deployable outriggers flank the tread centre section, and each of these floating pontoon outriggers encloses some form of Whegs system [12].

Further wheeled/paddle systems were discussed in a paper from Sun and Ma. Their proposed eccentric paddle mechanism is shown in Figure 2.6. Their design proposes a wheel with a deployable paddle that allows the robot a modified “walking” gait. When the paddle is retracted, the unit functions as standard rotational wheel propulsion. The fabricated prototype, with carbon fibre paddles, has shown promise [13].

Figure 2.6: Whegs/wheel prototype [13].

2.2 Navigation of Amphibious Robots

Navigation for an autonomous robot is the process of determining a suitable and safe path between a starting point and a goal location [14]. Although navigation of autonomous ground-based and aerial robots has been researched extensively, research into surface or underwater vessels is limited, and there is very little documented research into amphibious navigation systems. This lack of research is partially due to the inherent difficulty of navigating an autonomous robot in and out of the water. A further limiting factor is the complexity involved in testing these systems in a laboratory or natural environment. Blending the available land-based and limited water-based navigation research into a system appropriate for amphibious robots adds a level of complexity because terrestrial, surface, and underwater environments each present their own technical difficulties and do not necessarily combine well. Terrestrial robots must contend with undulating terrain covered in hard obstacles to navigate around. Underwater robots must navigate with no Global Positioning System (GPS) communication, extremely limited visual contact, and underwater currents that push them off course, all while operating navigational sensors within a caustic, pressurized environment. Surface vessels must filter out external sensor noise from waves, operate navigational sensors within a caustic environment, and differentiate between open water and a complex shoreline environment. These combined challenges have kept amphibious robots from advancing technologically as quickly as other robots.

Since there has been so much research on terrestrial systems, this section of the literature review will focus on the limited number of water surface and underwater robot navigation systems and their ability to be hybridized with a terrestrial-based system. The seven major robot control systems for surface and underwater vessels are: Sliding Mode Control (SMC), nonlinear control, adaptive control, Proportional Integral Derivative (PID), neural network control, fuzzy control, and visual servo control [15]. Of these, SMC and PID are the simplest and require the least computing power to handle the complexity of the control algorithms.

Early research on the SMC controller was performed in the former Soviet Union in the late 1960s and it is a well-established approach [16]. SMC is a control strategy that provides a control action restricting the plant to behave according to some pre-defined simple dynamic. This dynamic is called the sliding surface, and when the plant follows this sliding surface it is in sliding mode [17]. There are two main advantages of using SMC as a control system. Firstly, SMC is not sensitive to imprecise models or approximations of complex model behaviours, which means a large processor is not required to crunch large reams of data to get good performance. Secondly, SMC allows a higher-order system to be represented by an equivalent first-order system, which in turn reduces the complexity of the control algorithm. A negative aspect of the SMC controller is that it comes with high control activity, called control chattering. This is undesirable since it induces unwanted actuator and servo wear, high energy usage, and course “hunting”. The SMC controller also neglects coupled terms such as Coriolis and restoring effects [18].
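The sliding-surface idea can be pictured with a short sketch. The following is a minimal illustration only, assuming a toy double-integrator heading plant; the surface slope `lam`, switching gain `k`, and boundary-layer width `phi` (a common saturation trick used to soften the chattering described above) are invented values, not parameters from the cited work.

```python
def smc_control(error, error_rate, lam=1.0, k=2.0, phi=0.5):
    """One sliding-mode control action for a toy plant.

    The sliding surface is s = error_rate + lam * error; the control
    pushes the state toward s = 0.  Inside a boundary layer of width
    phi, a saturation function replaces the discontinuous sign() term,
    which reduces control chattering.
    """
    s = error_rate + lam * error
    sat = max(-1.0, min(1.0, s / phi))  # smoothed sign(s)
    return k * sat


# Toy double-integrator heading plant: heading'' = u.
dt, target = 0.05, 1.0
heading, rate = 0.0, 0.0
for _ in range(600):
    u = smc_control(target - heading, -rate)
    rate += u * dt
    heading += rate * dt
```

Once the state reaches the surface, the error decays according to the simple first-order dynamic chosen by `lam`, which is the reduction in effective system order noted above.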

The main advantages of the PID control system are its inherent simplicity and minimal processing requirements. PID control is a control loop feedback mechanism that calculates an error value and works to drive that error to zero [19]. PID controllers are utilized extensively in many industries. Their wide use is due to their inexpensive cost, robust design, and efficiency in holding the set point while minimizing fluctuation. A drawback to PID control is its poor transient performance, which results in course overshoot or an under-damped response [20].
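The error-minimizing feedback loop described above can be sketched in a few lines. This is an illustrative, untuned controller driving a toy first-order plant, not an implementation from any of the cited papers:

```python
class PID:
    """Minimal discrete-time PID controller (illustrative, untuned gains)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        # Error value the loop works to drive to zero.
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0
        if self.prev_error is not None:
            derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Example: steer a toy heading plant toward 90 degrees.
pid = PID(kp=2.0, ki=0.01, kd=0.5, dt=0.1)
heading = 0.0
for _ in range(400):
    command = pid.update(90.0, heading)
    heading += command * pid.dt  # toy plant: turn rate proportional to command
```

The integral term illustrates the transient weakness noted above: it accumulates during the approach and produces a small, slowly decaying overshoot past the set point.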

Collision avoidance is a primary function for robots. Go-to-goal behaviour, collision avoidance, and working with other moving targets, including other robots and people, have all been studied. A paper from Wang of Cornell University discussed robot avoidance algorithms for both individual robots and robots within teams [21].

An amphibious robot must deal with a number of combined difficulties when navigating because of the variety of environments it can encounter, and a blend of solutions is best suited to this problem. An example of this form of hybrid navigation system was proposed by Santhakumar and Asokan. They suggested that a hybrid system combining SMC with a classical PID control method reduces the tracking errors arising from disturbances [18]. This team from the Indian Institute of Technology modeled their SMC/PID hybrid control logic on an AUV in MATLAB® with three controlled variables: surge, pitch, and yaw. The control inputs are thrust, rudder angle, and bow planes. The simulated results indicated that the hybrid tracking system performed better than either the SMC or PID individually. The hybrid works by running both the SMC and PID controllers concurrently while a supervisory controller continually switches between the two systems based on thresholds for the absolute error and the error rate. The next step for the team is to load the hybrid control system onto a physical AUV, which has not yet been attempted.
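The supervisory switching idea can be pictured with a small sketch. The threshold names and values below are assumptions for illustration; they are not the switching criteria used by Santhakumar and Asokan:

```python
def supervisory_select(error, error_rate, err_threshold=0.5, rate_threshold=1.0):
    """Pick which controller's output to apply at this instant.

    Far from the set point (large error or error rate) the robust SMC
    output is used; near it, the smoother PID output takes over.  Both
    controllers run concurrently; only the selection switches.
    """
    if abs(error) > err_threshold or abs(error_rate) > rate_threshold:
        return "SMC"
    return "PID"
```

For example, `supervisory_select(2.0, 0.0)` selects the SMC output, while `supervisory_select(0.1, 0.2)` hands control to the PID.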

Systems that utilize vision navigation can be divided into roughly two categories: systems that require previous knowledge of an area before traversing through it and systems that are able to build a map of an area as they navigate through it. Systems that require a map loaded into them before moving are known as metric map-using systems. Systems that are able to sense an environment and map it as they navigate are known as metric map-building systems [14]. Metric map-using systems require less processing power and are the faster of the two types because data for a known area can be pre-loaded into the processor. Having this data pre-loaded allows the robot to pre-calculate an optimum path to the waypoint before it heads off and then simply react to obstacles as it perceives them. The benefit of this type of navigation is that the overall path can accommodate known obstacles such as ridge lines. The major drawback of this type of system is that having an up-to-date, detailed map of an area is rarely feasible, since topography can change significantly with natural forces such as tides, flooding, and wind storms. Amphibious systems are especially exposed to ever-changing maps and boundaries.
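The pre-planning step of a metric map-using system can be illustrated with a breadth-first search over a small, pre-loaded occupancy grid. The grid and coordinates below are invented for illustration:

```python
from collections import deque


def plan_path(grid, start, goal):
    """Breadth-first search over a pre-loaded occupancy grid.

    grid[r][c] == 1 marks a known obstacle; returns a shortest list of
    cells from start to goal, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parents:
                parents[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None


# A small map with a wall (like a ridge line) the robot must route around.
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
path = plan_path(grid, (0, 0), (3, 0))
```

If a natural force such as flooding changes the real terrain after the map was loaded, the pre-computed path may no longer be valid, which is the drawback described above.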

Metric map-building systems gather external data from sensors such as vision, sonar-based, or laser range-finding systems. These systems collect data about the environment by emitting sound or light waves and timing the intervals for that information to bounce back to the receiving sensor array. With this data the robot's on-board processor builds a local map and then navigates through the map it has built. Two ways to process this data are Simultaneous Localization and Mapping (SLAM) and Concurrent Mapping and Localization (CML). The local map specifies obstacles and what the processor perceives as navigable free space, and the robot then plots the most efficient path through that free space to the goal waypoint [14]. Significant drawbacks of this type of system are the major processing power required to handle the large amounts of data being perceived, as well as the inability to choose the most direct course to a waypoint that accounts for large topographical obstacles such as craters: the robot is continuously navigating without seeing the full layout between waypoints.

SLAM is the process by which a robot builds a map of an environment using on-board or external sensors while simultaneously using that generated map to determine its location within it. In SLAM, both the trajectory of the robot and the locations of all of the landmarks are estimated by the processor without the need for any a priori knowledge of location [22]. One of the advantages of utilizing SLAM is that it can employ multiple sensory inputs, such as laser range finding, sonar range finding, and GPS data, and incorporate them into one map. Using these sensory inputs can create a lot of data for the robot's on-board processor to manipulate, which can cause processor lag. The algorithms built into SLAM handle this large amount of data using various filters. One such algorithm is the Extended Kalman Filter (EKF) (EKF-SLAM), which uses the maximum likelihood algorithm for data correlation [23]. Another algorithm is the Rao-Blackwellized particle filter, or FastSLAM, which recursively estimates the full posterior distribution over robot pose and landmark locations [24]. The best type of filter depends on the type and detail of the data being collected, and sometimes multiple filters are used depending on the application. Trial and error on the type of filter to use in SLAM for a given application is recommended since so many variables come into play [25]. Based on the research with land and water based robots, the AAR will start testing utilizing an EKF-SLAM algorithm [26].
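To make the EKF-SLAM idea concrete, the following sketch runs predict/update cycles for a one-dimensional robot observing a single landmark. This is a toy illustration only, not the AAR's implementation; the noise variances and positions are invented:

```python
def ekf_slam_1d(x, P, u, z, q=0.1, r=0.05):
    """One predict/update cycle of a toy 1-D EKF-SLAM.

    State x = [robot position, landmark position]; P is its 2x2
    covariance; u is the odometry motion; z is a measured range to the
    landmark (z = landmark - robot + noise); q and r are invented
    motion and measurement noise variances.
    """
    # Predict: only the robot moves, so motion noise inflates P[0][0].
    x = [x[0] + u, x[1]]
    P = [[P[0][0] + q, P[0][1]],
         [P[1][0], P[1][1]]]
    # Update with measurement model h(x) = x[1] - x[0], Jacobian H = [-1, 1].
    innovation = z - (x[1] - x[0])
    s = P[0][0] - P[0][1] - P[1][0] + P[1][1] + r  # innovation variance H P H^T + R
    k0 = (P[0][1] - P[0][0]) / s                   # Kalman gain K = P H^T / s
    k1 = (P[1][1] - P[1][0]) / s
    x = [x[0] + k0 * innovation, x[1] + k1 * innovation]
    # Covariance update P = (I - K H) P, expanded by hand.
    P = [[(1 + k0) * P[0][0] - k0 * P[1][0], (1 + k0) * P[0][1] - k0 * P[1][1]],
         [k1 * P[0][0] + (1 - k1) * P[1][0], k1 * P[0][1] + (1 - k1) * P[1][1]]]
    return x, P


# Robot starts at 0 with a rough landmark guess; the true landmark is at 5.
x, P = [0.0, 4.0], [[0.01, 0.0], [0.0, 1.0]]
true_robot = 0.0
for _ in range(20):
    true_robot += 0.5
    z = 5.0 - true_robot  # noise-free range measurement for the demo
    x, P = ekf_slam_1d(x, P, 0.5, z)
```

Note that with only relative range measurements, the robot and landmark estimates can drift together while their relative positions converge, which is a well-known property of SLAM.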

Unlike a land-based robot, an amphibious robot's navigational requirements include not just forward motion and obstacle avoidance but also powered station keeping. This is required when the robot must stop and hold its position to perform work. Station keeping underwater has been performed visually with cameras by keeping man-made underwater pipes or cables on the bottom of the ocean in the camera's field of vision and adjusting the thrust of the AUV accordingly. In an environment lacking these man-made structures, natural structures can also be used to track the camera orientation in order to perform station keeping adjustments. This estimation of the motion of objects or features within a sequence of images is the basis of optical-flow-based solutions [14].

Tuna, Arkoc, Koulouras, and Potirakis produced a MATLAB® simulation of an autonomous boat that could self-navigate around a body of water taking water samples to validate the safety of drinking water. Since their proposed design was surface based, they were able to utilize GPS position data. Their simulation processed raw GPS data through a Kalman filter to determine the error estimates of the position, velocity, and misalignment errors [27]. For the possibility that the GPS receiver loses its signal, the simulation also utilized an Inertial Navigation System (INS) that could calculate the position of the vessel using an accelerometer and gyroscope to compute a dead-reckoning position. After many models were run through the MATLAB®-based simulation studies, it was shown that this type of sensor package could theoretically navigate an autonomous boat successfully with limited processing and memory resources. The simulation also showed that the navigation system was robust enough to continue navigating even if one of its accelerometer or gyroscope sensors failed. However, verification on real-world systems was not done.
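The GPS/dead-reckoning fusion described above can be sketched, in one dimension, as a simple Kalman filter. The variances and measurements below are invented for illustration and do not come from the cited simulation:

```python
def kalman_1d(pos, var, velocity, dt, gps=None, q=0.5, r=4.0):
    """One cycle of a 1-D position filter (invented noise values).

    Predict by dead reckoning, integrating velocity as an INS would;
    when a GPS fix is available, blend it in, weighted by the position
    and measurement variances.  Passing gps=None models a lost GPS
    signal: the filter coasts on dead reckoning and uncertainty grows.
    """
    # Predict (dead reckoning).
    pos = pos + velocity * dt
    var = var + q
    # Update, only when a GPS fix arrived.
    if gps is not None:
        gain = var / (var + r)
        pos = pos + gain * (gps - pos)
        var = (1.0 - gain) * var
    return pos, var


# Five cycles with no GPS fix: the position coasts and uncertainty grows.
pos, var = 0.0, 1.0
for _ in range(5):
    pos, var = kalman_1d(pos, var, velocity=1.0, dt=1.0)
# A GPS fix arrives, pulling the estimate and shrinking the uncertainty.
pos, var = kalman_1d(pos, var, velocity=1.0, dt=1.0, gps=6.2)
```

The same structure extends to the full position/velocity/misalignment state of [27]; the 1-D scalar form simply makes the predict-then-correct cycle visible.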

2.3 Sensing Systems

On December 15, 2016 the non-combat United States Navy Ship (USNS) Bowditch, a drone support ship, was stopped 57 nautical miles northwest of Subic Bay near the Philippines, in an area of the South China Sea considered by the United States to be international waters. The USNS Bowditch was in the process of recovering one of two UUVs when a Chinese Navy ship approached and took the second UUV from the water [28]. After studying the drone, the Chinese government returned the UUV to the American Navy. This action shows the high level of interest in the technology, the lack of public information about it, and the steps the Chinese Navy is willing to take to ensure it does not fall behind in the field of UUV knowledge.

Autonomous robots sense their environment in an effort to react appropriately to their surroundings. All sensing on an amphibious robot must be conducted using waterproof sensors or from behind a waterproof barrier. This data intake can be done with cameras, sonar, ultrasonic, laser, pressure, temperature, encoder, light, sound, bump, or GPS sensors, or, for redundancy, a combination of two or more of these. One technique in which sensor data is used to localize a robot is the collection of visual data to acquire positioning in reference to an initial state. This data acquisition is referred to as Site Acquisition and Scene Re-inspection (SASR) [30]. This technique involves acquiring an image of the surroundings through any one or combination of sensors. The McGill University robot AQUA, shown in Figure 2.7, is a visually guided amphibious robot that uses a trinocular vision system capable of creating 3D maps of its environment and can locate itself within that created map. AQUA's navigation system can also take an image and plot its position; once plotted, any deviation from this image is calculated as movement, and localization is computed through an algorithm. This is very useful underwater, where GPS positioning is impossible to achieve [29]. Another application where this type of positioning is useful is when small, accurate movements must be made in order to perform a task to a high level of accuracy.

Figure 2.7: McGill University's robot AQUA [29].

Synthetic Aperture Sonar (SAS) is the main sensor used by a robot built by the Faculty of Engineering at Porto University in Portugal. This type of sonar sensor generates high resolution images by combining multiple sent/received signals [31]. The shoreline detection ASV is shown in Figure 2.8. This level of detail allows the above-water robot to build detailed riverbed models of the underwater terrain it moves over. The sensors and electrical components are protected in three watertight compartments located on the deck of the vessel, as shown in Figure 2.9.

Figure 2.8: Shoreline Detection ASV [31].

Bomb disposal robots are almost exclusively terrestrial. A team in India worked on an amphibious system that utilizes smart nodal sensors [32]. A smart nodal sensor, also known in North America as a mote, is a node in a wireless sensor network. Each node is capable of performing some processing of data; it can gather sensory information and then communicate this information to other connected nodes in the network. A mote is a node, but a node is not always a mote [33]. The article proposes that multiple robots could converge, or swarm, on a possible bomb target, communicate their multiple perspectives of sensory information, process that information through the multiple motes, and make a decision on the status of the target. This redundancy allows for a multi-robot system that provides various perspectives, collaboration, collective behaviour, and confirmation of critical decision making.

Figure 2.9: Watertight compartments for navigation equipment [31].

Underwater sonar is an effective sensor for penetrating water with sound waves and plotting the bottom of a body of water as a topographic map. Much work has been done with these sensors by various navies around the world to guide submarines and torpedoes. However, utilization and/or scientific documentation of this type of sensor on autonomous robots appears to be limited.

Figure 2.10: UUV Kingfish [34].

2.4 Autonomous Underwater Robots

As of 2016, there were 255 unique configurations of UUVs in service, a number made up of both Remotely Operated Vehicles (ROVs) and UUVs [35]. ROVs are generally tethered vehicles while UUVs are not. During Operation Desert Storm, two U.S. Navy ships were badly damaged by Iraqi sea mines in the Persian Gulf. After those incidents the Navy began investing heavily in mine countermeasures and initiated a program to build UUVs that could disable enemy sea mines. During Operation Iraqi Freedom in 2003, a UUV was used to help clear sea mines from the port of Umm Qasr, marking the first time a UUV was deployed in a combat environment. A larger torpedo-shaped underwater drone designed to carry a large payload of sensors down to a depth of 600 m is currently under development; named Kingfish, it is shown in Figure 2.10. By 2020 the Kingfish was expected to replace the U.S. Navy's Marine Mammal Program, which utilizes dolphins and sea lions to detect sea mines [36].

The U.S. Navy has contracted Teledyne Brown Engineering, Inc. to produce the Slocum Glider as shown in Figure 2.11 [34]. The Littoral Battlespace Sensing-Glider (LBS-G) is a $53.1 million contract to supply 123 units of the 2 m long UUV. Teledyne's glider uses changes in buoyancy to propel itself forward through the water. The Slocum Glider changes its buoyancy mechanically by exploiting the temperature and pressure changes of the ocean. To sink, a valve allows oil from a flexible bladder on the outside of the UUV to flow in, lowering volume while keeping mass constant, which causes the glider to sink. Liquid wax inside internal tubes shrinks and solidifies as the glider descends into colder waters, letting oil in. The glider continues descending to about 1.5 km, at which point another internal storage tank, filled with oil and nitrogen at 20,700 kPa, is used to force oil back into the external bladder. As the now-buoyant glider heads toward the surface, its wings provide lift and forward motion while sensors in the nose gather oceanographic data. The glider eventually surfaces about 5 km away, where surface water temperatures of about 27°C liquefy the wax again. Since liquids are effectively incompressible, the liquid wax forces the oil and nitrogen back into the storage tank, resetting the gas for the next cycle. At the surface, the glider reports its position, transmits data via satellite, and receives any sent commands. This propulsion system means the glider can be at sea for weeks at a time; in 2009, a Slocum Glider managed to cross the Atlantic in 223 days. The US Navy plans to use its fleet of 123 deep and shallow water LBS-G gliders to acquire critical oceanographic data, which will improve positioning of fleets during naval maneuvers, submarine hunting, and hiding [37].

Figure 2.11: Teledyne's LBS-G uses changes in buoyancy to autonomously propel itself through the ocean [34].

2.5 Communication and Guidance

To cover an area efficiently, multiple autonomous robots can communicate and share sensor data in order to drive guidance. This collective behaviour is referred to as swarm intelligence. Work performed by Ragi, Tan, and Chong studied how multiple land, water, and amphibious robots could combine laser, GPS, and sonar data to calculate the optimal way to reach a person moving through a flooded stream scenario. The team used a partially observable Markov decision process to find the best mathematical solution to guide the robots towards the victim moving in a flooded stream [38]. The study also used nominal belief-state optimization to solve the partially observable Markov decision process in order to reduce the computational requirements. The paper was able to show mathematically the plausibility that multiple robots can successfully share sensor data and intelligently decide which robot, operating in which medium, could best reach the victim floating down the stream. The algorithms proposed by Ragi, Tan, and Chong also formulated the best trajectory for the robot to intercept the flood victim, taking into account the speed and path of both the robot and the victim to predict their future interception point. Future work will include expanding the algorithm's decision making to include interception course calculation for multiple victims.

UAVs are being used as a way to incorporate overhead data for UGVs assisting in natural disaster response [39]. It was discussed that overhead data can be compared with a priori data to characterize the current disaster situation. This would give updated, relevant information to ground or water based robots for navigation through zones affected by a natural disaster. The presentation focused on having the UAV data uploaded to a central command station, which would then amalgamate the data and redeploy it as required. This was referred to as disaster information fusion. It was also proposed that a heterogeneous smart gathering and analysis system could relay data to ground stations, including individuals and UGVs.

A limitation to UAV flight time and the distance traveled is the time between charging sessions. A way to extend this time by establishing communication between the UAVs and multiple UGVs is presented in Field Robotics [40]. This communication would allow a UAV at the limit of its battery power to choose a UGV on the edge of the intended flight path, fly down to that UGV, land, and recharge. Having the locations of all the robots available for charging extends the range of the UAV, as shown in Figure 2.12.

In order to properly guide an autonomous, amphibious robot it is very important to understand where the shoreline is. This is significant for two reasons. The first is that the robot must know when to deploy the appropriate locomotive device to suit the medium it is in; this is especially important when the robot has separate propulsion systems for water and land. The second reason for understanding exactly where the shoreline is located is that the shoreline position changes from moment to moment. Factors such as tide, waves, wake, and natural disasters change the position of the shoreline constantly and sometimes permanently. Relying on maps and tide tables is not sufficient for a robot to guide itself to the land/water transition and trust that the water's edge will be where it believes it should be.

Figure 2.12: UAV charging off the back of a UGV [40].

Figure 2.13: Autonomous Boat with Synthetic Aperture Sonar [41].

Subramanian et al. from Virginia Tech have researched an omni-directional camera setup mounted on an ASV, as shown in Figure 2.13, that uses a photographic image to differentiate the delineation of water and land. The research found that the shoreline in an image could not be detected from a grayscale image, since grayscale does not carry sufficient discrimination detail; a colour image, however, did not have this limitation [41]. The omni-directional image, as shown in Figure 2.14, shows the vessel in the centre of the frame. This self-image of the boat allows the robot to accurately calculate its distance to shore. Noise and false detections from waves were filtered out with the use of a circular median filter. This prototype took two seconds to identify the shoreline from an image. Further research by this group will include reducing the processing time to real-time operation, which would be important for an autonomous robot to operate efficiently.
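The grayscale-versus-colour finding can be illustrated with a toy pixel classifier: two pixels of nearly identical grayscale intensity can still be separated by their colour channels. The threshold and pixel values below are invented; this is not the Virginia Tech algorithm:

```python
def grayscale(pixel):
    """Standard luma approximation of a colour pixel."""
    r, g, b = pixel
    return round(0.299 * r + 0.587 * g + 0.114 * b)


def is_water(pixel, blue_margin=20):
    """Crude colour test: call a pixel 'water' when its blue channel
    clearly dominates both red and green (invented threshold)."""
    r, g, b = pixel
    return b > r + blue_margin and b > g + blue_margin


water_pixel = (50, 95, 160)  # bluish
land_pixel = (100, 95, 40)   # brownish, but nearly the same grayscale intensity
```

Here `grayscale(water_pixel)` and `grayscale(land_pixel)` differ by only one intensity level, so a grayscale threshold cannot separate them, while the colour test can.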

Figure 2.14: Omni-directional camera used to detect shoreline. The green radius marks the distance to shore [41].

In November of 2016 the British company Automated Ships Limited and Norway's ship builder Kongsberg Maritime signed a memorandum of understanding to build an unmanned and fully-automated vessel for offshore operation, as shown in Figure 2.15. Automated Ships Limited is designing the vessel, while Kongsberg Maritime will build the mechanical systems, consisting of the hull of the ship and the propulsion systems. The ship will be named Hronn. The design is to be a light-duty, offshore utility ship servicing the offshore energy, scientific/hydrographic, and offshore fish-farming industries. The intended uses include survey, ROV and AUV launch and recovery, light inter-modal cargo delivery, and open-water fish farm support. The vessel can also be utilised to provide firefighting support to an offshore platform while working in cooperation with manned vessels. Hronn will initially operate as a remotely piloted ship but will transition to fully autonomous operations as the control algorithms are developed concurrently during remotely piloted operations. Unfortunately, as of December 2020 this ship had not been built.

2.6 Marine Surface Robots

Studying the operation of marine surface robots is important for the design of an amphibious robot due to the necessity for an amphibious robot to handle similar water based challenges; amphibious robots can therefore utilize similar design solutions. Issues that marine surface robots, and specifically sail powered robots, face are discussed in a paper by Cruz and Alves. The authors discuss challenges that sail powered robots face, such as power generation, navigation, obstacle avoidance, and data transmission [44]. Functions on sailing robots, such as rudder control while adjusting for factors such as wind and waves, affect all amphibious designs. Algorithms that constantly adjust sail trim to maximize performance and vehicle stability can be utilized to keep a powered robot trim on water [43]. Stelzer and Proell developed an algorithm for autonomous sail trim for sail boat racing conditions [45].

Figure 2.15: Automated Ships Limited and Kongsberg's design of the unmanned, fully-automated offshore vessel Hronn [42].

Researchers from the Institut Français de Recherche pour l'Exploitation de la Mer (IFREMER) and the Department of Technological Development in France developed an autonomous, 2.4 m sailing keelboat with the capability of taking temperature and salinity readings in an ocean environment [46]. The vessel is named VAIMOS, and Figure 2.16 shows the scale of the robot. Although the boat was strictly designed for the water, the robot's ability to track through waves and navigate open water is relevant to amphibious watercraft as well. The VAIMOS autonomous sailboat utilized GPS to locate itself and then used two algorithms to navigate the boat: one for rudder control and the other for sail control [47]. The rudder control algorithm is subordinate to the sail control, since tacking upwind was necessary to achieve all points of sail. The apparent wind direction and strength are taken from a sensor located at the top of the mast, and sail trim is adjusted by a servo motor located within the waterproof hull.

Figure 2.16: VAIMOS Autonomous Robotic Sailboat [43].

In July of 2016 a team from the University of British Columbia (UBC) built the autonomous sailboat ADA, shown in Figure 2.17, and attempted to cross the Atlantic Ocean. Unfortunately, five weeks after leaving Newfoundland, Canada, the robot experienced a failure in the rudder linkage and lost its satellite link 800 km off the coast of Portugal. The robot was able to travel 4,000 km autonomously, reached a top speed of 23 km/h, recovered after a storm-induced loss of power, withstood 100 km/h winds, and survived for over four weeks on the open ocean with its core systems intact [48].

Figure 2.17: ADA UBC Autonomous Robotic Sailboat [48].

The Unmanned Aerial-Aquatic Vehicle designed at the Johns Hopkins Applied Physics Laboratory is a proof-of-concept, remotely operated, flying vehicle that is able to operate above and below the surface of the water as well as fly through the air, as shown in Figure 2.18. The propeller speed while operating in the water is slow but can be accelerated to drive the carbon fibre flying wing into the air. The purpose of the design is to allow rapid relocation in order to take numerous water samples in a short period of time. By flying between water sample locations, the overall time the robot is deployed is reduced [49].

The water surface vessel Kingfisher, designed and built by the Canadian firm Clearpath Robotics, is a marine autonomy research platform, as shown in Figure 2.19. The robot is built as a base platform for universities and research facilities to build upon. The boat has a 24 cm high x 34 cm wide x 7.6 cm deep hold that allows for 10 kg of cargo, which can be used for sensor arrays. It is a catamaran design with a rudderless jet drive, including reverse, in each hull. The boat has an operating temperature range of -10°C to 30°C. Research has already been performed by Ramirez-Serrano, at the University of Calgary, to outfit and program the Kingfisher as a harbour patrol vessel and perform security functions [50].

Figure 2.18: Johns Hopkins Applied Physics Laboratory's Unmanned Aerial-Aquatic Vehicle Flying Fish [49].

Clearpath also has an amphibious robot named Warthog, as shown in Figure 2.20. It is a surface, amphibious, robotic research platform designed for use in mining, agricultural, and environmental monitoring applications. The Warthog is able to travel at 18 km/h on land and 4 km/h on top of the water. The robot is 0.83 m high x 1.38 m wide x 1.52 m long and allows for 272 kg of cargo to be towed with its optional trailer hitch. The body of the Warthog is articulated so that the left and right sides pivot about the centre line that divides port from starboard [52].

Figure 2.19: Clearpath's Marine autonomy research platform [50].

Pliant Energy Systems has developed an undulating winged robot called Velox. This robot utilizes a compliant undulating wing that the company refers to as an Undulating Propulsion System (UPS), as shown in Figure 2.21. The robot induces a wave into a compliant material to produce thrust on land, underwater, and on ice and snow [53]. The Velox robot is currently not autonomous: an operator steers and controls all of the robot's functions remotely through a tether. The wave propulsion on this robot mimics that of a millipede, snake, or cuttlefish. The builder of this robot states that a main benefit of this type of propulsion is its inherent resistance to entanglement and low environmental impact.

2.7 Autonomous Operation

Water covers about two-thirds of the Earth's surface and this medium currently poses significant difficulty for communication with robots. While an amphibious robot is on the surface of the water or on land, it is relatively easy to receive data and give instructions. Autonomy becomes important when communication cannot be maintained and the robot must make decisions on its own. Examples of this are when the robot is in remote locations or when communication is restricted, such as being

underwater.

Figure 2.20: Clearpath's UGV Unmanned Ground Vehicle amphibious research platform Warthog [51].

Yuh has proposed a neural network control system using a recursive adaptation algorithm with a critic function [54]. This is essentially a reinforcement learning approach. The distinct attribute of this controller is that the system adjusts itself directly and online without an exact model of the vehicle dynamics. Yuh tested this controller on the ODIN, as shown in Figure 2.22. This analysis studied the ability of the neural network control system to perform ODIN's heading-keeping control as well as the control and positioning of an arm. This study was performed since most commercial sensors for vehicle position and velocity do not meet the accuracy of the manipulator. Results indicated that this type of control can successfully make coordinated AUV body and manipulator movements autonomously and accurately.

Work performed by the Woods Hole Oceanographic Institution studied the precision

of the current projection of latitude and longitude coordinates on the curved surface of the Earth and the rectilinear coordinates of a Cartesian grid [55]. Errors between internal robot navigation integration errors and the absolute GPS-calculated position of a robot are discussed. The algorithm proposed in this paper, written in the Python language, converts the goal position of the robot into a distance and position from the robot's current position as stated from the origin. This approach removes correction and position repeatability errors at 2 to 4 metre tolerances. The technique proposed in the paper also reduced the effect of GPS variation as the robot approaches the goal set by the operator. Since the algorithm plots a course to the goal from the origin set by the original GPS fix, it is not necessary for the robot to maintain a GPS lock while traversing to the goal. This is imperative for a robot operating in the littoral zone since foliage, waves, and severe robot movement due to the terrain make GPS fixes difficult. This approach to GPS-based goal navigation is utilized for the AAR.

Figure 2.21: UPS Velox [53].
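The origin-referenced goal projection described above can be sketched in Python. The function and constant names below are illustrative assumptions, not the actual code from [55]; an equirectangular approximation is adequate over the short ranges of a littoral-zone traverse.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres (assumed constant)

def goal_offset_m(origin_lat, origin_lon, goal_lat, goal_lon):
    """Project a lat/lon goal into (east, north) metres from the origin
    GPS fix, so the robot can navigate to the goal without holding a
    GPS lock along the way."""
    d_lat = math.radians(goal_lat - origin_lat)
    d_lon = math.radians(goal_lon - origin_lon)
    # Equirectangular approximation: scale longitude by cos(latitude)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    return east, north
```

As a sanity check, a 0.001° change in latitude corresponds to roughly 111 m of northing regardless of longitude.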

Autonomy is needed to make intelligent decisions when considering data from multiple sensors. An absorbing, back-scattering, and colour-distorting medium such as water causes great difficulty in using video images. This is due to the highly non-uniform and multi-directional illumination. This becomes compounded if artificial light is used

because the robot is moving and the artificial light creates its own shadows. Water also confounds sonar-based sensors due to the large amount of background noise from moving marine life, seismic noise, and topography. Deciding what data to accept from multiple types of sensors is important for the robot to make operational choices. This is why there is currently little information on the use of sonar data on an amphibious robot while in or under the water.

Figure 2.22: AUV ODIN [54].

The fuzzy guidance controller is another controller used by autonomous robots. Some of the advantages of a fuzzy guidance controller are that it is relatively easy to develop, simple to tune, and robust to external disturbances. It also does not require a complex mathematical model demanding a high amount of processing power. To test the fuzzy guidance controller, Vaneck of the Massachusetts Institute of Technology (MIT) built a 1.4 m fibreglass tug boat model and tested the system [56]. To keep the algorithm simple, dead reckoning was utilized between GPS position updates. The boat was given a set of waypoints and the logic dictated the heading to steer toward

an "aim point". This point was set ahead of the waypoint on the line aligned with the crossing heading through the waypoint. This allowed the boat to settle onto a course heading before crossing into the arrival circle, which reduced the possibility of the boat overshooting the waypoint. To further simplify the algorithm, the commanded steering angle was reduced to three settings: small, medium, and large rudder angles. The results of the study showed that the controller had no difficulty keeping the boat on a heading and achieving its waypoint targets. The controller resulted in smooth vehicle tracks in calm to medium wave disturbances. It was even able to plan a return approach to a missed waypoint target.
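The aim-point placement and three-level rudder command described above can be sketched as follows. The thresholds, lead distance, and rudder angles are illustrative assumptions, not Vaneck's actual values.

```python
import math

def aim_point(prev_wp, wp, lead):
    """Place the aim point 'lead' metres beyond the waypoint, on the
    line through both waypoints (the crossing heading), so the boat
    settles onto course before entering the arrival circle."""
    dx, dy = wp[0] - prev_wp[0], wp[1] - prev_wp[1]
    dist = math.hypot(dx, dy)
    return (wp[0] + lead * dx / dist, wp[1] + lead * dy / dist)

def rudder_command(heading_err_deg):
    """Coarse three-level rudder: small, medium, or large angle,
    signed toward the heading error (thresholds assumed)."""
    mag = abs(heading_err_deg)
    if mag < 5:
        angle = 2.0    # small rudder
    elif mag < 20:
        angle = 10.0   # medium rudder
    else:
        angle = 25.0   # large rudder
    return math.copysign(angle, heading_err_deg)
```

Quantizing the rudder to three levels is what keeps the fuzzy rule base small enough to run with minimal processing power.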

The company AImotive fabricates systems that are added to an existing automobile platform to enable it to become autonomous [57]. The company's primary product is centred around the ability to use camera images as the primary input for its recognition algorithm, which recognizes more than 100 different object classes including passenger cars, bicycles, motorcycles, animals, drivable road surfaces, and pedestrians. Similar research into autonomous vehicles is being done by the General Motors Regional Technology Centre in Oshawa, Ontario. Further investment into autonomous system technology in 2020 included the addition of an autonomous system design centre in Markham, Ontario, Canada to house an additional 300 engineers. This group's work on adding autonomous operation to existing General Motors vehicles has produced autonomous driving features such as Ultra Cruise [58]. Part of this investment was also the purchase of the existing autonomous vehicle technology company Cruise for more than $1 billion USD in March, 2016 [59]. Cruise Automation is testing their autonomous systems on a fleet of General Motors electric vehicles in San Francisco, California [60].

The Comma.ai company produces an Artificial Intelligence (AI) system for self-driving

cars. The main difference that Comma.ai feels their company offers over the competition is that their algorithm learns to mimic specific driver behaviours and plans maneuvers by simulating future events in the vehicle's path [61]. The Comma.ai algorithm samples driving styles preferred by the human driver who purchases the system and adds them to the learning base of the system. In this way the individual driver feels more comfortable relinquishing control of the vehicle because the Comma.ai system will continue to drive the car in a fashion similar to that which the driver is accustomed to. Unfortunately, Comma.ai had difficulty licensing the product with the National Highway Traffic Safety Administration (NHTSA) in the United States. The NHTSA is enforcing the law in California that requires that an autonomous vehicle not operate on a public road unless the owner/operator holds a $5 million insurance policy and proof of a training program for anyone operating such a vehicle. In an attempt to circumvent this law, Comma.ai made the source code for their algorithm open source on December 2, 2016 [62], allowing the general public to set up their own autonomous systems for their vehicles using the Comma.ai software. The Comma.ai algorithm gives self-driving capabilities similar to the self-driving functions of the Tesla Model S and Model X cars [63].

2.8 Multi-Axle Suspension

The terrain that an amphibious robot must traverse can be varied. This includes, but is not limited to: water, mud, gravel, sand, large rocks, and other varied natural and human-made obstacles. The National Aeronautics and Space Administration (NASA) designed its Mars rovers for similar terrain. It specified that the Spirit and Opportunity rovers must be designed to traverse obstacles of at least the diameter of their wheels, sloped terrain of at least 20° of tilt, and operate in soft deformable soils [64].

On March 22, 2010, Spirit stopped communicating with Earth after it got stuck in a sand trap [65]. Following the loss of the Spirit rover, engineers at the Georgia Institute of Technology [66] proposed elaborate combinations of motions utilizing the existing suspension and drive capabilities of the Spirit rover. The movements are based on a combination of swimming, walking, and rolling. These movements could help future rovers avoid a similar fate to Spirit and possibly allow them to traverse even more challenging terrains instead of navigating around them.

The solution to the design specifications that the JPL and the California Institute of Technology came up with was an autonomous robot with a six wheel drive, four wheel steering, and rocker bogie suspension system as shown in Figure 2.23.

Figure 2.23: Mars rover Spirit [64].

Figure 2.24: Bogie suspension system for the Mars rover design [67].

The rocker bogie suspension system was invented by Don Bickler at JPL. The key design points that have made the rocker bogie suspension system successful are: articulated joints that allow six contact points over uneven terrain; independent steering for turns within the robot's length; a centre of mass near the pivot point of the rocker bogie; a fully passive suspension; and the ability to withstand 45° inclinations in any direction without overturning. This design allows for the traversing of obstacles as large as the diameter of the wheels, which is 25 cm. The robot can be steered by articulating the forward and rear wheels or by tank-style skid steering [67]. In order to save weight, the rovers were designed without steering actuators to pivot the forward and rear wheels, using simply a locking mechanism. With the lock un-pinned, the wheel is turned forward or in reverse to achieve the angle required to turn. Once the wheel angle is achieved, the pin secures the angle and the rover is then driven into the turn. Although cumbersome and time consuming to turn, this design is simple, light, and requires fewer drives that could eventually fail.

A robot utilizing only skid steering would further simplify this design by eliminating the weight of the pin and locking mechanism. Some turning radius and turning accuracy would be lost, but the overall weight savings and simplicity of the design would be gained. The rocker bogie suspension, as shown in Figure 2.24, enables the vehicle to passively keep all six wheels on the ground at all times. There are two key advantages that this system has over other suspension systems. The first advantage is that all six wheels carry the same amount of the robot's weight at all times. This is important while traversing soft terrain: each wheel takes only one sixth of the weight at any given time, which reduces the chance of the robot sinking into the ground. The second advantage is that while the robot is climbing over hard, uneven terrain, all six wheels propel the robot, which increases the chance of multiple wheels getting a good grip. The rocker bogie suspension system designed by JPL was calibrated to ride somewhere between a luxury vehicle and a passenger pickup truck. This translated to an impact load of no more than 6 G's and a suspension that does not let the robot body deflect below a 20 cm ground clearance. The clearance was set to this minimum distance so that sensor arrays on the robot would not come in contact with the Martian surface.

The structural members that make up the bogies are tapered, welded, titanium box beams, as shown in Figure 2.25. This material was chosen for its high strength to weight ratio, impact resistance, weldability, and exceptional bending and torsional capability. Being able to weld the material allowed the box beams to be shaped and tapered to optimize the weight of the structure throughout its length. A box beam design is a mass efficient geometry for structures that are subjected to both bending and torsional loads [64].

In order to keep the main body of the Spirit and Opportunity rovers as level as possible, while climbing obstacles, a differential system was devised. The robot’s

body is attached to the left and right bogie assemblies through a geared differential mechanism. The differential is composed of two sets of gearing: one set of gears is planetary and the other is configured as a star gear. This gear configuration gives opposite-hand rotations. The two gear assemblies are tied together utilizing a hollow titanium torque tube. Therefore, when the right side suspension goes up, the left side suspension is pushed down. This induces an equilibrium of pressure on the six wheels, which is advantageous on soft ground. The larger Curiosity rover utilizes an overhead torsion bar differential design. The different differential systems on the Spirit/Opportunity and Curiosity rovers were required because of the solar panel array interference on Spirit and Opportunity. The torsion bar differential design has fewer moving parts and employs a simpler mechanical design.

Figure 2.25: Rocker Bogie Assembly designed by Don Bickler at JPL [64].

There are six identical drive motors for the rovers: in-hub, brushless, permanent magnet, 28 Volt DC motors with a built-in encoder to determine wheel

odometry. Each motor has a planetary gearbox with a 1,500:1 ratio. Under full load the motors draw a current of 2 Amps and produce about 90 N·m of torque at a wheel speed of 2 RPM. This in turn produces a speed of 2.6 cm/s, and the motors can operate down to -55°C. The Spirit and Opportunity rovers are powered by solar panels while the Curiosity rover is powered by a radioisotope thermoelectric power system. The drive package on the Spirit and Opportunity rovers allows for climbing 25 cm obstacles located on a 20° upward slope at an approach angle of 45°. The location of the drive motors allows the rover to keep its centre of gravity very low.
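The stated drive figures are self-consistent: a 25 cm wheel turning at 2 RPM covers one wheel circumference per half minute, which reproduces the quoted ground speed. A quick check:

```python
import math

WHEEL_DIAMETER_M = 0.25  # Spirit/Opportunity wheel diameter
WHEEL_RPM = 2            # full-load wheel speed

# Ground speed = wheel circumference x revolutions per second
speed_m_s = math.pi * WHEEL_DIAMETER_M * WHEEL_RPM / 60
print(f"{speed_m_s * 100:.1f} cm/s")  # ~2.6 cm/s, matching the stated figure
```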

The vehicle's centre of mass is very close to the pivot point of the rocker-bogie suspension. This positioning is what allows the vehicle to be tilted up to 45° in any direction without overturning. To safeguard the mission objective of collecting data from Mars for as long as possible, the rovers are not subjected to angles greater than 30° [64]. One of the compromises of the Mars rover designs is that they use relatively smooth, small diameter wheels. Although the rovers perform risk-averse missions, in more than one situation the rovers experienced almost 100% slip, which left them essentially immobile. Cautious maneuvering of the four outer wheels and their independent drive trains eventually positioned the rovers out of the loose soil. The small diameter, relatively smooth, aluminum wheels were chosen for their abrasion resistance to rocks and their ability to fit within the Mars lander.
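The 45° tilt limit follows from the centre-of-mass geometry: statically, the vehicle tips when the gravity vector through the centre of mass passes outside the wheel contact line, giving tan θ = (half-track)/(CoM height). A sketch with illustrative dimensions (not JPL's actual figures):

```python
import math

def static_tipover_deg(half_track_m, com_height_m):
    """Static tip-over angle of a rigid vehicle on a slope:
    tan(theta) = half-track / centre-of-mass height."""
    return math.degrees(math.atan2(half_track_m, com_height_m))

# A half-track equal to the CoM height yields the 45-degree limit
print(static_tipover_deg(1.0, 1.0))  # 45.0
```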

2.9 Literature Review Findings

As seen in the multiple variations of amphibious mobile robotic systems presented in this literature review, it is apparent that the amphibious robot is still in its infancy when compared to terrestrial or aerial robotic systems. Based on this literature

review, the limitations for amphibious robots seem to lie with the lack of accurate location data. This, in part, is due to the shortage of appropriate sensors for localization. Work on hybrid sensor arrays, multiple locomotion techniques, and advanced suspension systems has advanced the field of amphibious robots in recent years. Another major limitation is the ability to safely traverse the littoral zone. Increases in funding have opened up new possibilities for research, but further work is needed to make the amphibious robot as viable as terrestrial or aerial robots. It is clear that an amphibious robot capable of traversing the littoral zone and operating equally well on land and on water in an autonomous manner would be a major contribution to the robotics field.

Chapter 3

Littoral Zone Path Planning Algorithm

To find the optimal path out of or into the water, it is proposed to use an overhead image of the current littoral area. Since the littoral zone is dynamic, a human will choose an approved exit and entry area, which is then compared to the current situational status. A visual overhead reassessment of the area, performed while the robot is at the staging waypoint located just outside of the littoral zone, allows any debris such as flotsam and jetsam to be addressed and categorized as an obstacle or a surmountable object a priori, before the robot enters the zone.

The proposed algorithm identifies obstacles based on a series of adjustable variables. These variables allow the algorithm to accommodate the parameters of any amphibious robot configuration. For the purpose of this work, the algorithm variables are calculated to suit the AAR's size, obstacle surmounting abilities, and dexterity. The algorithm calculates the most favourable navigational path from the staging area, either in the water or on land. This staging waypoint for the robot is selected by a human and will be the entry or exit starting waypoint for a path through the littoral zone to the goal as

Figure 3.1: Schematic of the littoral zone and the position of the staging and goal waypoints.

shown in Figure 3.1. An optimum path is then calculated by the algorithm from the staging point to the goal, taking into account perceived obstacles. The active LiDAR detects obstacles that the navigation algorithm will plot course corrections around. The LiDAR is mounted on the robot to account for moving objects, such as humans or animals, that were not captured in the overhead image. The base path into and out of the water is calculated with computer vision from an overhead image of the littoral zone. This overhead image can be taken from multiple vantage positions, such as the top of a deployable pole mounted on the robot or from a flying drone launched from the AAR. For the purpose of this research, the images are captured utilizing a quad-copter flown approximately 15 m directly above the AAR while at the staging waypoint. The image should be captured moments before the navigational path decision is made to ensure successful navigation through

the littoral zone. How the image is obtained is outside the scope of this work; however, a landing pad for a quad-copter was mounted on the top of the AAR and doubles as the AR code used for the AAR localization in the image, as discussed below.

Figure 3.2: Algorithm flow chart.

The proposed path planning algorithm, shown in Figure 3.2, utilizes OpenCV, an open source computer vision library accessed here through its Python bindings. The proposed obstacle algorithm identifies perceived obstacles when compared to a planned route that allows a clear path into or out of the littoral zone for the robot. A picture taken from overhead of the chosen littoral area

will be captured in ideal conditions, just ahead of the AAR mission time. Ideal conditions for a littoral zone would be a sunny, clear day at noon with little to no wind or waves. A human will select a waypoint location just outside the littoral zone as the goal for the robot. In ideal conditions this will be a path that the robot can traverse in a straight run. The current state will be compared to the ideal path by the obstacle identification algorithm to ascertain if additional obstacles have obscured the ideal path and if a new path must be generated around the new obstacle(s). The sensors necessary for the algorithm are LiDAR, GPS, wheel encoders, and a water immersion sensor. This sensor suite allows the algorithm to move an amphibious robot through the littoral zone.
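The ideal-path comparison above — is the straight run from the staging waypoint to the goal still clear? — can be sketched as a Bresenham line walk over an occupancy grid. The grid encoding (0 free, non-zero blocked) and function name are my assumptions, not the thesis implementation.

```python
def straight_path_clear(grid, start, goal):
    """Walk the Bresenham line from start to goal over an occupancy
    grid; return False if any cell on the ideal straight run is
    occupied (obstacle or inflation zone), else True."""
    (x0, y0), (x1, y1) = start, goal
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        if grid[y0][x0] != 0:
            return False          # ideal path is obscured
        if (x0, y0) == (x1, y1):
            return True           # reached the goal unobstructed
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
```

When this check fails, the algorithm must generate a new path around the new obstacle(s).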

Figure 3.3: AR code used to identify the pose of the robot.

The obstacle identification algorithm identifies and plots the robot’s position from the overhead image and then identifies and plots the location of the perceived obstacles as well as their relationship to the position of the robot as shown in Figure 3.2. The algorithm then plots a path out of or into the water. In order to identify the AAR and the pose of the vehicle in the overhead image the algorithm utilizes an AR code

Figure 3.4: AR code shown on top of the AAR robot. The code also functions as a launching and landing pad for the quad-copter.

mounted on the top of the AAR. This AR code can be seen in Figure 3.3. The AR code can be easily identified in overhead images, as seen in Figure 3.4. As stated earlier, this AR code is also used as the landing/launching pad for the quad-copter. The ROS single board node was utilized to identify the AR code and its pose in the image. This AR code location algorithm allows the location and pose of the robot to be found in relation to the image taken by the overhead drone. The ROS single board node is designed to identify and give the pose of an AR code from video. To enable this node to function from a single overhead image, an algorithm was written to loop the image into a video. This video allows the single board node to publish the AR

code's pose in the image as shown in Figure 3.5.

Figure 3.5: Pioneer 3-DX robot with AR code mounted on top.

Figure 3.6: Keypoints node identification and size classification.

Not all objects in the littoral zone are obstacles to the AAR. The substantial suspension on the AAR is able to traverse items that are up to 25 cm in size. Items larger than this are identified by the algorithm as obstacles for the AAR and are treated as hazards to navigation. Any obstacle that poses a threat to the AAR accomplishing its given navigation goal is identified within the algorithm utilizing the OpenCV library

node Keypoints. The Keypoints ROS node identifies each Keypoints-identified obstacle with a red circle. The circle captures the approximate location of the obstacle, and the area within the circle is approximately the area of the obstacle, as shown in Figure 3.6.

Figure 3.7: Identification and location of the obstacles by the algorithm.

Testing the algorithm utilizing the OpenCV library node Keypoints, with the obstacles identified by a red circle, indicated that the geometry of the obstacle was lost, creating the possibility of the robot coming into contact with an extremity of the object. An example of this can be seen with the Y-shaped driftwood log located just left of centre of the image in Figure 3.6. The log in Figure 3.6 is identified with a large circle on the left section and a small identifier circle on the right portion. This example shows the area of the log between the two circles as the inherent problem with using this OpenCV node. Although the node correctly identified the obstacle as an obstacle, it incorrectly indicated that there was a navigable area between the two circles.

To correct this issue of giving a false positive, an algorithm was created to identify each Keypoints-identified obstacle and then line trace that area of the image. The proposed algorithm line traces all of the obstacles identified in the image and plots their coordinates and line-traced shapes onto the occupancy grid that the AAR creates

as seen in Figure 3.7. An inflation zone is also created to give additional space around the coordinates of the identified obstacles so that a suitable navigation window is presented for the robot, as indicated by the gray clouds around the red obstacles shown in Figure 3.8. The robot uses the path given by the algorithm as the centre line to follow. The algorithm path is optimized to give the greatest distance between obstacles. PID control is used to smooth the motion of the robot while following this path and to prevent position hunting. Testing with the AAR showed that it was able to maintain its position within 10 cm of the path proposed by the algorithm.
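The trace-then-inflate pipeline described above can be sketched as a toy example: connected obstacle pixels are grouped into footprints (avoiding the disjoint-circle problem of Keypoints) and then dilated into an inflation zone. Grid values, connectivity, and radius are illustrative assumptions, not the AAR implementation.

```python
from collections import deque

def trace_obstacles(mask):
    """Group obstacle cells (1s) into 4-connected components so each
    obstacle keeps its true footprint rather than a bounding circle."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                blob, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                blobs.append(blob)
    return blobs

def inflate(mask, radius):
    """Mark free cells within 'radius' (Chebyshev distance) of any
    obstacle as 2, creating the clearance band around the footprints."""
    rows, cols = len(mask), len(mask[0])
    out = [row[:] for row in mask]
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] == 1:
                for dr in range(-radius, radius + 1):
                    for dc in range(-radius, radius + 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols \
                                and out[rr][cc] == 0:
                            out[rr][cc] = 2
    return out
```

Because the footprint is traced cell by cell, a Y-shaped log becomes one component and no false gap between its branches is reported as navigable.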

Figure 3.8: Inflation zones in dark gray shown around the obstacles identified in blue and green. The goal pose is shown with the green arrow and the calculated path through the obstacle field is shown in purple.

The algorithm uses a 2D image for the overhead view instead of video. This minimizes the computational time needed to evaluate the littoral zone for obstacles. In future work, with a larger on-board CPU, a live video feed could potentially be used. The LiDAR scan is taken in 2D in front of the robot. The refresh rate of the LiDAR is 20 to 25 Hz. This refresh speed is fast enough to capture the movements of items found in a natural setting. Using a 3D point cloud scan of the environment generated

by mounting the LiDAR on a nodding head was considered. It was decided that the high amount of data produced in a 3D scan, and the time needed to process this data, would require the AAR to be driven at a slower speed in order to give the system time to process all of the data. On land the LiDAR is able to detect obstacles taller than 1.12 m above the ground. When floating in the water the LiDAR scan is taken 0.48 m above the water. This is still high enough to detect most obstacles floating in the water that pose a problem for the navigation of the AAR. A second complication of using 3D data as opposed to a 2D LiDAR scan is the complexity of changing the height of the LiDAR scan plane in the algorithm when the AAR transitions from land to water or vice versa.

3.1 Preliminary Testing

The AAR robotic platform is too large to test inside the Ontario Tech MARS Laboratory. For the algorithm to be tested without having to set up the AAR in a littoral zone, a simulation needed to be designed. The simulation required all of the functionality of the larger AAR system, including a scale robotic platform that would perform in the Ontario Tech MARS Laboratory, a scale AR code mounted on top of the robot that could be seen from above, and a LiDAR mounted on the front of the robot. The chosen scaled robotic platform for the simulation was the Pioneer 3-DX robot, as seen in Figure 3.9. The Pioneer 3-DX is a learning platform designed and built by parent company OMRON. Mounted externally on the Pioneer 3-DX is a SICK LMS 200 laser rangefinder LiDAR. The LiDAR was mounted at the front of the Pioneer 3-DX with an unobstructed 270 degree view, replicating the setup on the AAR. An AR code was also mounted above the robot and was scaled so that the proportions were representative of

the dimensions of the Pioneer 3-DX. The Pioneer 3-DX dimensions were incorporated into the algorithm so that it would perform appropriately in the preliminary testing. This setup allowed for testing and adjusting of the algorithm without the difficulty of setting up the larger AAR robot outside, which allowed for faster setup and testing of the algorithm. The reduced planning and set up of the system allowed for a shorter period between algorithm changes and quicker development.

Figure 3.9: Pioneer 3-DX Robot with SICK LiDAR mounted on top.

An industrial mini-ITX motherboard computer running Linux with the ROS Indigo package was used as the on-board computer, as seen in Figure 3.10. The computer runs on a Celeron G1850 (2.9 GHz, 2C, L3:2M, GT1, 53W) chip set. The mini-ITX computer has outputs for three independent VGA displays, two SATA 6 Gb/s ports,

and two USB ports. This computer was selected because it is able to run on a 12 volt DC battery supply, has no fan or moving parts, is silent, rugged, shock resistant, produces little heat, and is energy efficient. While running the visualization tool the first full image was produced, as seen in Figure 3.11.

Figure 3.10: AAR mini-ITX computer [68].

To test the algorithm, the Pioneer 3-DX robot was operated in a simulated environment set up in the MARS Laboratory. A complex environment was presented to the LiDAR, as seen in Figure 3.12. In Figure 3.12, a rudimentary outline of the Pioneer 3-DX robot can be seen in the centre of the image, outlined in green. A simulated littoral zone strewn with obstacles that can be viewed by the LiDAR and obstacles that fall below the LiDAR's plane of view was set up, as seen on the left of Figure 3.13. An environment with bold colours on the floor was selected to mimic the shoreline of a littoral zone. A map was then created with the algorithm. The algorithm identified the obstacles in the image whose size would create difficulty for the Pioneer 3-DX. The algorithm then performed a line tracing of the identified obstacles and this map was overlaid onto the LiDAR image, as seen on the right side of Figure 3.13. Once the map and LiDAR data were overlaid, a path for the Pioneer 3-DX was created through the obstacles, shown by the purple line in Figure 3.13. This path was generated utilizing the gmapping package from ROS. Further testing of the robot using an overhead image and a LiDAR scan can be seen in Figure 3.14.

Figure 3.11: Image utilizing known position of Pioneer 3-DX robot in office with open door.

3.2 Image Testing

To test the algorithm's robustness, many littoral zone overhead images were required from around Canada and the world. This involved the testing of over a hundred overhead images to identify obstacles that would pose a threat to the AAR. As the AAR was initially designed for Canadian operation, 44% of all the littoral zone images were of Canadian shorelines and, specifically, 52% of those images were of Northern Ontario, as seen in Figure 3.15. To further test the robustness of the algorithm, and to ensure that the algorithm did not have biases toward the structure of Canadian littoral zones, the remaining 56% of littoral zone images were collected from a cross section of varied shorelines from around the world, as seen in Figure 3.16. In this way the algorithm could be tested against variations such as shoreline structure, rock formations,

water colour, vegetation, and non-organic structures from a variety of locations around the world.

Figure 3.12: Mapping of an office environment with the Pioneer 3-DX shown and inflation zones around the perceived obstacles.

In order to check the robustness of the algorithm, each littoral zone image was graded in a standardized format. This grading was based upon a four-digit scoring criterion, with each digit ranked from one to five. The first digit of the four-digit score reflects the number of obstacles on land that the algorithm failed to line trace (missed detections). A score of five indicates that no obstacles were missed, while a score of one indicates that more than five obstacles were not identified when they should have been. The second digit reflects the number of obstacles on water that the algorithm failed to line trace, scored in the same way. The third digit is the perceived quality of the littoral

zone for the AAR to traverse over land to the water, or vice versa, easily. A score of five indicates that the quality of the littoral zone is excellent and a score of one indicates that the area is poor for a robot to traverse through. The fourth digit scores the perceived quality of the image, with five indicating the image is clear and one grading the image as poor or out of focus. An image with a result of 5555 represents a perfect score, as seen in Figure 3.17.

Figure 3.13: An image of the obstacle field in front of the Pioneer 3-DX. Only the obstacle on the left can be imaged with the LiDAR. The map made from the obstacles identified and line traced is shown on the right.

Given the ability of the AAR to overcome obstacles on land and in the water, and the high safety factor set within the algorithm, an acceptable littoral zone was defined as one with no more than two misidentified obstacles on land or in the water. With the pass/fail criteria set at this level, the littoral overhead images from Canada had a 93% pass rate, as seen in Appendix A.1, and the images from the remainder of the world had a 79% pass rate, as seen in Appendix A.2.
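The four-digit grading and the pass/fail threshold can be expressed compactly. The linear mapping from miss count to digit between the two stated endpoints (zero missed scores 5, more than five missed scores 1) is my assumption:

```python
def miss_digit(missed):
    """Map a missed-obstacle count to a 1-5 digit: zero missed scores
    5; each additional miss drops one digit, floored at 1 (linear
    interpolation between the stated endpoints is assumed)."""
    return max(1, 5 - missed)

def grade(land_missed, water_missed, zone_quality, image_quality):
    """Compose the four-digit littoral zone score, e.g. '5555'."""
    return (f"{miss_digit(land_missed)}{miss_digit(water_missed)}"
            f"{zone_quality}{image_quality}")

def passes(score):
    """Pass if no more than two obstacles were missed on land and on
    water, i.e. the first two digits are each 3 or higher."""
    return score[0] >= "3" and score[1] >= "3"
```

For example, an image with two land misses and one water miss over a good, clear zone grades as "3445" and still passes.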

Figure 3.14: Pioneer 3-DX robot navigating through pylons in the MARS Laboratory.

3.3 Summary

The testing performed with the Pioneer 3-DX allowed effective development of the algorithm to take place. The use of the Pioneer 3-DX significantly reduced the amount of trial time that would have been required in the field. The data gathered from the preliminary testing was applied to the AAR and tested. The image testing of the algorithm verified its effectiveness for a wide range of littoral zone images.

Figure 3.15: Littoral zone quad-copter image and the associated algorithm line tracing from Wawa, Ontario, Canada. Image by Shutterstock.com.

Figure 3.16: Littoral zone quad-copter image and the associated algorithm line tracing over Lake Retba, Senegal. Image by Shutterstock.com.

Figure 3.17: A grading of 5555.

Chapter 4

Prototype System

To properly test the littoral zone path planning algorithm developed, an amphibious robot platform needed to be designed and built. John Krafcik, CEO of Google's autonomous car company Waymo, stated the following at the 2017 Detroit Auto Show: “It was the eminently-quotable computer scientist Alan Kay who said, 'people who are really serious about software should make their own hardware'” [69]. Incorporating the engineering requirements needed for this testing platform and designing a system that accomplishes this objective was required for this research. The mere ability to function on land as well as above and/or below water is not necessarily what makes a good amphibious robot design. The goal for this robot was to develop a platform able to make decisions, map, and travel through littoral areas.

The robot operator will lay out a course of multiple waypoints for the robot to follow through the littoral area ahead of mission time. This will be accomplished moments before the mission by capturing an overhead image of the littoral zone using a drone. While navigating between waypoints the robot will perform obstacle avoidance maneuvers when necessary. These obstacle avoidance maneuvers

will be based on decisions such as whether to mount an obstacle or to change course and attempt to circumvent it. When a course change is made to avoid an obstacle identified as a hazard, the robot will attempt to get back on course and stay as close to the planned route as possible. This minimizes unnecessary movement, conserves energy, and reduces mission length. The objective was that the hardware and software for the AAR be interchangeable with other robots and amphibious mobile systems. Conducting mapping and movement through the littoral zone is currently a costly, time consuming, and labour intensive activity.

The developed system must be able to navigate through the littoral zone, be environmentally friendly, be robust enough to operate in natural environments, and be able to operate in water. The littoral navigation system must also allow remote manual control when required. This allows an operator to suspend the autonomy of the system and reposition or reset the robot as needed, a specification especially important in initial set-up and recovery situations. The system must be able to consider the mobile robot it is controlling, including variables such as overall dimensions and climbing capabilities, so that it can make appropriate decisions. Finally, the robot must be supplied with an easily accessible emergency stop device so that the robot can be shut down in the field if it poses a hazard to humans or itself.

4.1 Functional Requirements

The design will accommodate terrain consistent with that of Ontario. Typical topography of this area includes rock formations, sand dunes, loose gravel, moss/lichen

ground cover, tree roots, long grass, swamp/bog, and open water. The robot must be able to gauge and avoid obstructions that are too large to surmount, as discussed in Chapter 3. The suspension of the robot must be robust enough to survive multiple impacts with debris of at least four times the force of gravity. The robot must be able to run on its own power for a minimum of four hours per day. It must also move efficiently both on land and in the water. All on-board electronics must be waterproof and/or contained within a waterproof enclosure. The unit must be able to sense when it is floating on water and be able to handle 61 cm waves. The robot's waterline when fully loaded must not be higher than the deck of the bow and stern sections of the floatation. The turning radius on land will be the robot's length and in the water it will be within 12 m. The robot must be able to traverse through loose sand/soil and mud. The robot must be able to send and receive communication wirelessly. Finally, the robot must navigate autonomously while steering around obstacles and avoiding collisions.
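The requirement that the navigation system consider the robot it controls can be captured as a simple parameter record. The sketch below is illustrative rather than the thesis software: the class and field names are invented, the numeric values are taken from the requirements in this chapter where stated, and the 1.5 m length is a hypothetical placeholder.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RobotParameters:
    """Vehicle properties the littoral navigation system must consider."""
    length_m: float               # overall length; land turning radius equals this
    max_obstacle_height_m: float  # largest obstacle the suspension can surmount
    ground_clearance_m: float     # minimum undercarriage clearance
    water_turn_radius_m: float    # turning radius when under jet drive
    max_wave_height_m: float      # largest wave the unit must handle afloat
    endurance_h: float            # minimum runtime on its own power per day

    def can_mount(self, obstacle_height_m: float) -> bool:
        """Decide whether to climb an obstacle or plan around it."""
        return obstacle_height_m <= self.max_obstacle_height_m

# Sample values drawn from the functional and physical requirements;
# length_m is a hypothetical placeholder.
aar = RobotParameters(length_m=1.5, max_obstacle_height_m=0.33,
                      ground_clearance_m=0.49, water_turn_radius_m=12.0,
                      max_wave_height_m=0.61, endurance_h=4.0)
print(aar.can_mount(0.25))  # True: a 25 cm obstacle can be mounted
```

A record like this lets the same navigation stack serve other amphibious platforms, in line with the interchangeability objective stated earlier.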

4.2 Physical Requirements

The robot must be able to traverse obstacles as large as 33 cm in height and have a minimum clearance of 49 cm. The suspension must allow for climbing 25 cm obstacles located on a 20° upward slope at an approach angle of 45°. The vehicle's centre of mass must allow the robot to be tilted in any direction by up to 45° without overturning. All robot systems, including the drive train, suspension, chassis, external sensors, waterproof enclosures, servos, actuators, motors, antennae, linkages, gear assemblies, water propulsion units, and floatation, must be able to sustain continuous underwater immersion for a minimum of one hour.
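The 45° no-overturn requirement ties the centre-of-mass height to the wheel track: on a static tilt, tipping begins when the tilt angle exceeds arctan(half-track width / CoM height). A quick check of this relationship, using hypothetical dimensions rather than the AAR's actual measurements:

```python
import math

def static_tipover_angle_deg(half_track_m: float, com_height_m: float) -> float:
    """Tilt angle at which the centre of mass passes over the downhill
    wheel contact line, assuming a rigid vehicle on a static slope."""
    return math.degrees(math.atan2(half_track_m, com_height_m))

# Hypothetical dimensions: 0.5 m half-track, CoM 0.4 m above ground.
angle = static_tipover_angle_deg(0.5, 0.4)
print(f"{angle:.1f} degrees")  # must exceed 45 to meet the requirement
```

In other words, the requirement is met whenever the centre of mass sits no higher above the ground than half the track width, which drives the low, wide layout seen in the later concepts.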

The concept generation of this project began with an in-depth literature review into existing designs in the fields of amphibious mechanisms, autonomous robotics, underwater vessels, land-based off-road systems, different forms of locomotion, navigation systems, sensors, suspension, and different types of drive systems. The engineering specifications were correlated with the above requirements to ensure that the final design met all requirements. With this preliminary work completed, the design process became a living, organic, decision making process. Changes from the first to the last iteration show a gradual evolution towards the final chosen design.

4.3 First Concept

For the next stage of the project, a novel water propulsion drive system was utilized and tested in a robotic boat. The robot's propulsion system was based on a model of the Voith Schneider Propeller [70], as shown in Figure 4.1. The first iteration of the boat, built to house the propulsion system, was fabricated out of Plexiglas©. This was done to allow the study of the drives in action and to monitor the fluid dynamics. This design was named the Borg-Voith One, as shown in Figure 4.2. Unfortunately, the initial layout had too low a waterline and quickly took on water when thrust was initiated. Testing showed that when thrust commands were given remotely, the unit pitched forward and lowered the gunnel of the vessel below the waterline. The components became wet and quickly had to be dried out to avoid electrically shorting the system.

Figure 4.1: Voith-Schneider propeller [70].

4.4 Second Concept

The second concept of the Borg-Voith design was called Borg-Voith Two and utilized the high sides and lid of a commercially available, semi-opaque tote that was modified to allow the twin drives to be mounted, as shown in Figure 4.3. This second iteration of the initial Voith propulsion design was watertight and allowed a considerable amount of research to be performed on the capabilities of this type of drive system and its ability to propel an amphibious robot through the water. In the new housing the positive aspects of the robot were obvious: it was incredibly agile and able to make slight course corrections to maintain a station-keeping position. The gyroscopic hover capability was very effective against current and wave action.

Figure 4.2: Prototype Borg-Voith One with twin Voith-Schneider propellers.

After the data was examined, it was found that this type of drive has three critical shortcomings for an amphibious robot. The first weakness of the Voith Schneider Propeller is its sensitivity to surface and submerged debris, such as flotsam and jetsam, fouling the propellers. Debris that is elongated, such as weeds or grass, has a tendency to become entangled around the spinning Borg-Voith Two rudders. The propellers perform best in unobstructed, open water, which, for an amphibious robot, is unlikely to occur. The key to an amphibious system is that it must be flexible in multiple mediums and environments. This type of drive is not flexible because it entangles easily and is nearly impossible to recover without human intervention. The second limitation of the Voith Schneider Propeller mounted on an amphibious robot was that a unit scaled for this type of robot would need to have an

unobstructed draft of 76 cm. This deep draft would severely limit the use of both the land-based drive and the Voith Schneider Propeller at the same time when entering or exiting the littoral zone. The loss of the ability to use both land and water drive systems simultaneously would make entering and exiting the littoral zone difficult. The third limitation of this drive is the high amount of energy needed to constantly spin the rudders. Due to these concerns, the Voith Schneider Propeller was not considered further for this project.

Figure 4.3: Prototype Borg-Voith Two with higher gunnel.

4.5 Third Concept

The next design iteration was the first AAR prototype, as shown in Figure 4.4. It was designed by the 2013/2014 Ontario Tech Capstone Design Team and built in collaboration with the author of this dissertation [71]. The philosophy of this design focused on the importance of floatation through the use of a displacement-type mono hull. This design has four arms on which the wheels were mounted.

Figure 4.4: Designed and built by the 2013/2014 Ontario Tech Capstone Design Team [71].

Suspension, drive motors, navigation, power generation, the hydrogen tank, electronics, and communication were all housed inside the open hull. The hydrogen gas was used to fuel an on-board fuel cell with a 200 W output, selected to charge the on-board battery system during rest periods when the robot was not in motion. The complexity of the drive/suspension system, the weight of a suspension fabricated out of carbon steel, the use of aluminum for drive shafts, and the lack of correct tolerances translated into poor performance during field tests and ultimately limited the testing of this platform.

4.6 Fourth Concept

The next design model has wheels mounted on a legged suspension system that allows the legs to be raised flush with the electrical storage container used as floatation. The first Computer Aided Design (CAD) layout has the tires used as floatation, as shown in Figure 4.5. Each wheel adds 7 kg of floatation [72], thus giving the design 28 kg

of floatation. The design incorporates a mechanism to raise the wheels level with the body, allowing the centre of buoyancy to be lowered for stability in water. A difficulty with this design was the complexity that such a wheel raising mechanism and suspension system would create. An additional problem with this type of suspension system is the difficulty of deploying and recovering the wheels while entering and exiting the water.

4.7 Fifth Concept

Figure 4.5: Pivoting suspension system for adjustable centre of buoyancy.

The first design of a deployable float, as shown in Figure 4.6, has the floatation extend down to the waterline mechanically, adding to the floatation and stability the tires already provide. The basis of this design was to add floatation without compromising the suspension system. The main concern with this design was the loss of clearance between the ground and the robot's undercarriage while the float is deployed.

Figure 4.6: Amphibious robot design with deployable rigid floatation.

4.8 Sixth Concept

The second layout of the deployable floatation design theme consists of a torpedo-shaped float located down the centre line of the robot. This float lowers pneumatically down to the waterline to assist the floatation of the wheels, as shown in Figure 4.7.

Figure 4.7: Amphibious robot design with deployable torpedo-shaped floatation.

The disadvantage of the amphibious robot design shown in Figure 4.7, with the deployable torpedo-shaped floatation, is that the system requires additional mechanical systems to actuate the torpedo floats. The other disadvantage of this system is the reduction of clearance under the robot.

4.9 Seventh Concept

Figure 4.8: Lego© bogie suspension test model built by Dr. Scott Nokleby [73].

The concept of the rocker bogie suspension currently used on the Mars rovers was discussed with Dr. Scott Nokleby and the 2014/2015 Ontario Tech Capstone Design Team as a possible suspension candidate. To test the validity of this design for an amphibious robot, Dr. Nokleby built a test model out of Lego©, as shown in Figure 4.8. The flotation for this design utilizes a pair of saddlebag-style, deployable, torpedo-shaped floatation outriggers, as shown in Figure 4.9. The main advantages of this design are the sizable amount of stability of the robot when in the water and the significant amount of floatation generated by the dual pontoons. These two benefits are achieved without compromising the ground clearance

Figure 4.9: Rocker bogie suspension with torpedo shaped floatation.

on land or the overall footprint, since the pontoons are designed to retract above the robot when transitioning on land. The main disadvantage of this design lies in the complexity of deploying and recovering the floatation torpedoes from above the robot. The mechanics and clearance around the robot required for such a maneuver would be difficult in a forest or in low lying brush or foliage. The mechanism would run a high risk of jamming or of damage from debris in the littoral zone. Storage of this type of torpedo floatation system would also be problematic, since storing it above the robot would increase the height significantly and raise the centre of gravity on land. The difficulty would be multiplied by the locking system required to hold the floats up in place. Jamming of the lock would compromise the mission, since the robot would not be able to continue with one or both floats not

actuating correctly. A significant amount of power would also be required to lift the wet floats up into the stored position.

4.10 Eighth Concept

Figure 4.10: Weighted rear wheel and limited rotation of rear bogie concept.

After determining that the fender/pontoon design posed too large a threat of becoming fouled, the next design iteration considered was to weight the rear wheels, predisposing them to a position that would anticipate the shoreline. This positioning of the rear bogie would remove the need to hold the suspension with a fender system. The idea for the design was to weight the rear wheels with water

or antifreeze so that the floatation of the centre wheels would tilt the bogie at such an angle that the wheels would be prepared to exit the water as the robot approached the shoreline. The bogie would also have a physical limiter on it so that there would be a solid limit to the rotation of the bogie, as shown in Figure 4.10.

4.11 Ninth Concept

Figure 4.11: Flotation and water propulsion added to bogie design.

To reduce the complexity of the floatation system and to simplify the propulsion arrangement in the water, the design in Figure 4.11 was proposed. This flotation design incorporates many of the previous designs, including a bogie suspension system with limited rear bogie rotation, a dual independent water jet drive system, and a weighted rear wheel to encourage an aggressive wheel position while in the water in anticipation of reaching the river or lake bottom or approaching the shoreline.

4.12 Tenth Concept

Figure 4.12: Ontario Tech Capstone Design Team Final Design [74].

Discussion and planning with the 2014/2015 Ontario Tech Capstone Design Team led to the CAD design shown in Figure 4.12. The design incorporates the wheel drives encapsulated in curved rocker bogie legs and multiple layered storage compartments.

4.13 Eleventh Concept

This design for the AAR incorporates a single watertight electronics compartment, as shown in Figure 4.13. It preserves the suspension model of previous iterations, but now incorporates a leg design that encapsulates the drives chosen: IronHorse right-angle DC gear motors rated at 24 VDC, 23 RPM, and 32 Nm full-load torque, with an 82:1 gear ratio and a 16 mm diameter shaft. To compensate for the additional 13.6 kg of weight at the back of the robot from the two additional drives, additional displacement floatation was placed in the stern of the vehicle.

Figure 4.13: Eleventh concept with key components itemized: 1) wheels, 2) rocker bogie legs, 3) flotation hull, 4) GPS, 5) hydrogen tank, 6) axles, 7) torsion bar, 8-15) drive control, 16) jet drive motor, 17) jet drive.

4.14 Proof of Concept Testing

To test the rocker bogie suspension design and its acceptability as a valid suspension system, Dr. Nokleby built a test model out of Lego©, as shown in Figure 4.8. The bogie suspension system concept came from a brainstorming session with the 2014/2015 Ontario Tech Capstone Design Team. This physical prototype allowed the team to discuss different layouts and scale ratios. Limitations of obstacle avoidance were realized and undercarriage clearance was addressed. The hyper rotation of the

rear bogie assembly became apparent with the test build and was addressed with stops to limit the rotation. Further testing was accomplished with a half-scale model built by the 2014/2015 Ontario Tech Capstone Design Team, as shown in Figure 4.14, using PVC pipe, closed cell foam, and aluminum extrusion [74]. This model was built to examine how the suspension design would operate over obstacles, how the bogies would suspend under the vehicle while moving through water, how the wheels would add to the floatation, how stable the unit was in and out of the water, and how much additional foam floatation would be necessary to raise the centre of buoyancy of the robot to counteract the weight of the proposed drives. The half-scale model was pulled through a pond and studied for how it reacted.

Figure 4.14: Half-scale prototype built by the 2014/2015 Ontario Tech Capstone Design Team.

4.15 Concept Selection Criteria and Selected Concept

The simplicity of a design tends to promote robustness. Striking a balance between minimizing the number of moving components and providing redundancy for critical systems is the cornerstone of a superior design. In many factories a robust design is one that has little associated downtime, and this approach was utilized in the chosen robot design. A good design will allow for preventative maintenance and give an indication or warning before any catastrophic failure or shutdown. In the case of an autonomous robot, the trait of being resilient and durable must be balanced against criteria such as weight, size, power consumption, the ability to localize, and the capability of continuing the mission after the failure of one or more subsystems. One of the most difficult environments that an amphibious robot must traverse is the littoral zone, which tends to have the most difficult obstacles to surmount. This area is often made up of uneven ground and large obstacles such as rocks and logs, and is subject to waves, tidal surge, and loose ground. A well designed amphibious robot has the ability to traverse the littoral zone efficiently with both the terrestrial drive and the water-based propulsion, and its drives will be less prone to becoming entangled or ensnared by debris. The intention of this robot's design is to have the capability of surmounting land-based obstacles as well as withstanding waves from any direction. Data gathered by the robot and broadcast to the base station will include on-board systems status, GPS positioning, and mapping data. Data to the robot, from the base station, will include waypoint updates, changes to course direction, and emergency shutdown of systems.

Figure 4.15: Full scale model built by the 2014/2015 Ontario Tech Capstone Design Team [74].

The selected concept was Concept Eleven. The prototype of the AAR is shown in Figure 4.15. This prototype was built by the author and the 2014/2015 Ontario Tech Capstone Design Team.

4.16 AAR Prototype

An overview of all of the AAR components is shown in Figure 4.16. Note that the prototype in Figures 4.15 and 4.16 is the same, but the hull was repainted between images. Not included in Figure 4.16 is the LiDAR. In April of 2015, testing of the land-based propulsion began in a tele-operated manner. This testing included climbing obstacles, as shown in Figure 4.17, and inclines, as shown in Figure 4.18. During the

summer of 2015, extensive water impingement testing of individual components was undertaken to ensure the robot did not take on water. In November 2015, the AAR was successfully tested in Lake Ontario in a tele-operated manner. Testing in the lake included traversing through the littoral zone, as shown in Figure 4.19, transitioning from land to water propulsion, as shown in Figure 4.20, testing for water leaks, and operating the jet drives, as shown in Figure 4.21.

Figure 4.16: Component layout of the AAR.

The AAR's ability to enter the water under remote control, the stability of its flotation, starting and operating both water jet drives, steering, and the ability to exit the water without assistance were all proven. The water immersion sensor used to engage the jet drives was tested. This sensor is located beside the jet drive at the back of the robot. Testing of the tank-style steering of the jet drives was performed, as shown

in Figure 4.22. During the winter months of 2016, the AAR land drive systems were tested in sub-zero temperatures, as shown in Figure 4.23.

Figure 4.17: AAR climbing obstacles.

4.17 Summary

The AAR prototype is a novel robotic platform with amphibious capabilities, able to test the autonomous algorithm for navigating between waypoints in a natural environment. The literature review found that an amphibious robot with the size and power to surmount substantial natural debris was not available. A robot with the ability to operate for a prolonged period of time without human intervention is also underrepresented. A robot with the AAR's climbing capabilities, navigation, wheel and jet drive redundancy, and ability to traverse through a littered littoral zone will bring new understanding to the difficulties currently experienced by robots

performing duties in natural environments. The AAR brings the integration of land and water propulsion systems in a waterproof automated platform to the robotics community.

Figure 4.18: AAR climbing an incline.

Figure 4.19: AAR traversing through the littoral zone.

Figure 4.20: AAR switching from land drive to jet drive.

Figure 4.21: AAR under dual jet drive power.

Figure 4.22: Single jet drive steering testing.

Figure 4.23: Freezing temperature land drive systems testing.

Chapter 5

Experimental Testing and Results

This chapter describes the steps taken to test the algorithm created for amphibious robots to self-navigate through the littoral zone and presents the results achieved. It does this by executing the algorithm devised to detect and navigate around multiple obstacles, first on land and then through the littoral zone. Although the algorithm will operate for different amphibious robots, the testing was performed with the AAR, as described in Chapter 4. An operator supplies two waypoints, one on land and one in the water, both just outside of the littoral zone. The first waypoint is located on land approximately 6 m above the high water line and the second waypoint is located in the water approximately 4 m past the waterline.

5.1 Land Results

To conduct land testing, a series of trials was set. Testing began indoors with the Pioneer robot, as shown in Figure 3.13. This testing was successful and showed the robot navigating using a map created from an overhead image overlaid with a second map created from other sensor data, such as LiDAR. With the success of the

Figure 5.1: AAR operating in snow.

Pioneer robot, the algorithm was scaled up for the AAR. The outdoor testing for the land trials focused on environmental variables. These included high winds, temperatures between 30°C and -20°C, snow as shown in Figure 5.1, mud, pavement, gravel, grass, rain, and different times of day. The obstacles that the AAR was tested on included obstacles high enough to be detected by the LiDAR mounted 1 m off of the ground, as shown in Figure 5.2, as well as obstacles that were below the line of sight of the LiDAR, as shown in Figure 5.3. Obstacle size and colour variations were also used to test the robustness of the obstacle identification algorithm. None of the above variations impeded the obstacle identifying algorithm from successfully categorizing and identifying obstacles that would pose a problem to the suspension of the robot.

One variable that did pose an issue to the accuracy of the algorithm was the size of the shadows created around objects that were identified as obstacles. These shadows were included with the obstacle as it was identified on the obstacle map and made the obstacle appear larger than it physically was. As testing was conducted over all

times of day and all four seasons, the lengths of the shadows varied. During testing, the additional size given to an obstacle by its shadow was not found to be a concern, since the additional size simply acted as an inflation zone around the obstacle. The shadow issue did not create any significant problem while testing, since short obstacles do not cast long shadows and tall obstacles do not tend to be mobile.

Figure 5.2: AAR navigating around an obstacle identified by both the LiDAR and the cost map overlay.
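The observation that a shadow simply enlarges an obstacle's footprint mirrors the inflation step common in occupancy-grid planning, where occupied cells are dilated by a safety margin. The sketch below illustrates that idea; the grid, radius, and function name are illustrative rather than taken from the thesis software.

```python
def inflate(grid, radius):
    """Dilate occupied cells (1s) by `radius` cells in every direction."""
    n_rows, n_cols = len(grid), len(grid[0])
    out = [row[:] for row in grid]
    for r in range(n_rows):
        for c in range(n_cols):
            if grid[r][c] == 1:
                # Mark the square neighbourhood around each obstacle cell.
                for rr in range(max(r - radius, 0), min(r + radius + 1, n_rows)):
                    for cc in range(max(c - radius, 0), min(c + radius + 1, n_cols)):
                        out[rr][cc] = 1
    return out

grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1                      # a single obstacle cell
inflated = inflate(grid, 1)         # the obstacle grows into a 3x3 block
print(sum(map(sum, inflated)))      # 9 cells are now marked occupied
```

Because a planner treats inflated cells as untraversable, an obstacle enlarged by its shadow behaves exactly like one surrounded by a deliberate safety margin, which is why the varying shadow lengths did not disrupt the trials.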

Due to the size and complexity of moving the AAR to different test sites, a custom trailer was constructed to transport the robot. The complexity of transporting the robot made it important to find a local venue with a littoral zone for land testing. This became more complex on July 1, 2020, when enhanced Canadian regulations came into effect limiting the areas in which drones weighing more than 200 g could be flown, as shown in Figure 5.4. A location in South Whitby, Ontario, beside a water reservoir called Whitby South Shores, was chosen. This location was just outside the no-fly zone, so it did not require any additional permission for a licensed drone operator to fly within.

90 Figure 5.3: Result of operating the AAR with an obstacle below the LiDAR line of sight without cost map overlay.

The first land test was performed on August 31, 2020. Overhead images of the ob- stacle field and AAR were captured by the UAV and loaded into the algorithm. The obstacle map was created and loaded with the LiDAR created map. Issues were found with the orientation of these two maps. A manual rotational correction was made to the obstacle map to correct the misalignment. The AAR failed to initiate any move- ment when a goal command was given on the far side of the obstacle field. When the robot failed to initiate any movement and produced no error code all other testing was suspended for the day.

Study of why the AAR would not drive to the goal on the first land trial revealed that the rotational correction was supplied in degrees and not radians, as the algorithm expected. Since the algorithm expected a specific unit of measure, when the incorrect one was supplied the algorithm simply paused. A second attempt was made on September 10, 2020, and the overhead images from the first land attempt were reused.
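The failure mode found here, a correction supplied in degrees where radians were expected, can be guarded against by making the unit explicit at the interface. The helper below is an illustrative sketch, not the thesis code:

```python
import math

def rotation_correction_rad(value: float, unit: str) -> float:
    """Return a map-alignment correction in radians, converting if needed.

    Forcing the caller to state the unit avoids the silent failure seen in
    the first land test, where a value in degrees reached an algorithm
    expecting radians.
    """
    if unit == "rad":
        correction = value
    elif unit == "deg":
        correction = math.radians(value)
    else:
        raise ValueError(f"unknown unit: {unit!r}")
    # Sanity check: a correction beyond a full turn is almost certainly
    # a degree value mislabelled as radians.
    if abs(correction) > 2 * math.pi:
        raise ValueError("correction exceeds 2*pi rad; was it given in degrees?")
    return correction

print(rotation_correction_rad(90.0, "deg"))  # 1.5707963267948966
```

A guard like this would have turned the silent pause observed in the field into an immediate, diagnosable error.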

Figure 5.4: Canadian National Research Council drone restriction zone map.

This allowed the test to be duplicated, ensuring the problem correction was the only variable tested. Reusing the overhead image also reduced the time between tests. The obstacle positions were duplicated to the locations of the first land attempt. The robot navigated around the obstacles correctly as the algorithm and navigational program instructed, as shown in Figure 5.5.

A third land test was performed on September 16, 2020, to confirm the findings of the successful second land trial. For this test, new overhead images were used and a different layout of obstacles was chosen, as shown in Figure 5.6, to validate the findings of the second test. During the third land test the AAR correctly identified the obstacles and navigated a safe path around them, as shown in Figure 5.7, and was able to reach the goal set by the operator, as shown in Figure 5.8. The land testing

was considered a success after the AAR was able to consistently reach the goal location over multiple tests.

Figure 5.5: AAR at the goal position at the completion of the second land test.

5.2 Water Results

A similar set of trials to the land testing was used for the water testing, with the addition of the transition into water. The environmental variables included high winds, temperatures between 30°C and 8°C, pavement, gravel, and water. The obstacles, similar to the land testing, included obstacles high enough to be detected by the LiDAR as well as obstacles below the LiDAR's line of sight. The obstacles were also different colours, shapes, and sizes.

The water testing location chosen was within the restricted no-fly zone maintained by NAV Canada. An authorization letter detailing the date, time, specifications of the drone, pilot, and flight plan was required, as documented in Appendix A.3, for authorization to fly within this zone. Approval was granted for a specific time and flight pattern window.

Figure 5.6: Third land test overhead image (obstacles added to the environment are indicated).

The first water test took place on September 24, 2020. The Port Whitby Boat Launch area is paved, and this surface made tank steering difficult. Earlier testing had been completed on more compliant surfaces such as gravel and grass. The wheels on the robot gripped exceptionally well and resisted turning. Although the algorithm plotted a course around the obstacles, the six driven wheels of the AAR had such a solid grip on the asphalt that they would not allow the robot to turn as directed by the navigational algorithm. Since the robot was unable to steer correctly, the testing session was ended.

The second water test took place on October 1, 2020. A solution for better control of the steering was devised by disconnecting the drives to the two front wheel axles as

well as raising the rear wheels off of the ground by pinning the rear suspension. This allowed the AAR to pivot around the centre wheels, the only pair of wheels with encoders. The obstacles for the AAR were positioned as shown in Figure 5.9, and the drone was launched 15 m directly above the robot to capture the overhead littoral zone image. The image was loaded into the obstacle identification algorithm to identify the obstacles to be avoided, as shown in Figure 5.10. This obstacle identification image is overlaid with the GPS map, the LiDAR map, and the AAR localization, as shown in Figure 5.11. The ROS visualization program RVIZ was used to set the goal for the AAR from a remote computer located 10 m away.

Figure 5.7: Third land test with obstacles and navigational path shown in RVIZ (objects in the RVIZ screenshot are indicated).
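Overlaying the overhead obstacle map with the LiDAR map amounts to a cell-wise merge of two aligned occupancy grids, keeping the more pessimistic value of each pair. The sketch below illustrates the idea with toy grids; it is not the thesis implementation, which works with ROS cost maps:

```python
def merge_grids(overhead, lidar):
    """Cell-wise merge of two aligned occupancy grids (0 free, 1 occupied).

    Taking the maximum keeps a cell occupied if either source flagged it,
    the conservative choice for path planning.
    """
    return [[max(a, b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(overhead, lidar)]

overhead = [[0, 1, 0],
            [0, 0, 0]]   # obstacle visible only from the air
lidar    = [[0, 0, 0],
            [1, 0, 0]]   # obstacle seen only by the LiDAR
print(merge_grids(overhead, lidar))  # [[0, 1, 0], [1, 0, 0]]
```

This complementarity is the point of the dual-map approach: the overhead image catches obstacles below the LiDAR's line of sight, while the LiDAR catches obstacles missed or displaced in the aerial image.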

The initial run of the second water test was a success. The AAR navigated a path around all of the obstacles, entered Lake Ontario, and engaged the jet drives. As the robot navigated near objects identified by the LiDAR, the robot's encoder-based position estimate was corrected.

Figure 5.8: AAR at the goal position at the completion of the third land test.

When the AAR jet drives engaged, the current draw on the batteries exceeded the rating of the in-line 40 A fuse, which blew. In addition to the blown jet drive fuse, it was noted after the robot was driven out of the water that the middle starboard axle had broken and was no longer driven, as shown in Figure 5.12. Although further tests were attempted, the broken axle did not allow the robot to turn correctly. It was also noticed that the suspension locking hardware used to raise the rear wheels had sustained damage from the strain of testing, as shown in Figure 5.13. This damage precluded further testing without significant redesign of the robotic platform.

Further repair and testing were deemed unnecessary since the results of the water testing showed that the algorithm consistently and correctly identified the obstacles, plotted a course around them, entered the water, and engaged the water propulsion system.

Figure 5.9: Layout of obstacles for the water test.

Figure 5.10: Algorithm output of the littoral zone with obstacles identified.

Figure 5.11: RVIZ output showing overhead obstacles identified with LiDAR map overlay.

Figure 5.12: Damage sustained to the starboard centre axle.

Figure 5.13: Damage sustained to the port suspension locking angle.

Figure 5.14: AAR entering Lake Ontario with jet drives engaging.

Chapter 6

Conclusions and Recommendations for Future Work

6.1 Conclusions

The research conducted successfully demonstrated the effectiveness of the algorithm created for an amphibious robot to plot and navigate an obstacle-free path through a littered littoral zone. The motivation to design and develop this algorithm came from a lack of research in the field of amphibious robotics and the complexity of navigating in a natural, obstacle-rich environment. The algorithm was successfully tested in simulation and with different robotic platforms: indoors, outdoors on land, and in Lake Ontario. The algorithm utilizes a large AR code mounted on top of the robot to determine the position of the robot from the image taken by the quad-copter. By calculating the angle and size of the AR code, the algorithm computes the pose of the robot as well as the positions of the obstacles. The algorithm identifies obstacles based on the developed filters. The algorithm then outputs an image with those obstacles line traced, with their positions relative to the AR code, via an occupancy grid. The algorithm amalgamates this cost map with the current LiDAR data and determines the best path through any obstacle field to the navigational goal set by the operator outside the littoral zone. Testing of the algorithm on land and in water showed its effectiveness. The final test of the AAR in Lake Ontario selected a route from a waypoint on land just outside the littoral zone, plotted a path that presented the least amount of difficulty around multiple obstacles identified by the algorithm from an image captured by a quad-copter, and navigated the plotted course through the littoral zone.
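The cost-map amalgamation step can be illustrated with a minimal sketch. The function below is a hypothetical example, not the thesis implementation: the grid encoding and the cell-wise maximum rule are assumptions for illustration. It merges an aligned overhead-image occupancy grid with a LiDAR occupancy grid, keeping the more conservative (higher-cost) value wherever either sensor has data.

```python
import numpy as np

def merge_cost_maps(overhead_grid, lidar_grid, unknown=-1):
    """Cell-wise merge of two aligned occupancy grids.

    Follows the common ROS convention: 0 = free, 100 = occupied,
    -1 = unknown.  Wherever either sensor reports data, the more
    conservative (higher-cost) value wins; cells unknown in both
    sources remain unknown.
    """
    merged = np.maximum(overhead_grid, lidar_grid)
    both_unknown = (overhead_grid == unknown) & (lidar_grid == unknown)
    merged[both_unknown] = unknown
    return merged

overhead = np.array([[0, 100, -1],
                     [0,   0, -1]])
lidar = np.array([[ 0,   0,  0],
                  [-1, 100, -1]])
print(merge_cost_maps(overhead, lidar))
```

A planner would then search this merged grid for the lowest-cost path to the operator's goal, as the navigation stack did in the tests above.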

The results show that the proposed autonomous system algorithm can be integrated into any existing amphibious robotic platform, making it autonomous and allowing it to operate in and through the littoral zone.

6.2 Recommendations for Future Work

Future work for the developed algorithm is to continue testing and making improvements to the adaptability of the algorithm to other amphibious robotic systems. This testing could include research robots such as the amphibious Warthog by Clearpath Robotics. With the ultimate failure of the tank steering of the AAR, a redesign and rebuild of this drive system would allow testing to continue with this robot.

Further automation of the quad-copter image capturing system would reduce manual operator input of the image into the algorithm. Automatically sending the image electronically from the flying drone to the amphibious robot would significantly reduce the time currently needed to perform this task manually after the drone has landed. This time reduction would further reduce the opportunity for the current state of the littoral zone to change before a cost map of the area could be created and navigated by the robot. Electronic transfer of this image file could be accomplished using the robot's existing WiFi system. The quad-copter could continue to take images and update the cost map with the robot's position throughout the passage through the littoral zone. The quad-copter could then rejoin the amphibious robot after it completed traversing the littoral zone.

It is also recommended to feed live video of the littoral zone from the drone into the algorithm. This would allow the robot to be aware of changes to the littoral zone while moving through it. The substantial increase in data that live video would introduce would require a significant upgrade to the current on-board processor. This upgraded computer would need to be powerful enough to handle the significant amount of data received from the video to allow the amphibious robot to calculate a path through the obstacles and still move at a reasonable speed.

Further testing of the algorithm on other amphibious robots, such as the Warthog built by Clearpath Robotics [52], is recommended. This would allow testing of the robustness of the algorithm on amphibious robots with different land and water propulsion systems.

The LNS algorithm already has the ability to identify, plot, and navigate around obstacles in the littoral zone water. Currently, however, the AAR does not steer well in the water. Accurate steering could be added to the AAR in two ways. First, the addition of a thrust vectoring nozzle to the output of each of the two jet drives would improve steerage; these nozzles could be positioned either by a single servo driving both nozzles or by individual servo drives. Second, the AAR could be steered by varying the motor speed of each jet drive, which would vary the thrust produced by each drive and thereby provide steerage.
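The differential-thrust option can be sketched as a simple command mixer. The function below is a hypothetical illustration (the normalized command range and the saturation rule are assumptions, not part of the AAR design): a turn command slows one jet drive and speeds up the other, and the outputs are rescaled if either would exceed full throttle.

```python
def jet_drive_mix(forward, turn, max_cmd=1.0):
    """Map forward and turn commands (each in [-1, 1]) to
    (port, starboard) jet drive speed commands.

    Steering comes from differential thrust alone: a positive
    turn adds thrust to port and removes it from starboard.
    """
    port = forward + turn
    starboard = forward - turn
    # If either command saturates, scale both to preserve the ratio.
    scale = max(abs(port), abs(starboard), max_cmd)
    return port / scale * max_cmd, starboard / scale * max_cmd

# Straight ahead at half throttle, then a saturated turning mix.
print(jet_drive_mix(0.5, 0.0))
print(jet_drive_mix(0.8, 0.4))
```

The rescaling step matters on a real platform: clipping each drive independently would reduce the thrust difference and weaken the commanded turn exactly when the operator asks for full power.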

Based on the difficulty experienced in designing an amphibious robot capable of handling large natural obstacles in the littoral zone, it is recommended to redesign the AAR as a more robust amphibious platform. The key features requiring redesign, based on the experience gained through the testing process, are the land and water steering of the AAR. It is also recommended that these redesigned systems incorporate substantial upgrades to the robot's resistance to water impingement.

It is also recommended to utilize a 3D LiDAR point cloud of the littoral zone from the front of the amphibious robot. This could be achieved with the addition of a nodding head to the existing LiDAR unit. The addition of this 3D view would enhance the situational awareness of the robot in the form of a larger field of view. Concerns about the additional data that the 3D point cloud would require processing are similar to those explored for the live video stream from the quad-copter: the on-board computer would need higher processing speed to handle the computing demand. A further consequence of a 3D LiDAR point cloud would be having to train the algorithm to recognize the dynamic effect of the water's edge.
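The geometry of such a nodding-head conversion can be sketched as follows. The function below is a hypothetical example (the frame convention and the assumption that the nod axis passes through the scan origin are illustrative): it projects one 2D scan, taken at a known tilt angle, into 3D points in the sensor frame.

```python
import math

def scan_to_points_3d(ranges, angle_min, angle_step, tilt):
    """Project a single 2D LiDAR scan into 3D sensor-frame points.

    `tilt` is the nod angle in radians (positive = pitched down),
    applied as a rotation about the sensor's lateral (y) axis.
    """
    points = []
    for i, r in enumerate(ranges):
        if not math.isfinite(r):
            continue  # skip out-of-range returns
        a = angle_min + i * angle_step  # bearing within the scan plane
        x = r * math.cos(a) * math.cos(tilt)
        y = r * math.sin(a)
        z = -r * math.cos(a) * math.sin(tilt)
        points.append((x, y, z))
    return points
```

Accumulating these points over one full nod sweep, with the tilt angle read from the nodding servo at each scan, would yield the 3D point cloud for the planner.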

References

[1] “IEEE Spectrum unlucky robot gets stranded inside fukushima nuclear reactor, sends back critical data,” http://spectrum.ieee.org/automaton/robotics/industrial-robots/robot-stranded-inside-fukushima-nuclear-reactor/?utm_source=roboticsnews&utm_medium=email&utm_campaign=042815, accessed: 2015-04-13.

[2] “Office of Naval Research the future is now: Navy’s autonomous swarmboats can overwhelm adversaries,” http://www.onr.navy.mil/en/Media-Center/Press-Releases/2014/autonomous-swarm-boat-unmanned-caracas.aspx, accessed: 2014-10-30.

[3] Donaldson, P., “Modern day marine focus, amphibious vehicle powertrains,” Military Technology, pp. 135–136, September 2013.

[4] “Arms Control Association pentagon asks more for autonomous weapons,” https://www.armscontrol.org/act/2019-04/news/pentagon-asks-more-autonomous-weapons, accessed: 2019-04-17.

[5] “Royal Navy tidal thames trials for defence’s new maritime test bed,” http://www.royalnavy.mod.uk/news-and-latest-activity/news/2016/september/05/160905-bladerunner, accessed: 2016-09-18.

[6] “Lockheed Martin from under the sea to up in the air: Lockheed martin conducts first underwater unmanned aircraft launch from unmanned underwater vehicle,” http://www.lockheedmartin.com/us/news/press-releases/2016/september/160928-rms-first-underwater-unmanned-aircraft-launch-from-unmanned-underwater-vehicle.html, accessed: 2016-09-06.

[7] “Defense News navy to kick off extra large uuv competition this month,” https://www.defensenews.com/digital-show-dailies/surface-navy-association/2017/01/10/navy-to-kick-off-extra-large-uuv-competition-this-month/, accessed: 2017-01-04.

[8] “Defense News to compete with china, an internal pentagon study looks to pour money into robot submarines,” https://www.defensenews.com/naval/2020/06/01/to-compete-with-china-an-internal-pentagon-study-looks-to-pour-money-into-robot-submarines, accessed: 2020-06-06.

[9] Asokan, T.; Santhakumar, M., “Investigations on the dynamic station keeping of an underactuated autonomous underwater robot,” India Institute, Tech. Rep. 1, 2012.

[10] Higashimori, M.; Kaneko, M., “Omni-paddle: Amphibious spherical rotary paddle mechanism,” in 2011 IEEE International Conference on Robotics and Automation, June 2011.

[11] Li, Y.; Shimizu, E.; Ito, M., “Development of underwater-use humanoid robot,” Artif Life Robotics, pp. 469–472, October 2012.

[12] B. B., “Sea-Dragon: An Amphibious Robot for Operation in the Littorals,” in 41st Southeastern Symposium on System Theory, January 2009.

[13] Sun, Y.; Ma, S., “epaddle mechanism: Towards the development of a versatile,” in 2011 IEEE/RSJ International Conference, May 2011.

[14] Bonin-Font, F.; Ortiz, A.; Oliver, G., “Visual navigation for mobile robots: A survey,” Springer Science + Business Media B.V., Palma de Mallorca, September 2008.

[15] Chatchanayuenyong, T.; Parnichkun, M., “Neural network based-time optimal sliding mode control for an autonomous underwater robot,” Mechatronics, vol. 1, no. 3, pp. 471–478, 2006.

[16] Utkin, V., “Sliding mode control,” Control Systems, Robotics and Automation, vol. 2, no. 8, pp. 8–16, 1997.

[17] Valerio, D., “Introducing fractional sliding mode control,” IL Encontro de Jovens Investigadores do LAETA FEUP, Porto, vol. 3, no. 5, pp. 10–11, 2012.

[18] Santhakumar, M.; Asokan, T., “Investigations on the hybrid tracking control of an underactuated autonomous underwater robot,” India Institute, Tech. Rep. Advanced Robotics, 2010.

[19] “Control Solutions Inc. Minnesota anatomy of a feedback control system,” http://www.csimn.com/CSI_pages/PIDforDummies.html, accessed: 2015-02-03.

[20] Medri E., “Interview,” 2014.

[21] Wang, A.; Ames, M., “Multi-objective compositions for collision-free connectivity maintenance in teams of mobile robots,” in 55th IEEE Conference on Decision and Control. ACM Press, Nov. 2016, pp. 1–6.

[22] W. Zhan, Simultaneous localization and mapping : exactly sparse information filters. World Scientific, 2011.

[23] Durrant-Whyte, H.; Bailey, T., “Simultaneous localization and mapping: Part I,” IEEE Robotics and Automation Magazine, vol. 2, no. 7, pp. 99–108, 2006.

[24] Montemerlo, M.; Thrun, S.; Koller, D.; Wegbreit, B., “FastSLAM: A factored solution to the simultaneous localization and mapping problem,” Carnegie Mellon University, Tech. Rep. 1, 2003.

[25] Bailey, T.; Durrant-Whyte, H., “Simultaneous localization and mapping (SLAM): Part II,” IEEE Robotics and Automation Magazine, vol. 1, no. 1, pp. 108–117, 2006.

[26] Esparza-Jimnez, J.; Devy, M.; Gordillo, J., “Navigation system of an unmanned boat for autonomous analysis of water quality,” Elektronika ir Elektrotechnika, vol. 2, no. 5, pp. 3–7, 2013.

[27] Tuna, G.; Arkoc, O.; Koulouras, G.; Portirakis, S., “Simultaneous localization and mapping (slam): Part ii,” Sensors, vol. 6, no. 3, pp. 108–117, 2016.

[28] “Business Insider pentagon: China will return us navy underwater drone seized in south china sea,” http://www.businessinsider.com/china-will-return-us-navy-underwater-drone-2016-12, accessed: 2016-11-30.

[29] Giguere, P.; Dudek, G.; Prahacs, C.; Saunderson, S.; Sattar, J.; Torres-Mendez, L.; Jenkin, M.; German, A.; Hogue, A., “Aqua: An amphibious autonomous robot,” Computer, vol. 7, no. 18, pp. 46–53, 2007.

[30] Giguere, P.; Dudek, G.; Prahacs, C.; Saunderson, S.; Sattar, J.; Torres-Mendez, L.; Jenkin, M.; German, A.; Hogue, A., “Aqua: An amphibious autonomous robot,” Computer, vol. 3, no. 7, pp. 46–53, 2007.

[31] Silva, S.; Cunha, S.; Matos, A.; Cruz, N., “An autonomous boat based synthetic aperture sonar,” MTS, vol. 2, no. 8, pp. 20–27, 2007.

[32] Divya, V.; Dharanya, S.; Shaheen, S.; Umamakeswari, A., “Amphibious surveillance robot with smart nodes,” Indian Journal of Science and Technology, vol. 1, no. 9, pp. 4496–4499, 2013.

[33] Madhuri, V.; Umar, S.; Veeraveni, P., “A study on smart dust (mote) technology,” IJCSET, vol. 3, no. 3, pp. 124–128, 2013.

[34] “Defence Industry Daily underwater gliders for the us navy,” http://www.defenseindustrydaily.com/Underwater-Gliders-for-the-US-Navy-06990/, accessed: 2011-07-14.

[35] “Autonomous Undersea Vehicle Applications Center advanced search auv,” http://auvac.org/explore-database/advanced-search, accessed: 2016-11-15.

[36] “Robohub unmanned and underneath: Stay in the know about underwater drones,” http://robohub.org/unmanned-and-underneath-stay-in-the-know-about-underwater-drones/, accessed: 2016-11-23.

[37] “Defence Industry Daily underwater gliders for the us navy,” http://pena-abad.blogspot.ca/2011/07/underwater-gliders-for-us-navy.html, accessed: 2011-07-30.

[38] Ragi, S.; Tan, C.; Chong, E., “Guidance of autonomous amphibious vehicles for flood rescue support,” Mathematical Problems in Engineering, vol. 3, no. 8, pp. 1–9, 2013.

[39] Erdelj, M.; Natalizio, E., “UAV-assisted disaster management: Applications and open issues,” in 2016 International Conference on Computing, Networking and Communications (ICNC), 2016, pp. 1–5.

[40] Yu, K.; Budhiraja, A.; Buebel, S.; Tokekar, P., “Algorithms and experiments on routing of unmanned aerial vehicles with mobile recharging stations,” Journal of Field Robotics, vol. 36, no. 3, pp. 602–616, 2018.

[41] Subramanian, A.; Gong, X.; Wyatt, C.; Stilwell, D., “Shoreline detection in images for autonomous boat navigation,” Office of Naval Research, vol. 2, no. 7, pp. 999–1003, 2007.

[42] “Automated Ships Ltd automated ships ltd and kongsberg to build first unmanned and fully-automated vessel for offshore operations,” http://www.automatedshipsltd.com/, accessed: 2016-11-02.

[43] Junior, S.; Araujo, A.; Goncalves, A.; Silva, M.; Aroca, R., “N-boat: An autonomous robotic sailboat,” in IEEE Latin America Robotics Symposium, June 2013.

[44] Cruz, N.; Alves, J., “Autonomous sailboats: an emerging technology for ocean sampling and surveillance,” Oceansys Publications, vol. 6, no. 2, pp. 1–6, 2009.

[45] Stelzer, R.; Proell, T., “Autonomous sailboat navigation for short course racing,” Austrian Association for Innovative Computer Science, vol. 7, no. 6, pp. 1–23, 2008.

[46] Menage, O.; Bethencourt, A.; Rousseaux, P.; Prigent, S., “Vaimos: Realization of an autonomous robotic sailboat,” Robotic Sailing 2013, vol. 4, no. 6, pp. 25–36, 2013.

[47] Jaulin, L.; Le Bars, F., “An interval approach for stability analysis; application to sailboat robotics,” IEEE Transaction on Robotics, vol. 4, no. 4, pp. 1–6, 2012.

[48] “University of British Columbia ubc trans atlantic challenge,” https://ubctransat.com/, accessed: 2016-10-30.

[49] “Johns Hopkins University water and air: Flying fish uaav can go anywhere,” http://www.jhuapl.edu/newscenter/pressreleases/2017/170906.asp, accessed: 2017-09-21.

[50] Ramirez-Serrano, A., “University of calgary selects clearpath robotics for harbor defense research: Networks of unmanned surface vehicles (usvs) for autonomous monitoring of waterways reduces risk of marine-based attacks,” Marketwire, vol. 1, no. 2, p. 7, 2010.

[51] “Clearpath Robotics clearpath robotics amphibious unmanned ground vehicle robot warthog,” https://www.clearpathrobotics.com/warthog-unmanned-ground-vehicle-robot/, accessed: 2016-10-30.

[52] “Clearpath Robotics warthog amphibious unmanned ground vehicle,” https://clearpathrobotics.com/warthog-unmanned-ground-vehicle-robot, accessed: 2020-12-17.

[53] Muriel, D.; Cowan, E., “Flapping instability induced in an externally deformed flexible plate,” Fluid Mechanics, vol. 1, no. 2, pp. 466–476, 2017.

[54] Yuh, J., “Design and control of autonomous underwater robots,” Kluwer Aca- demic Publishers, vol. 2, no. 3, pp. 1–6, 2000.

[55] Murphy C.; Singh H., “Rectilinear coordinate frames for deep sea navigation,” 2010 IEEE/OES Autonomous Underwater Vehicles, pp. 1–10, September 2010.

[56] Vaneck, T., “Fuzzy guidance controller for an autonomous boat,” IEEE Control Systems, vol. 2, no. 1, pp. 43–51, 1997.

[57] “AImotive what we do: Ai drive,” https://aimotive.com/what-we-do.jsp, accessed: 2016-11-10.

[58] Cridak S., “Interview,” 2016.

[59] “Fortune.com gm buying self-driving tech startup for more than $1 billion,” http://fortune.com/2016/03/11/gm-buying-self-driving-tech-startup-for-more-than-1-billion/, accessed: 2016-11-10.

[60] “Cruise Automation cruise automation,” https://www.getcruise.com, accessed: 2016-11-10.

[61] Santana, E.; Hotz, G., “Learning a driving simulator,” Comma AI, vol. 1, no. 1, pp. 1–8, 2016.

[62] “Motor Authority comma.ai founder finds potential loophole for self-driving regulatory issues,” http://www.motorauthority.com/news/1107573_comma-ai-founder-finds-potential-loophole-for-self-driving-regulatory-issues, accessed: 2016-12-03.

[63] “Tesla Motors all tesla cars being produced now have full self-driving hardware,” https://www.tesla.com/blog/all-tesla-cars-being-produced-now-have-full-self-driving-hardware, accessed: 2016-11-30.

[64] Lindemann, R.; Voorhees, C., “Mars exploration rover mobility assembly design, test and performance,” IEEE Robotics and Automation Magazine, vol. 1, no. 3, pp. 12–17, 2011.

[65] “NASA nasa space science data coordinated archive - spirit,” https://nssdc.gsfc.nasa.gov/nmc/spacecraft/display.action?id=2003-027A, accessed: 2020-05-23.

[66] Shrivastava, S.; Karsai, A.; Aydin, Y.; Pettinger, R.; Bluethmann, W.; Ambrose, R.; Goldman, D., “Material remodeling and unconventional gaits facilitate locomotion of a robophysical rover over granular terrain,” Science Robotics, vol. 5, no. 42, pp. 3499–3509, 2020.

[67] Lindemann, R.; Bickler, D.; Harrington, B.; Ortiz, G.; Voorhees, C., “Mars exploration rover mobility development, mechanical mobility hardware design, development, and testing,” IEEE Robotics and Automation Magazine, vol. 3, no. 2, pp. 19–26, 2006.

[68] “Mini-Box Computers morex-557 intel haswell dual core celeron 2.9ghz, 100w, rosewill,” http://www.mini-box.com/Morex-557-Jetway-JNC9S-85B-Intel-Dual-Core-Celeron-2-9GHz-100w, accessed: 2016-09-05.

[69] “IEEE Spectrum waymo shows off its home-grown robocar hardware,” http://spectrum.ieee.org/cars-that-think/transportation/self-driving/waymo-shows-off-its-doityourself-robocar-hardware, accessed: 2017-07-03.

[70] “Westbourne model,” Graupner Electric Motors For Model Boats, 2014. [Online]. Available: http://www.westbourne-model.co.uk/acatalog/Graupner-Electric-Motors-For-Model-Boats.html

[71] “University of Toronto design and development of power and communications systems for an autonomous amphibious robot (aar-1),” http://www.mie.utoronto.ca/csme2014/documents/UOIT.pdf, accessed: 2014-04-28.

[72] Puccini L.; Mezil A.; Lau B.; Dobrescu O.; Nokleby S., “Capstone brainstorming session number five,” 2014.

[73] Nokleby S., “Interview,” 2014.

[74] Lau B.; Puccini L.; Dobrescu O.; Mezil A., “2014/2015 uoit capstone project chassis group,” 2015.

Appendix A

Raw Data

Figure A.1: Grading of littoral zones from around Canada

Figure A.2: Grading of littoral zones from around the world excluding Canada

Figure A.3: Licence from NAV Canada to fly in drone restricted area
