University of Research Online

Faculty of Engineering and Information Sciences
SMART Infrastructure Facility - Papers

1-1-2020

Sharkeye: Real-Time Autonomous Personal Shark Alerting via Aerial Surveillance

Robert A. Gorkin III, [email protected]

Kye Adams University of Wollongong, [email protected]

Matthew J. Berryman University of Wollongong, [email protected]

Sam Aubin

Wanqing Li University of Wollongong, [email protected]


Follow this and additional works at: https://ro.uow.edu.au/smartpapers

Part of the Engineering Commons, and the Physical Sciences and Mathematics Commons

Recommended Citation Gorkin III, Robert A.; Adams, Kye; Berryman, Matthew J.; Aubin, Sam; Li, Wanqing; Davis, Andrew R.; and Barthelemy, Johan, "Sharkeye: Real-Time Autonomous Personal Shark Alerting via Aerial Surveillance" (2020). SMART Infrastructure Facility - Papers. 300. https://ro.uow.edu.au/smartpapers/300

Research Online is the open access institutional repository for the University of Wollongong. For further information contact the UOW Library: [email protected]

Abstract
While aerial shark spotting has been a standard practice for beach safety for decades, new technologies offer enhanced opportunities, ranging from drones/unmanned aerial vehicles (UAVs) that provide new viewing capabilities, to new apps that provide beachgoers with up-to-date risk analysis before entering the water. This report describes the Sharkeye platform, a first-of-its-kind project to demonstrate personal shark alerting for beachgoers in the water and on land, leveraging innovative UAV image collection, cloud-hosted machine learning detection algorithms, and reporting via smart wearables. To execute, our team developed a novel detection algorithm trained via machine learning based on aerial footage of real sharks and rays collected at local beaches, hosted and deployed the algorithm in the cloud, and integrated push alerts to beachgoers in the water via a shark app to run on smartwatches. The project was successfully trialed in the field in Kiama, Australia, with over 350 detection events recorded, followed by the alerting of multiple smartwatches simultaneously both on land and in the water, and with analysis capable of detecting shark analogues, rays, and surfers in average beach conditions, and all based on ~1 h of training data in total. Additional demonstrations showed potential of the system to enable lifeguard-swimmer communication, and the ability to create a network on demand to enable the platform. Our system was developed to provide swimmers and surfers with immediate information via smart apps, empowering lifeguards/lifesavers and beachgoers to prevent unwanted encounters with wildlife before it happens.

Keywords: sharkeye, real-time, autonomous, personal, shark, alerting, aerial, surveillance

Disciplines Engineering | Physical Sciences and Mathematics

Publication Details
Gorkin III, R., Adams, K., Berryman, M. J., Aubin, S., Li, W., Davis, A. R. & Barthelemy, J. (2020). Sharkeye: Real-Time Autonomous Personal Shark Alerting via Aerial Surveillance. Drones, 4 (2), 18-1-18-17. doi:10.3390/drones4020018

Authors Robert A. Gorkin III, Kye Adams, Matthew J. Berryman, Sam Aubin, Wanqing Li, Andrew R. Davis, and Johan Barthelemy

This journal article is available at Research Online: https://ro.uow.edu.au/smartpapers/300

Article
Sharkeye: Real-Time Autonomous Personal Shark Alerting via Aerial Surveillance

Robert Gorkin III 1,*, Kye Adams 2, Matthew J Berryman 1,3,4, Sam Aubin 5, Wanqing Li 6, Andrew R Davis 2 and Johan Barthelemy 1

1 SMART Infrastructure Facility, University of Wollongong, Wollongong 2522, Australia; [email protected] (M.J.B.); [email protected] (J.B.)
2 Centre for Sustainable Ecosystem Solutions and School of Earth, Atmospheric and Life Sciences, University of Wollongong, Wollongong 2522, Australia; [email protected] (K.A.); [email protected] (A.R.D.)
3 Across the Cloud Pty Ltd., Wollongong 2500, Australia
4 Centre for Bioinformatics and Biometrics, National Institute for Applied Statistics Research Australia, University of Wollongong, Wollongong 2522, Australia
5 Sharkmate, Wollongong 2500, Australia; [email protected]
6 Advanced Multimedia Research Lab, University of Wollongong, Wollongong 2522, Australia; [email protected]
* Correspondence: [email protected]

Received: 16 March 2020; Accepted: 24 April 2020; Published: 4 May 2020

Abstract: While aerial shark spotting has been a standard practice for beach safety for decades, new technologies offer enhanced opportunities, ranging from drones/unmanned aerial vehicles (UAVs) that provide new viewing capabilities, to new apps that provide beachgoers with up-to-date risk analysis before entering the water. This report describes the Sharkeye platform, a first-of-its-kind project to demonstrate personal shark alerting for beachgoers in the water and on land, leveraging innovative UAV image collection, cloud-hosted machine learning detection algorithms, and reporting via smart wearables. To execute, our team developed a novel detection algorithm trained via machine learning based on aerial footage of real sharks and rays collected at local beaches, hosted and deployed the algorithm in the cloud, and integrated push alerts to beachgoers in the water via a shark app to run on smartwatches. The project was successfully trialed in the field in Kiama, Australia, with over 350 detection events recorded, followed by the alerting of multiple smartwatches simultaneously both on land and in the water, and with analysis capable of detecting shark analogues, rays, and surfers in average beach conditions, and all based on ~1 h of training data in total. Additional demonstrations showed potential of the system to enable lifeguard-swimmer communication, and the ability to create a network on demand to enable the platform. Our system was developed to provide swimmers and surfers with immediate information via smart apps, empowering lifeguards/lifesavers and beachgoers to prevent unwanted encounters with wildlife before it happens.

Keywords: UAV; blimp; shark spotting; machine learning; wearables

1. Introduction

Managing shark–human interactions is a key social and environmental challenge requiring a balance between the responsibility to provide beachgoer safety and the responsibility to maintain shark populations and a healthy marine ecosystem. Sharks pose some limited risk to public safety [1]; however, there is a recognized social and environmental responsibility to develop strategies that effectively keep people safe and ensure the least harm to the environment [2]. This process is still evolving, particularly in countries with yearly shark interactions.


For instance, in Australia, the Shark Meshing (Bather Protection) Program NSW [3] is a hazard mitigation strategy in place since 1937 which is known to cause harm to vulnerable species [4] and is listed as a Key Threatening Process by the NSW Scientific Committee, with ecological and economic costs [5]. With the range of emerging technologies offering new potential to mitigate harm to people and sharks alike, there are new ways to manage interactions without such high costs.

Aerial surveillance has historically been used to help identify sharks in beach zones and warn beachgoers in order to avoid shark incidents. Australia started one of the earliest known "shark patrols" through the Australian Aerial Patrol in 1957 [6]; currently, across the country, thousands of kilometres are patrolled by airplanes and helicopters on any weekend. For decades, aerial patrols have helped mitigate shark interactions, predominantly by alerting lifeguards/lifesavers to enact beach closures [7], and they remain essential for a range of safety and rescue purposes. However, a recent study suggested that the ability to reliably detect sharks from planes and helicopters is limited, with only 12.5% and 17.1% of shark analogues spotted by fixed-wing and helicopter observers, respectively [8]. Though the assessment of such studies has been debated [9], there is certainly agreement that new technologies could assist the expansion of aerial shark spotting and the communication of threats, leading to better outcomes for humans and sharks alike.

The recent expansion of unmanned aerial vehicles (UAVs), often known as drones, offers new abilities to enhance traditional aerial surveillance. In recent years alone, drones have been used to evaluate beaches and coastal waters [10–12] and to monitor and study wildlife [13–15], from jellyfish [16], sea turtles [17–19], salt-water crocodiles [20], dolphins and whales [21–24], and manatees [25] to rays [26] and sharks [27–30]. This expansion in surveillance is likely related to the market release of consumer drones with advanced camera imaging, for instance, the DJI Phantom introduced in 2016. Whether rotor-copter or fixed-wing, the rapid development of commercial drones means more flying "eyes" can be deployed in more locations for conservation and management [31]. Using these methods for shark detection again shows great promise; drones are now a part of several lifeguard programs in multiple Australian states [32] and are continuously being evaluated for reliability [33]. Other UAVs are also being considered [34]; a review by Bryson and Williamson (2015) into the use of UAVs for marine surveys highlighted the potential of unmanned tethered platforms, such as balloons with imaging, especially over fixed locations [35], and such platforms were successfully tested for shark spotting applications by Adams et al. [36]. These platforms may be able to overcome current issues facing drones: restrictions resulting from limited battery life, limited flight areas per air safety regulations (in Australia these are administered by the Civil Aviation Safety Authority (CASA) [37]), and the need for pilot training and equipment expertise. Regardless of platform, the use of piloted and, critically, autonomous UAVs in wildlife management will almost certainly expand and offer new opportunities, particularly to complement the use of conventional drones.

One critical limitation in traditional aerial surveillance is the need for human engagement with the technology, principally for visual detection.
However, the recent expansion of machine learning/artificial intelligence, and in particular Deep Learning tools [38], in the last decade has offered significant advantages as an approach to image classification as well as the detection and localization of objects from images and video sequences. While traditional image analysis relies on human-engineered signatures or features to localize objects of interest in images, and performance depends on how well the features are defined, Deep Learning provides an effective way to automatically learn or teach a detector (algorithm) the features from a large number of sample images with the known locations of the objects of interest. This concept of using automated identification from photos/videos has been explored for years (especially for images gathered from camera traps); with the expansion of available drones, automated detection has bloomed as a tool for wildlife monitoring [39,40]. Thus, Deep Learning detection algorithms for sharks [41,42] (as well as other marine wildlife and beachgoers) are expanding and offer superior tools to be integrated with aerial surveillance for automated detection.

Combining UAVs and machine learning with smart wearables like smartwatches creates an additional opportunity to build scalable systems to manage shark interactions with individuals. Again, a recent technology expansion (mainly in 2017) has created an era of available smartwatches with cellular communication technology, offering the potential to link surveillance with personalized alerting.

Our project was designed to overcome the gap between these emerging techniques for real-time personal shark detection by integrating aerial reconnaissance, smart image recognition, and wireless wearable technology. Sharkeye had two main objectives: (1) bridge data from aerial monitoring to alert smartwatches and (2) test the system as a pilot on location. Objective 1 required technical integration from the aerial acquisition of data to the app along the architecture stack, including validating key components of hardware, software, and services (connectivity). To execute, our team developed an improved detection algorithm by training the system using machine learning based on aerial footage of real sharks, rays and surfers collected at local beaches; determined how to host and deploy the algorithm in the cloud; and expanded a shark app to run on smartwatches and push alerts to beachgoers in the water. For Objective 2, field trials were conducted at Surf Beach, Kiama over a period of three days using methods for testing continuous aerial surveillance. To test detection accuracy, we deployed moving shark analogues as well as swimmers, surfers, and paddleboarders while recording the response through the use of multiple smartphones and three smartwatches simultaneously. In addition to the main objectives and related work, a backup emergency communication system between a swimmer in danger and a lifeguard was also tested, as well as the deployment of a communication relay on the blimp to further demonstrate the potential of the Sharkeye platform. The system was successfully deployed in a trial with over 350 detection events recorded, with an accuracy of 68.7%. This is a significant achievement considering that our algorithm was developed using <1 h of training footage, highlighting the huge potential of the system given more training data to improve confidence.
Overall, Sharkeye showed the potential of the convergence of drones, the Internet of Things, machine learning, and smart wearables for human–animal management. We believe in and advocate continued investment to leverage existing technology and utilize emerging technology to tackle both shark and human safety objectives. Further investment would significantly improve accuracy and prediction and expand identification. This capability to expand to other beach and water-based objects suggests the platform can be used to develop tools for effective and affordable beach management. A common tool that allows for real-time superior shark detection as well as monitoring of beach usage, swimmer safety, pollution, etc., would help de-risk investment in R&D and deployment. Ultimately, as the Sharkeye system is platform agnostic and has the potential to assist cross-agency objectives of data collection and analysis for consumer safety, efficiency gains, market competitiveness, and even development for export, we strongly recommend resourcing funds to expand such work.

2. Materials and Methods

The Sharkeye project developed an interface between two previous initiatives: Project Aerial Inflatable Remote Shark–Human Interaction Prevention (Project AIRSHIP), a blimp-based megafauna monitoring system, and SharkMate, a predictive app based on the likelihood of shark presence. We successfully optimized the equipment and process, i.e., our platform, required to automate video collection from the blimp, send the feed to a cloud-hosted algorithm, determine shark presence, and send alerts via smartwatches. The components of the system are described below.

2.1. Aerial Imaging from UAVs

A blimp-based system, Project Aerial Inflatable Remote Shark–Human Interaction Prevention (Project AIRSHIP), was used for the aerial surveillance component of the Sharkeye project. Project AIRSHIP was developed with the intent to help mitigate the risk of shark–human interactions with zero by-catch; the blimp system was developed and used for several years before the Sharkeye project was initiated. The blimp uses a high-definition mounted camera, designed to provide continuous aerial coverage of swimming and surfing areas [36].

As a first step of Sharkeye, we collected new footage of wildlife from the blimp to help train the detection algorithm. To collect new footage, the blimp-mounted camera system was tethered above the surf zone of Surf Beach in Kiama, NSW. For context on the observation area, Kiama Surf Beach is described as an east-facing embayed beach, 270 m in length, that is exposed to most swell, with average wave heights of 1–1.5 m [43]. Footage of the surf zone was generally collected from 11:00 am until 4:00 pm during a six-week period (excluding weekends and public holidays) from December 2017 to January 2018. Deployment was coordinated around the work hours of the professional lifeguard service at Kiama. Once the algorithm was trained, the blimp was redeployed (May 2018) to test the effectiveness of real-time detection.

Images were collected via a live streaming camera with 10x zoom on a 3-axis gimbal (Tarot Peeper camera with custom modifications). This setup provided a relatively stable feed even in windy conditions (>30 km/h), with footage transmitted to a ground station at a resolution of 1080p. To focus the collection of images, the direction and zoom of the camera were controlled by an operator on the ground and visually monitored on an HD screen. The blimp itself is 5 m in length and 1.5 m in diameter (Airship Solutions Pty Ltd., Australia) and served as a stable platform for the camera. The blimp enabled collection over an 8-hour period, limited only by the camera battery. The blimp was launched in the morning and taken down at night. When not in use, it was stored fully inflated to minimize helium usage, taking 9000 L of helium for initial inflation. The blimp loses helium at a rate of ...

Figure 1. Left: Image of the inflated blimp, Project Aerial Inflatable Remote Shark–Human Interaction Prevention (Project AIRSHIP), with ground crew and tether at Kiama Surf Beach, NSW (May 2018). Right: Typical view from the blimp camera over Kiama. Inset: the image collection system with power supply, camera on gimbal, communication system, and custom mounting. The system allows for operators on the beach to control direction and zoom on the camera. Once in the air, the video feed is passed to computers at the beach.

2.2. Detection Algorithm Development

The project employed a Deep Learning Neural Network (DNN) for the detection algorithm for various animals as well as people in the water (sharks, rays, and surfers). The following is a summary of the dataset development, the network architecture, and the algorithm training and evaluation.

To create the algorithm, raw footage from the blimp-mounted camera was collected over several weeks at a specific beach. This footage was brought back to the laboratory to be manually screened for the presence of marine life. The spliced footage was then used to extract images for algorithm creation and testing and divided into training and testing datasets. One of every five frames of the video selections was annotated with the locations and types of objects, and each frame was divided into 448×448 blocks with 50% overlapping. The training datasets were then processed in a DNN developed by our labs.
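To make the tiling step concrete, the following sketch (Python, with OpenCV assumed for video decoding; the helper names are illustrative, not the project's actual code) samples one of every five frames and cuts each into 448×448 blocks with 50% overlap:

```python
import cv2  # OpenCV, assumed here for video decoding


def sample_every_fifth_frame(video_path):
    """Yield one of every five frames, matching the annotation rate used."""
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % 5 == 0:
            yield idx, frame
        idx += 1
    cap.release()


def extract_overlapping_blocks(frame, block=448, overlap=0.5):
    """Cut a frame into block x block tiles with the given fractional overlap."""
    step = int(block * (1 - overlap))  # 224-pixel stride for 50% overlap
    h, w = frame.shape[:2]
    blocks = []
    for y in range(0, max(h - block, 0) + 1, step):
        for x in range(0, max(w - block, 0) + 1, step):
            blocks.append(((x, y), frame[y:y + block, x:x + block]))
    return blocks
```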

The network architecture was based on You Only Look Once (YOLO) as described in Redmon and Farhadi 2017 [44,45]. It consisted of nineteen convolutional layers and five max-pooling layers with a softmax classification output. It used 3×3 filters to extract features and doubled the number of channels after every pooling step, with twelve 3×3 convolutional layers in total. Between the 3×3 convolutional layers, 1×1 filters are used to compress the feature representation; there are seven 1×1 convolutional layers. At the end of the network, three extra 3×3 convolutional layers with 1024 filters each, followed by a final 1×1 convolutional layer with the number of outputs for detection, are added for detection. The network is augmented with a preprocessing component to process the video frames before they are taken for training and detection in real time. A visualization of the process is outlined in Figure 2.
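The layer pattern described above corresponds to a Darknet-19-style backbone with a YOLOv2-style detection head. The following PyTorch sketch is an illustration under that assumption, not the authors' implementation; the anchor and class counts are placeholders:

```python
import torch.nn as nn


def conv(in_ch, out_ch, k):
    # 3x3 convs extract features; 1x1 convs compress channels between them
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, k, padding=k // 2, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.LeakyReLU(0.1),
    )


# Backbone: eighteen convolutions (twelve 3x3, six 1x1) with five max-pooling
# steps, channels doubling after each pool. The Darknet-19 classifier adds a
# nineteenth 1x1 softmax layer, which is replaced here by the detection head.
cfg = [
    (3, 32, 3), "M",
    (32, 64, 3), "M",
    (64, 128, 3), (128, 64, 1), (64, 128, 3), "M",
    (128, 256, 3), (256, 128, 1), (128, 256, 3), "M",
    (256, 512, 3), (512, 256, 1), (256, 512, 3), (512, 256, 1), (256, 512, 3), "M",
    (512, 1024, 3), (1024, 512, 1), (512, 1024, 3), (1024, 512, 1), (512, 1024, 3),
]

layers = []
for item in cfg:
    layers.append(nn.MaxPool2d(2, 2) if item == "M" else conv(*item))

# Detection head: three extra 3x3 convolutions with 1024 filters, then a final
# 1x1 convolution sized for the outputs (anchor and class counts assumed).
num_outputs = 5 * (5 + 3)  # 5 anchors x (4 box coords + objectness + 3 classes)
model = nn.Sequential(
    *layers,
    conv(1024, 1024, 3), conv(1024, 1024, 3), conv(1024, 1024, 3),
    nn.Conv2d(1024, num_outputs, 1),
)
```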

Figure 2. Outline of steps of algorithm training. First, the original image is pre-processed into specific feature areas. Second, those sections are run through the You Only Look Once (YOLO)-like object detection processing (including a series of convolutional and pooling layers). Third, a series of localized detection results is obtained. Finally, these results are post-processed to form the resulting object detection for analysis.

The commonly used mini-batch strategy [44] was used to train the network, with a batch size of 64, a weight decay of 0.0005 and a momentum of 0.9. In this case, the learning rate was set to 0.00001 and the max iteration was set to 45,000, with the threshold used for detection set to 0.6. For training, the frequently used data augmentation techniques, including random crops, rotations, and hue, saturation, and exposure shifts, were used [44]. Once the algorithm was created, it was then tested against the testing dataset in the laboratories. Once validated, the final algorithm was then ready for hosting and deployment in the cloud-based architecture. Results from the algorithm could then be pushed to additional alerting components of the Sharkeye system.
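A minimal training loop using the reported hyperparameters might look as follows; the optimizer choice (SGD) is the conventional one for YOLO-style training and is assumed here, as are the `train_loader` and `yolo_loss` helpers:

```python
import torch

DETECTION_THRESHOLD = 0.6  # confidence cut-off applied when reporting detections


def train(model, train_loader, yolo_loss):
    """Train with the hyperparameters reported in the text.

    `train_loader` yields mini-batches of 64 augmented 448x448 crops with box
    annotations; `yolo_loss` stands in for a YOLO-style localization loss.
    """
    optimizer = torch.optim.SGD(
        model.parameters(), lr=0.00001, momentum=0.9, weight_decay=0.0005
    )
    for iteration, (images, targets) in enumerate(train_loader):
        if iteration >= 45_000:  # max iterations from the text
            break
        loss = yolo_loss(model(images), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```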
2.3. Personal Alerting via an App

We expanded on the previously developed SharkMate app to provide the user interface (UI) to facilitate real-time alerting accessible for beachgoers in the water. For context, the SharkMate app was designed as a proof-of-principle demonstration, developed as a means to aggregate data contributing to a range of risk factors surrounding shark incidents. Analysis of data from the Australian Shark Attack File (ASAF, Taronga Zoo [46]) and the International Shark Attack File (ISAF [47]) and a range of correlation studies, including Fisheries Occasional Publication No. 109, 2012 [48], found statistically significant risk factors contributing to shark incidents, such as proximity to river mouths and recent incidents or sightings. It is important to acknowledge that sharks are wild animals and their movements and actions are not completely predictable. However, historical data does present factors that are significant in contributing to the risk of shark incidents. It is also important to distinguish factors that are purely correlative and are not causal.

The app derives its location-specific data from a range of sources. Climate data was sourced from the Australian Bureau of Meteorology (BOM). Geographic data was used to map and identify the proximity of river mouths, outfalls and seal colonies. This was combined with data such as the presence of mitigation strategies (including the presence of the blimp) and lifeguards/lifesavers. Historical incidents and sightings were sourced from ASAF and ISAF. Real-time sightings of sharks and bait balls from across Australia were scraped from verified Twitter accounts. This data was then aggregated into a centralised database.

The app used a series of cloud-based programming steps to provide predictions. Information from the various sources is updated every hour. Scripts are used to retrieve information such as water temperature and the presence of lifeguards at the beach. These values are then run through a series of conditional statements, each individually scaled to reflect the respective impact they have on risk. The output of these conditions is a likelihood score, a number out of ten, designed to be easily compared to other beaches. The higher the number, the higher the risk. Finally, after the impact of all factors has been processed to generate an aggregate score, this value is sent to the cloud, where it can be accessed from the SharkMate app. A sketch of this scoring step is shown below.

The functionality of the app was later expanded to facilitate real-time alerts from the blimp via Apple Watch. Prior to this work, the SharkMate iPhone app (using iOS) was near completion. However, the SharkMate smartwatch app (using watchOS) needed further development. Swift 4 and Xcode 9 were used to create and program both the iPhone and Apple Watch applications. As smartwatch functions vary from those of smartphones, the smartwatch app could not simply be converted. Regardless, push notifications were developed to alert the user of the presence of a shark, displayed on the Apple Watch app. This was achieved by taking manual and automated triggers of cloud functions that then processed the data and pushed it over the cloud to the watch. With the app and watch validated, the components were complete to integrate the end-to-end alerting system.
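As a rough illustration of the scoring step referenced above, the sketch below aggregates scaled conditional factors into a score out of ten; the factor names, weights, and baseline are illustrative assumptions, not the app's actual values:

```python
def likelihood_score(beach):
    """Aggregate scaled risk conditions into a likelihood score out of ten.

    The factor names, weights, and baseline are illustrative placeholders,
    not the values used by the SharkMate app.
    """
    score = 3.0  # assumed baseline
    if beach["near_river_mouth"]:
        score += 2.0   # proximity to river mouths is a reported risk factor
    if beach["recent_incident_or_sighting"]:
        score += 3.0   # recent incidents or sightings weigh heavily
    if beach["bait_ball_reported"]:
        score += 1.5
    if beach["lifeguards_present"]:
        score -= 1.0   # mitigation measures lower the score
    if beach["blimp_deployed"]:
        score -= 1.0
    return max(0.0, min(10.0, score))  # clamp to 0-10; higher means riskier
```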

2.4. Integration Across the Stack

Integration was required to connect the various pieces of hardware and discrete custom and commercial software to execute real-time notifications of automatically detected sharks in the ocean via the smartwatches. To integrate these features, our team created the interface between:

• The camera mounted to the blimp (or the video feed from a separate drone),
• The receiving ground station,
• The machine learning model (algorithm),
• The smartwatches, and
• The SharkMate app (iOS phone/watchOS).

Several additional pieces of software were required to facilitate integration. Firstly, a program was developed to initiate processing using the algorithm; this software transferred images captured from the live video feed and uploaded them to the cloud for processing. Secondly, once the images were processed by the algorithm, an additional script was created to process the outputs of the machine learning model and send appropriate push notification messages via Amazon Web Services SNS (Simple Notification Service pub/sub messaging) integration with the Apple Push Notification Service. The data flow is outlined in Figure 3, and a sketch of these two steps follows below.

To test the system before deployment, the camera was removed from the blimp, and aerial shark images from the internet and from our training set were placed in front of it to simulate a live feed. This was done on a local beach to take into account environmental conditions that might be observed during field testing. Although simplistic, the test run showed the integrated components worked; alerts were triggered, and notifications were received on the smartwatches.
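A minimal sketch of these two integration scripts is given below, assuming AWS S3 for the storage step (the text specifies only cloud storage with a trigger) and using the SNS publish call for alerting; the bucket and topic names are placeholders:

```python
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")


def upload_frame(frame_path, bucket="sharkeye-frames"):
    """Ground-station step: push a captured frame to cloud storage, where an
    upload trigger hands it to the hosted detection algorithm."""
    s3.upload_file(frame_path, bucket, frame_path)


def push_alert(detections, topic_arn):
    """Messaging step: publish an alert via SNS, bridged to the Apple Push
    Notification Service so subscribed watches and phones are notified."""
    if any(d["label"] == "shark" and d["confidence"] > 0.6 for d in detections):
        sns.publish(
            TopicArn=topic_arn,
            Message="Shark detected - please leave the water",
        )
```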

Figure 3. Diagram of the data flow for the Sharkeye system. (1) The blimp-mounted camera sends video to the ground station. (2) The ground station then automatically converts the videos for upload to the cloud. (3) Once the data is uploaded to the cloud service, it is stored, then triggered for evaluation with the detection algorithm (hosted in a virtual server) and processed through the image recognition model, and the results are prepared in a messaging script. (4) Alerts are then passed from the cloud to smartwatches and smartphones.

2.5. Field Testing and Additional Drone-based Hosting

Once the individual Sharkeye components had been completed and validated, the system was deployed for field testing. Initially, the blimp was inflated, and the camera attached to the underside mounting. The live camera feed was then established to laptops on the beach and monitors in the surf club. As before, to provide a preliminary check of communication in the system, aerial images of sharks (via printouts) were placed in front of the camera, the feed was sent to the cloud to process the detection algorithm, and alerts were monitored on several smartphones and smartwatches.

Once the check was complete, the blimp was launched to a height of approximately 60 m via a tether to the beach. A shark analogue (plywood cutout) was then placed on the sand and successfully detected as a verification to establish the system was working. The analogue was then deployed in the surf zone and detection monitored on screens, on three smartwatches and on various phones with the app installed. Detection rates were recorded in the cloud. Alerts on multiple smartwatches were then tested while in the water near the shore with success. Images of the field testing are seen in Figure 4. For environmental context, the weather conditions on the testing days were average, partially cloudy, while the surf was higher than normal (see Supplementary Files for videos of the conditions and setup in action).

To test the system on a mobile platform, the camera feed was successfully changed to a drone operated by a certified drone pilot (DJI Phantom 4, Opterra Australia). A surfer wearing a smartwatch was deployed, and the drone was then set to observe and send the feed to the cloud as with the blimp. In that case, the surfer received alerts at 150 m from shore. Additionally, while filming, a stingray was observed from the drone. The team took advantage of the situation and was able to detect the native wildlife in the field and with alerts. Finally, we deployed a stand-up paddleboarder (SUP) with a smartwatch and observed detection via the system simultaneously on the beach and in the water.

Figure 4. Images of the testing protocol. Top Left: View from the ground of the blimp deployed. Top Middle: Shark analogue used for testing. Top Right: View of the beach from the blimp. Bottom Left: View from the ground near the Kiama surf club. Bottom Right: Image of the team performing testing at Kiama Surf Beach.

3. Results and Discussion

Overall, our Sharkeye system was successfully deployed during the field trial, demonstrating that automated real-time personal shark detection is possible using existing technologies: aerial imaging, cloud-based machine learning analysis, and alerting via smartwatches.

3.1. Creating the Detection Algorithm

In order to develop the dataset to train the algorithm, the blimp system captured over 100 hours of raw footage over a six-week period. As mentioned, the footage was then manually screened at the laboratory for the presence of marine life. Upon review, multiple sharks were sighted during approximately 30 mins of total time. Other marine animals were far more commonly observed, with stingrays spotted at least 50 times, as well as numerous seals and baitfish schools. It should be noted that during the data collection, the monitored video feed also provided a beneficial vantage point for lifeguards. One rescue was observed from the blimp during the original footage collection. Additionally, lifeguards monitoring the video feed identified sharks in real time on several occasions. On one of these occasions, the blimp operator spotted a shark in close proximity to a bodyboarder and alerted lifeguards, who evacuated swimmers from the water.

Four video sequences, one hour and seven minutes in total, containing sharks, stingrays and surfers were chosen from the video footage collected by the blimp overlooking the beach and ocean. Analysis of this footage led to a dataset of ~10,805 annotated images. The dataset was then divided randomly into three subsets: 6915 images for training, 1730 images for validation, and 2160 images for off-line performance evaluation. This training set was extremely limited, particularly for sharks, being based on <30 mins of footage of sharks seen in the beach area during operation of the blimp. However, even with this limited training data, the system was able to detect various objects from the images set aside for testing with confidence. Using a standard accuracy calculation (the number of correctly detected objects plus the number of correctly detected "no object" instances, divided by the total instances), evaluation in the lab showed an accuracy of 91.67%, 94.52% and 86.27% for sharks, stingrays, and surfers, respectively, using the testing dataset.
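Written out, the accuracy measure used here is:

```latex
\mathrm{accuracy} =
  \frac{\#\,\text{correctly detected objects} \;+\; \#\,\text{correctly detected ``no object'' instances}}
       {\#\,\text{total instances}}
```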

The authors acknowledge the limitations of the algorithm in the context of deployment for shark spotting. The utility of the training and testing datasets was directly linked to the conditions seen during the recording of the subject at hand, whether shark, ray, or surfer. The environment and activity during the sighting, including weather, wave activity, glare (time of day), animal depth, etc., will all have an impact on the ability and accuracy of the resulting algorithm to detect a subject in different conditions. For the footage used for the algorithm training, the conditions would be described as average and mild for periods of the Australian summer in the Illawarra region, ranging from overcast to partially cloudy.

On reflection, developing the detection algorithm for sharks, stingrays and surfers proved more challenging than for conventional objects for a number of reasons. First, these objects are typically underwater, causing significant deformation of their outlines that makes defining borders difficult. Second, the colour of the objects can vary with depth in the water, meaning the algorithm must be able to distinguish an object under multiple conditions. Third, when collecting video from aerial platforms, the objects can be too small to use due to the distance from the camera. Fourth, it can be challenging to collect the amount of footage required to have enough images to effectively train a deep neural network with accuracy and precision (i.e., footage of sharks at Kiama), compounded by the scarcity of sharks available to validate a trained algorithm (i.e., resulting in the need to deploy shark analogues to test). With access to further training data and to sharks in the wild, in a variety of environmental conditions, we anticipate that the accuracy can be greatly expanded and improved.

3.2. Design and Testing the Smartwatch User Interface

The user interface and design of the watch app was identified as a crucial element of the Sharkeye system. Users in the water have limited ability to control the watch due to moisture on the glass when in the surf. Because of this, the UI and experience were designed to be accessible and responsive in an aquatic environment. The smartwatch app was set up to receive the message that a shark has been spotted and that the user should exit the water. This was presented in the form of attention-grabbing actions across three senses (sight, sound, and feel) performed by the watch. Besides a visual notification, the speakers in the watch also produce a sound. Vibration was also utilised to further ensure that someone wearing the watch was alerted to the potential danger. Additional alerts were simultaneously sent to the smartphone (examples of alerts are seen in Figure 5).

Figure 5. Examples of user interface (UI) images of the SharkMate app alerts on watch and on smartphone. The measure 'confidence danger' was a gauge of how confident the detection of the Deep Learning neural network was for the detection event observed.

Overall connectivity of new wearables was an important factor in app development. The Apple Watch used in the demonstration was one of the only smartwatches at the time that was waterproof and also boasted GPS and cellular capabilities; this ultimately allowed for use in the surf and the long-distance connectivity required for the system to receive alerts from the cloud. However, there were difficulties in translating the iPhone app to the Apple Watch app due to the network limitations of the wearable; communication with the database is sometimes limited when the user is not connected to an internet source or SIM card. The user interface is also completely different on the two devices due to their different dimensions, which required additional development in creating the UI. Certainly, the potential of the system is not limited to Apple products, and other wearables and operating systems could be used in the future.

3.3. Real-Time Aerial Shark Detection and Alerting

During the deployment trial period, 373 detection events were recorded. A detection event was defined as an automatic alert from the system onto the smartwatch and/or smartphone, which was then validated by a researcher against the truth/falsity of the AI detection. These alerts occurred with a latency of less than 1 min from image collection to notification. The detection events could be broken down into detected sharks, rays, or surfers (as seen in Figure 6). To analyse the events, we examined whether there was an actual shark analogue, ray or surfer being imaged and also detected by the algorithm. The output of each detection event was manually annotated as falling into one of the categories (shark analogue, ray, surfer) as the ground truth. The data was then run through a classifier, and we compared it against the ground truth for both raw counts and for binary presence/absence classification; a sketch of the latter follows below. This arrangement was successfully able to detect shark analogues, surfers, and rays with an accuracy of 68.7% when considering simple absence of an animal versus presence of an animal.

While a suitable accuracy for a demonstration, our testing showed ample room for improvement. The system had trouble distinguishing between sharks, rays and humans with perfect confidence and most often misclassified objects as rays. Further, at times, artefacts causing false positives were observed; the system took shadows of waves, or shadows from the blimp or drone itself, for animals. However, from the context of a proof of principle, once trained, the algorithm performed sufficiently well in a different situation than it had been taught in and effectively picked up objects darker than their surroundings. The reduction in accuracy from the training dataset to the field data could be due in part to the differing conditions seen between the footage collection periods. The demonstration took place in autumn, five months after the training data was collected in the summer. Further, while the weather was somewhat comparable, mild and partially sunny, the swell was notably bigger during the testing period. Obviously, with further training, especially using more footage from on-beach locations in varied conditions, the algorithm would be improved in both the ability to pick up real objects and the ability to distinguish between them. Further work in the post-image-capture analysis could also improve the system significantly, particularly methodologies that evaluate the detection in context. For instance, techniques like those used by Hodgson et al. [49] and replicated by other groups [19], where the observations are related to parameters like sea visibility, glare and glitter, and sea state, would not only help to understand the accuracy, but could also potentially be used to further improve the detection algorithm.

In terms of differences between aerial platforms, there was little difference in image quality between the blimp-mounted camera and the quadcopter drone camera. The ground station used over the last few seasons at the beach had an effect on the image quality and had some reception issues, highlighting the need to consider a casing for this equipment to prevent corrosion from the sea air. The area of coverage is dependent on the height of the blimp (restricted to 120 m or 400 ft). We deployed the blimp at 70 m to provide complete coverage of a beach that is 270 m in length to a distance of 250 m offshore. This ensured both the flagged swimming area and the surf banks were covered by the blimp. On longer beaches, the blimp may be deployed higher (to a limit of 120 m) to achieve a larger area of coverage. However, this may reduce the accuracy of observers/algorithms, so the area of coverage and accuracy would need to be assessed and optimized.
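For clarity, the binary presence/absence scoring behind the 68.7% figure can be sketched as follows (the event record structure is a hypothetical illustration, not the project's logging format):

```python
def presence_accuracy(events):
    """Score detection events on simple presence versus absence of an animal.

    `events` is a hypothetical list of records pairing the algorithm's output
    with the researcher-annotated ground truth for each detection event.
    """
    correct = 0
    for e in events:
        predicted_presence = e["detected_class"] is not None  # anything flagged
        actual_presence = e["ground_truth"] is not None       # analogue/ray/surfer
        if predicted_presence == actual_presence:
            correct += 1
    return correct / len(events)  # the trial reported 68.7% on this measure
```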

Figure 6. Top: Detection of our shark analogue. Middle: Though a fortuitous encounter with a ray, we were able to test the system observing and detecting a real animal during the trial. Bottom: Several surfers were out enjoying the waves, and we used the opportunity to test the algorithm with success. Additionally, a paddleboarder (SUP) was set out to test the algorithm. Inserts: Sample screenshots of alerts displayed on the smartwatches.


3.4. Platform for Additional Lifesaving Technology

In addition to the notification system for sharks and rays alerting surfers and swimmers, our team was able to use the smartwatch app to provide a secondary emergency notification system. In an emergency, surfers and swimmers who need help could use the watch app to send a notification to lifesavers with the fact that they need help as well as the location reported by the watch app. To achieve this, we added a simple web API that the smartwatch app could call, which would then pass on the details via email; a sketch is shown below. We integrated this functionality into the SharkMate app on the smartwatch by adding a map and alert button to an alert section in the watch app UI to send out that alert using the web API. The alerting system is shown in Figure 7, followed by the response email as displayed on a lifesaver's watch. To demonstrate the system, two tests were carried out, one with a surfer in trouble and another with a swimmer in trouble, ending with a mock rescue. The demonstration proved the viability of using the platform for in-water emergencies, including potential use during a shark interaction.
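A minimal sketch of such an SOS endpoint is given below (Python/Flask assumed; the route, addresses, and mail transport are placeholders, as the actual service details are not specified in the text):

```python
from email.message import EmailMessage
import smtplib

from flask import Flask, request

app = Flask(__name__)


@app.route("/sos", methods=["POST"])
def sos():
    """Receive an SOS from the watch app and pass the details on via email."""
    data = request.get_json()  # expected to carry the watch-reported location
    msg = EmailMessage()
    msg["Subject"] = "SOS: beachgoer requesting help"
    msg["From"] = "[email protected]"        # placeholder address
    msg["To"] = "[email protected]"       # placeholder address
    msg.set_content(f"Location: {data['lat']}, {data['lon']}")
    with smtplib.SMTP("localhost") as server:
        server.send_message(msg)
    return {"status": "sent"}, 200
```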

Figure 7. Top left: Screenshots of the watch emergency notification system from the perspective of a person in the water. Top right: Image of team members, Robert (Surf Life Saving Australia—Bellambi) and Kye (Professional Lifeguard, Kiama City Beach), in testing with the notification on the lifesaver's watch when the emergency system was activated. Bottom left: Picture of a surfer in the water activating the emergency SOS. Bottom right: Demonstration of a swimmer in trouble; a team member was deployed into the surf, the SOS was activated, and a mock rescue was completed to bring the swimmer back to shore.

3.5. Overcoming Deadzones—Creating an On-Demand IoT Network

The emergence of the Internet of Things (IoT) has led to new types of wireless data communication, offering long-range, low-power and low-bandwidth connectivity. These new technologies use non-licensed radio frequencies to communicate. With these new protocols, devices can last for years on a single battery while communicating from distances of over 100 km. A major open source initiative in this space is The Things Network, which offers a fully distributed IoT data infrastructure based on LoRa (short for Long Range) technology [50]. The University of Wollongong and the SMART Infrastructure Facility have deployed multiple gateways offering free access to the Things Network infrastructure in the Illawarra area [51]. An advantage of this infrastructure is that the network can be easily expanded by connecting new gateways, increasing the coverage of the radio network. Ultimately, such networks could be used to power and collect data from sensors and other instruments across local beaches.

network can be easily expanded by connecting new gateways, increasing the coverage of the radio network. Ultimately, such networks could be used to power and collect data from sensors and other instruments across local beaches. DronesThe2020 ,use4, 18 of the blimp allowed us to show the possibility of extending the network even over13 of the 17 ocean by integrating a nano-gateway. A nano-gateway was designed by the SMART IoT hub, demodulatescomposed of LoRaa LoRa signals Gateway received module from (RAK831) sensors, coupled while the to Raspberry a Raspberry pi zeropi zero. uses The 4G LoRa connectivity Gateway to pushmodule data demodulates to the cloud. LoRa This nano-gatewaysignals received is powered from sensors, by battery, while which the oRaspberryffers the possibility pi zero uses to easily 4G expandconnectivity the network to push todata isolated to the areas cloud. at aThis very nano low-gateway cost. Embedded is powered in the by blimp,battery, the which gateway offers off theers largepossibility area radio to easily coverage. expand To the map network the signal, to isolated we used areas a mobile at a very device low that cost. sends Embedded messages in containingthe blimp, GPSthe gateway coordinates offers through large area the LoRaradio network.coverage. When To map the the gateway signal, receives we used those a mobile signals, device meta-data that sends can bemessages extracted containing from the GPS message, coordinates displaying through the position the LoRa of network. the sensor When on a mapthe gateway and the strengthreceives ofthose the signal.signals, Positionsmeta-data are can displayed be extracted live from on the the website message https:, displaying//ttnmapper.org the position/, which of the is sensor crowd-sourced on a map mapand the coverage strength of theof Thingsthe signal. Network Positions [52]. are The di mobilesplayed device live wason the taken website into the https://ttnmapper.org/, sea by a lifeguard on awhich paddleboard is crowd-sourced to test the map coverage coverage of theof the nano-gateway. Things Network As shown[52]. The in mobile the experiments device was in taken Figure into8, the sea signal by reporteda lifeguard locations on a paddleboard even at a distance to test th ofe coverage 1 km, which of the was nano-gateway. the extent of As the shown experiment. in the Forexperiments beach surveillance, in Figure 8 the, the demonstration signal reported shows locations potential even toat quicklya distance deploy of 1 km, an autonomouswhich was the solution extent of the experiment. For beach surveillance, the demonstration shows potential to quickly deploy an where classical internet connection is not available. autonomous solution where classical internet connection is not available.
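The paper does not specify the uplink format used by the GPS beacon. As an illustration only, the sketch below packs a fix into a compact binary payload of our own design (the 10-byte layout, field scaling, and function names are all assumptions), in the fixed-point style typically used to keep LoRa airtime low.

```python
# Minimal sketch of a compact GPS payload for a LoRa uplink. The 10-byte
# layout (int32 lat*1e7, int32 lon*1e7, uint8 link quality, uint8 battery %)
# is our own illustrative design, not the beacon's actual format.
import struct

def encode_position(lat: float, lon: float, quality: int, battery: int) -> bytes:
    """Pack a GPS fix into 10 bytes, keeping on-air time to a minimum."""
    return struct.pack(">iiBB", round(lat * 1e7), round(lon * 1e7), quality, battery)

def decode_position(payload: bytes) -> dict:
    """Unpack the 10-byte frame back into human-readable fields."""
    lat_e7, lon_e7, quality, battery = struct.unpack(">iiBB", payload)
    return {"lat": lat_e7 / 1e7, "lon": lon_e7 / 1e7,
            "quality": quality, "battery": battery}

# Example: a beacon fix near Kiama, NSW.
frame = encode_position(-34.6683, 150.8553, quality=42, battery=87)
assert len(frame) == 10
print(decode_position(frame))  # {'lat': -34.6683, 'lon': 150.8553, ...}
```

Payloads of this size matter on LoRa, where small frames minimize duty-cycle use and allow years of operation on a single battery.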

Figure 8. Top left: IoT tracking on The Things Network. Top middle: The IoT gateway packaged with a battery and secured to the blimp. Top right: Live view of the gateway launched on the blimp and connected to the network during testing at Kiama. Bottom: Landscape view of the tracking area. The blimp can be seen at the upper left of the picture (arrowed); the small dot in the middle is a lifeguard on a paddleboard with the beacon (arrowed). Tracking was achieved out to a shark sonar buoy ~1 km offshore.

3.6. Considerations for Future Deployment

The system has the potential to improve on the more traditional model of aerial surveillance, which requires operators to analyse images and, more critically, requires lifeguards to audibly alert beachgoers. Conceptualizing future deployment to a large number of beaches, we would recommend streamlining the hardware on the beach with a view to minimizing setup requirements and cost. Critically, we would explore where it makes sense to do the processing "at the edge" (on the beach), sending only essential data or notifications via the cloud, given bandwidth constraints (which may vary from beach to beach) as well as operating costs; a sketch of this filtering idea is given below.

To expand the algorithm to other beaches and scenarios, more video footage would need to be collected under a wide range of weather and beach conditions and used to train the algorithm. Further research on the network architecture is needed so that it can handle varied conditions and improve detection accuracy. Generally, the network and training procedure can be easily and efficiently scaled to multiple objects; we demonstrated simultaneous detection of sharks, rays, and surfers. Scaling up would again require sufficient video footage to train the algorithm, and scaling to multiple sites would require more cloud processing resources. The system is also algorithm "agnostic", i.e., capable of hosting new independent algorithms developed by external groups. With regard to the coverage of the Sharkeye system and the alerting potential of the SharkMate app, future work could expand the set of beaches covered by the likelihood score (currently approximately 100 beaches, predominantly in NSW, Australia) and include further research into specific risk factors.
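To make the edge-processing recommendation concrete, here is a minimal sketch of the kind of filter that could run on the beach, forwarding compact detection events rather than raw video; the class names, confidence threshold, and event format are our illustrative assumptions, not part of the deployed system.

```python
# Minimal sketch of edge-side filtering: instead of streaming raw video from
# the beach, only compact, high-confidence detection events are forwarded to
# the cloud. Class names, threshold, and event format are assumptions.
import json
import time

ALERT_CLASSES = {"shark"}    # classes that justify a cloud notification
CONFIDENCE_THRESHOLD = 0.6   # would be tuned per beach and conditions

def filter_detections(detections):
    """Keep only high-confidence detections of alert-worthy classes."""
    return [d for d in detections
            if d["label"] in ALERT_CLASSES and d["score"] >= CONFIDENCE_THRESHOLD]

def to_notification(detections):
    """Serialize a small JSON event, or return None when nothing qualifies."""
    events = filter_detections(detections)
    if not events:
        return None
    return json.dumps({"time": time.time(),
                       "events": [{"label": d["label"], "score": round(d["score"], 2)}
                                  for d in events]})

# Example detector output for one frame: the surfer is suppressed, and the
# shark detection becomes a compact notification.
frame_detections = [{"label": "shark", "score": 0.82},
                    {"label": "surfer", "score": 0.91}]
print(to_notification(frame_detections))
```

Shipping events of roughly this size, rather than continuous video, would reduce the per-beach bandwidth requirement by orders of magnitude.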
As a proof of principle, in its current form, the SharkMate app has both provided a platform for users to develop a greater understanding of the local risk factors contributing to shark incidents and acted as a means to broadcast real-time shark sightings from the blimp. Expansion of the work may see the introduction of machine learning into the risk models, as it has proved successful in the other elements of this project. Further, as wearable technology continues to improve and its capabilities extend, there is potential to enhance communication between lifeguards and beachgoers through a range of mediums other than audio/text alerts. Improvements in connectivity could allow lifeguards to communicate at a significant distance through real-time video and audio on wearables, a possible enhancement for the SharkMate app. Such improvements extend beyond shark incident mitigation into other aspects of beach safety. The alerting aspect of the Sharkeye platform could also integrate into other existing or future apps used for shark monitoring. The authors would like to add a word of caution: users should never depend solely on an app, SharkMate or otherwise, as the basis for deciding on the risk of entering the water. Technology will never replace common sense, particularly in areas that sharks are known to frequent and especially when detection conditions are poor. Although tested and trained using footage from a novel blimp-based aerial platform, the system is cross-compatible with any video-based aerial surveillance method and is potentially the only automated system proven on both blimps and drones. Given the advantages and shortcomings of the various surveillance methods for shark safety [14,36], it is likely that a mosaic approach will be most effective, whereby blimps and balloons, drones, and traditional aircraft are used alongside each other to provide appropriate coverage. Therefore, by designing the system to be compatible with any method of image capture, we allow for increased uptake potential and a broader pool of applications.

4. Conclusions

Through limited training (via self-collected footage), we were able to develop a detection algorithm that could identify several distinct objects (sharks, rays, surfers, paddleboarders) and report to various smart devices. However, as described in the discussion, with more training and optimization, the algorithm's prediction accuracy can be significantly improved. There is also the possibility of expanding to a host of other beach- and water-based objects. With the basis of the system in place, the algorithm could be extended to a variety of relevant aspects of shark management, including rapid identification of various species, behavioral patterns of specific shark species, and interspecies relationships (e.g., following bait balls), potentially offering new ways of identifying (and ultimately preventing) what triggers attacks on humans.

The Sharkeye system brings Industry 4.0 to shark spotting and sits at the flourishing nexus of drones, sensors, the Internet of Things (IoT), AI, and edge and cloud computing. However, our system has the potential to go well beyond looking for specific animals; it can be used as an effective tool for identifying other threats to assist in beach management. Further data collection and algorithm development could enable a system that monitors beach usage and safety (such as swimmers in trouble), identifies rips and other conditions and warns beachgoers, and monitors pollution and water and air quality, all while simultaneously offering a platform for real-time alerting of the presence of sharks. We believe widespread deployment is truly feasible if the platform provides a range of enhancements for public safety and integrates as a tool for lifeguards/lifesavers, councils, and other stakeholders alike. This strategy would reduce investment risk and provide greater returns than resourcing various discrete, piece-wise, shark-specific technologies at individual beaches. We heavily caution that although automation shows promise for providing safer beaches, it should be seen as a tool for lifeguards/lifesavers rather than as a replacement. If a shark is sighted from the blimp, professionals must be notified via smartwatch alerts to verify and respond. We would also like to see detection algorithms tested in continuous and varied real-life conditions to verify their reliability; this is essential to ensure that the public has confidence in an automated system working alongside human observers.

Supplementary Materials: The following are available online at http://www.mdpi.com/2504-446X/4/2/18/s1, Video S1: Kiama Field Testing Overhead Perspective; Video S2: Kiama Field Testing Sea View Perspective; Video S3: Kiama Field Testing Side Perspective; Video S4: Kiama Field Testing Top Aerial Perspective.

Author Contributions: Conceptualization, R.G., K.A., M.J.B., and W.L.; methodology, K.A., M.J.B., W.L., and R.G.; software, M.J.B., S.A., J.B., and W.L.; validation, R.G., K.A., M.J.B., and S.A.; formal analysis, W.L., M.J.B., and J.B.; investigation, R.G., K.A., M.J.B., S.A., and W.L.; resources, K.A., M.J.B., J.B., and R.G.; data curation, M.J.B., S.A., and W.L.; writing–original draft preparation, R.G.; writing–review and editing, R.G., K.A., J.B., A.R.D., S.A., and W.L.; visualization, R.G.; supervision, A.R.D. and W.L.; project administration, R.G.; funding acquisition, R.G. All authors have read and agreed to the published version of the manuscript.

Funding: This research was funded by the NSW Department of Primary Industries (DPI) Shark Management Strategy, NSW, Australia.

Acknowledgments: This project has drawn heavily on expertise, techniques, and equipment developed as a direct result of financial and in-kind support from a variety of organisations. Kye Adams and Andy Davis wish to acknowledge financial and in-kind support from Kiama Municipal Council, the Department of Primary Industries, the Holsworth Research Endowment, Skipp Surfboards, the Save Our Seas Foundation, and the Global Challenges Program at the University of Wollongong. For assistance in digitizing many thousands of images to develop training datasets for previous algorithms, we thank Douglas Reeves and David Ruiz Garcia. We would also like to thank Douglas Reeves and Allison Broad for assistance in field testing. Sam Aubin would like to acknowledge the support of David Powter, Ann Sudmalis MP, Hon Arthur Sinodinos AO, Matt Kean MP, Gareth Ward MP, Gavin McClure, Kate McNiven, Jean Burton, the Kiama Business Chamber, and Kiama Council. Wanqing Li wishes to thank the visiting PhD student Fei (Sheldon) Li, from Northwestern Polytechnical University, China, for his direct contribution to the project. Johan Barthelemy would like to recognize the support for the Digital Living Lab, and to thank Benoit Passot and Nicolas Veerstavel for assistance with the IoT gateway deployment, development, and related testing during the project. Matthew Berryman would like to recognize the support of the Amazon Web Services Cloud Credits for Research program. Robert Gorkin would like to thank Rylan Loemker from Opterra for assistance with drone testing and Kihan Garcia from Rise Above for drone expertise. Robert would also like to thank the Bellambi SLSC for training exposure during the project.

Conflicts of Interest: The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Yearly Worldwide Shark Attack Summary. Available online: https://www.floridamuseum.ufl.edu/shark-attacks/yearly-worldwide-summary/ (accessed on 8 March 2020).
2. Simmons, P.; Mehmet, M.I. Shark management strategy policy considerations: Community preferences, reasoning and speculations. Mar. Policy 2018, 96, 111–119. [CrossRef]
3. NSW. Shark Meshing (Bather Protection) Program; NSW Department of Primary Industries: Orange, Australia, 2015; p. 4.
4. Davis, A.R.; Broad, A. Tightening the net: In search of alternatives to destructive fishing and meshing programs. Available online: http://drainmag.com/junk-ocean/ (accessed on 9 March 2020).
5. Gibbs, L.; Warren, A. Transforming shark hazard policy: Learning from ocean-users and shark encounter in Western Australia. Mar. Policy 2015, 58, 116–124. [CrossRef]
6. Mitchell, H. Report on November Meeting: Australian Aerial Patrol; Illawarra Historical Society Inc.: Wollongong, Australia, 1997; p. 3.
7. Surf Life Saving Western Australia Shark Safety. Available online: https://www.mybeach.com.au/safety-rescue-services/beach-safety/shark-safety/ (accessed on 8 March 2020).
8. Robbins, W.D.; Peddemors, V.M.; Kennelly, S.J. Assessment of Shark Sighting Rates by Aerial Beach Patrols; Cronulla Fisheries Research Centre of Excellence: Cronulla, Australia, 2020; p. 40.

9. Leadbitter, D. Submission to the Senate Inquiry into Shark Mitigation and Deterrent Measures; NSW: Orange, Australia, 2017.
10. Lowe, M.K.; Adnan, F.A.F.; Hamylton, S.M.; Carvalho, R.C.; Woodroffe, C.D. Assessing Reef-Island Shoreline Change Using UAV-Derived Orthomosaics and Digital Surface Models. Drones 2019, 3, 44. [CrossRef]
11. Casella, E.; Drechsel, J.; Winter, C.; Benninghoff, M.; Rovere, A. Accuracy of sand beach topography surveying by drones and photogrammetry. Geo-Mar. Lett. 2020, 1–14. [CrossRef]
12. Jiménez López, J.; Mulero-Pázmány, M. Drones for Conservation in Protected Areas: Present and Future. Drones 2019, 3, 10. [CrossRef]
13. Harris, J.M.; Nelson, J.A.; Rieucau, G.; Broussard, W.P. Use of Drones in Fishery Science. Trans. Am. Fish. Soc. 2019, 148, 687–697. [CrossRef]
14. Colefax, A.P.; Butcher, P.A.; Kelaher, B.P. The potential for unmanned aerial vehicles (UAVs) to conduct marine fauna surveys in place of manned aircraft. ICES J. Mar. Sci. 2018, 75, 1–8. [CrossRef]
15. Kelaher, B.P.; Colefax, A.P.; Tagliafico, A.; Bishop, M.J.; Giles, A.; Butcher, P.A. Assessing variation in assemblages of large marine fauna off ocean beaches using drones. Mar. Freshw. Res. 2019, 71, 68–77. [CrossRef]
16. Raoult, V.; Gaston, T.F. Rapid biomass and size-frequency estimates of edible jellyfish populations using drones. Fish. Res. 2018, 207, 160–164. [CrossRef]
17. Schofield, G.; Esteban, N.; Katselidis, K.A.; Hays, G.C. Drones for research on sea turtles and other marine vertebrates–A review. Biol. Conserv. 2019, 238, 108214. [CrossRef]
18. Rees, A.; Avens, L.; Ballorain, K.; Bevan, E.; Broderick, A.; Carthy, R.; Christianen, M.; Duclos, G.; Heithaus, M.; Johnston, D.; et al. The potential of unmanned aerial systems for sea turtle research and conservation: A review and future directions. Endanger. Species Res. 2018, 35, 81–100. [CrossRef]
19. Schofield, G.; Katselidis, K.A.; Lilley, M.K.S.; Reina, R.D.; Hays, G.C. Detecting elusive aspects of wildlife ecology using drones: New insights on the mating dynamics and operational sex ratios of sea turtles. Funct. Ecol. 2017, 31, 2310–2319. [CrossRef]
20. Bevan, E.; Whiting, S.; Tucker, T.; Guinea, M.; Raith, A.; Douglas, R. Measuring behavioral responses of sea turtles, saltwater crocodiles, and crested terns to drone disturbance to define ethical operating thresholds. PLoS ONE 2018, 13. [CrossRef] [PubMed]
21. Fettermann, T.; Fiori, L.; Bader, M.; Doshi, A.; Breen, D.; Stockin, K.A.; Bollard, B. Behaviour reactions of bottlenose dolphins (Tursiops truncatus) to multirotor Unmanned Aerial Vehicles (UAVs). Sci. Rep. 2019, 9, 1–9. [CrossRef] [PubMed]
22. Fiori, L.; Martinez, E.; Bader, M.K.-F.; Orams, M.B.; Bollard, B. Insights into the use of an unmanned aerial vehicle (UAV) to investigate the behavior of humpback whales (Megaptera novaeangliae) in Vava'u, Kingdom of Tonga. Mar. Mammal Sci. 2020, 36, 209–223. [CrossRef]
23. Horton, T.W.; Hauser, N.; Cassel, S.; Klaus, K.F.; Fettermann, T.; Key, N. Doctor Drone: Non-invasive Measurement of Humpback Whale Vital Signs Using Unoccupied Aerial System Infrared Thermography. Front. Mar. Sci. 2019, 6. [CrossRef]
24. Subhan, B.; Arafat, D.; Santoso, P.; Pahlevi, K.; Prabowo, B.; Taufik, M.; Kusumo, B.S.; Awak, K.; Khaerudi, D.; Ohoiulun, H.; et al. Development of observing dolphin population method using Small Vertical Take-off and Landing (VTOL) Unmanned Aerial System (AUV). IOP Conf. Ser. Earth Environ. Sci. 2019, 278, 012074. [CrossRef]
25. Landeo-Yauri, S.S.; Ramos, E.A.; Castelblanco-Martínez, D.N.; Niño-Torres, C.A.; Searle, L. Using small drones to photo-identify Antillean manatees: A novel method for monitoring an endangered marine mammal in the Caribbean Sea. Endanger. Species Res. 2020, 41, 79–90. [CrossRef]
26. Tagliafico, A.; Butcher, P.A.; Colefax, A.P.; Clark, G.F.; Kelaher, B.P. Variation in cownose ray Rhinoptera neglecta abundance and group size on the central east coast of Australia. J. Fish Biol. 2020, 96, 427–433. [CrossRef]
27. Butcher, P.A.; Piddocke, T.P.; Colefax, A.P.; Hoade, B.; Peddemors, V.M.; Borg, L.; Cullis, B.R. Beach safety: Can drones provide a platform for sighting sharks? Wildl. Res. 2020, 46, 701–712. [CrossRef]
28. Raoult, V.; Tosetto, L.; Williamson, J.E. Drone-Based High-Resolution Tracking of Aquatic Vertebrates. Drones 2018, 2, 37. [CrossRef]
29. Benavides, M.T.; Fodrie, F.J.; Johnston, D.W. Shark detection probability from aerial drone surveys within a temperate estuary. J. Unmanned Veh. Syst. 2019, 8, 44–56. [CrossRef]

30. Hensel, E.; Wenclawski, S.; Layman, C.A. Using a small, consumer-grade drone to identify and count marine megafauna in shallow habitats. Lat. Am. J. Aquat. Res. 2018, 46, 1025–1033. [CrossRef]
31. Wich, S. Chapter 7: Drones and Conservation. In Drones and Aerial Observation: New Technologies for Property Rights, Human Rights, and Global Development: A Primer; New America: Washington, DC, USA, 2015.
32. UAVs in Surf Life Saving. Available online: https://www.surflifesaving.com.au/uavs-surf-life-saving (accessed on 9 March 2020).
33. Colefax, A.P.; Butcher, P.A.; Pagendam, D.E.; Kelaher, B.P. Reliability of marine faunal detections in drone-based monitoring. Ocean Coast. Manag. 2019, 174, 108–115. [CrossRef]
34. Verfuss, U.K.; Aniceto, A.S.; Harris, D.V.; Gillespie, D.; Fielding, S.; Jiménez, G.; Johnston, P.; Sinclair, R.R.; Sivertsen, A.; Solbø, S.A.; et al. A review of unmanned vehicles for the detection and monitoring of marine fauna. Mar. Pollut. Bull. 2019, 140, 17–29. [CrossRef]
35. Bryson, M.; Williams, S. Review of Unmanned Aerial Systems (UAS) for Marine Surveys; Australian Centre for Field Robotics, University of Sydney: Sydney, Australia, 2015.
36. Adams, K.; Broad, A.; Ruiz-Garcia, D.; Davis, A.R. Continuous wildlife monitoring using blimps as an aerial platform: A case study observing marine megafauna. Aust. Zool., in press.
37. RPAS Drones. Available online: https://www.casa.gov.au/drones (accessed on 8 March 2020).
38. Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117. [CrossRef]
39. Van Gemert, J.C.; Verschoor, C.R.; Mettes, P.; Epema, K.; Koh, L.P.; Wich, S. Nature Conservation Drones for Automatic Localization and Counting of Animals. In Proceedings of the Computer Vision–ECCV 2014 Workshops; Agapito, L., Bronstein, M.M., Rother, C., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 255–270.
40. Gonzalez, L.F.; Montes, G.A.; Puig, E.; Johnson, S.; Mengersen, K.; Gaston, K.J. Unmanned Aerial Vehicles (UAVs) and Artificial Intelligence Revolutionizing Wildlife Monitoring and Conservation. Sensors 2016, 16, 97. [CrossRef]
41. Shrivakshan, G.T.; Chandrasekar, D.C. A Comparison of various Edge Detection Techniques used in Image Processing. Int. J. Comput. Sci. Issues 2012, 9, 269.
42. Sharma, N.; Scully-Power, P.; Blumenstein, M. Shark Detection from Aerial Imagery Using Region-Based CNN, a Study. In Proceedings of the AI 2018: Advances in Artificial Intelligence; Mitrovic, T., Xue, B., Li, X., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 224–236.
43. Surf - Beach in Kiama, Kiama NSW. Available online: http://beachsafe.org.au/beach/nsw/kiama/kiama/surf (accessed on 11 April 2020).
44. Redmon, J.; Farhadi, A. YOLO9000: Better, Faster, Stronger. arXiv 2016, arXiv:1612.08242.
45. Redmon, J.; Farhadi, A. YOLOv3: An Incremental Improvement. arXiv 2018, arXiv:1804.02767.
46. Australian Shark Attack File. Taronga Conservation Society Australia. Available online: https://taronga.org.au/conservation-and-science/australian-shark-attack-file#publishingthisinformation (accessed on 15 March 2020).
47. International Shark Attack File. Florida Museum of Natural History. Available online: https://www.floridamuseum.ufl.edu/shark-attacks/ (accessed on 15 March 2020).
48. Fisheries Occasional Publications. Available online: http://www.fish.wa.gov.au/About-Us/Publications/Pages/Fisheries-Occasional-Publications.aspx (accessed on 15 March 2020).
49. Hodgson, A.; Kelly, N.; Peel, D. Unmanned Aerial Vehicles (UAVs) for Surveying Marine Fauna: A Dugong Case Study. PLoS ONE 2013, 8, e79556. [CrossRef] [PubMed]
50. The Things Network. Available online: https://www.thethingsnetwork.org/ (accessed on 15 March 2020).
51. About - Digital Living Lab. Available online: http://digitallivinglab.uow.edu.au/about/ (accessed on 15 March 2020).
52. TTN Mapper. Available online: https://ttnmapper.org/ (accessed on 15 March 2020).

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).