applied sciences

Article

Autonomous Robotics for Identification and Management of Invasive Aquatic Plant Species

Maharshi Patel 1, Shaphan Jernigan 1, Rob Richardson 2, Scott Ferguson 1 and Gregory Buckner 1,*

1 Mechanical and Aerospace Engineering, North Carolina State University, Raleigh, NC 27695, USA; [email protected] (M.P.); [email protected] (S.J.); [email protected] (S.F.)
2 Crop and Soil Sciences, North Carolina State University, Raleigh, NC 27695, USA; [email protected]
* Correspondence: [email protected]

 Received: 29 April 2019; Accepted: 6 June 2019; Published: 13 June 2019 

Featured Application: Development of a small fleet of fully autonomous boats capable of subsurface invasive aquatic plant identification and treatment, consequently minimizing manual labor with more efficient, safe, and timely weed management.

Abstract: Invasive aquatic plant species can expand rapidly throughout water bodies and cause severely adverse economic and ecological impacts. While mechanical, chemical, and biological methods exist for the identification and treatment of these invasive species, they are manually intensive, inefficient, costly, and can cause collateral ecological damage. To address deficiencies in aquatic weed management, this paper details the development of a small fleet of fully autonomous boats capable of subsurface hydroacoustic imaging (to scan aquatic vegetation), machine learning (for automated weed identification), and herbicide deployment (for vegetation control). These capabilities aim to minimize manual labor and provide more efficient, safe (reduced chemical exposure to personnel), and timely weed management. Geotagged hydroacoustic imagery of three aquatic plant varieties (Hydrilla, Cabomba, and Coontail) was collected and used to create a software pipeline for subsurface aquatic weed classification and distribution mapping. Employing deep learning, the novel software achieved a classification accuracy of 99.06% after training.

Keywords: autonomous vehicles; robotics; machine learning; deep learning; image preprocessing; hydroacoustic sensing

1. Introduction

1.1. Impact and Treatment of Aquatic Weeds

While native aquatic plants are essential components of aquatic ecosystems, non-native invasive species can expand rapidly throughout water bodies and cause severe economic and ecological impacts (Figure 1) [1,2]. Adverse economic impacts include: impairing recreational activities (fishing, swimming, and boating); flooding caused by reduced drainage; hindering boat navigation; blocking water intakes for hydroelectric turbines, drinking water, and irrigation; and reducing the potability of fresh water due to foul taste and odor [1,3–5]. Indirect economic effects include reduced property values and reduced revenue from impacted businesses [2,6]. Aquatic flora consume dissolved oxygen at night; thus, excessive weed growth can deprive fish and other aquatic animals of this vital resource, leading to their death. Other ecological effects include overpopulation of small fish, which find shelter in the plants, and the creation of breeding habitats for some mosquito species [3]. Of the many invasive aquatic plant species, Hydrilla verticillata, commonly known by its genus Hydrilla, likely causes the

Appl. Sci. 2019, 9, 2410; doi:10.3390/app9122410 www.mdpi.com/journal/applsci

most economic damage in the United States [1]. In Florida alone, state and federal entities spend millions of dollars annually towards Hydrilla management ($14.5 million in 1997 [1] and $66 million total from 2008 to 2015 [7]). The economic impact on a single water body is significant; a detailed study on a single Florida lake estimated potential Hydrilla-related losses of $40 million per year (property values, agricultural support, flood control, recreation, etc.) [6].


Figure 1. Watermeal infestation in a central North Carolina pond: (a) aerial view; (b) ground view.

Given these adverse impacts, management of invasive aquatic vegetation is of utmost concern. While mechanical control (plucking or raking the plants) is the most direct approach, it can be time-consuming and costly (estimated cost of $1000 per acre in 1996) [8,9]. Biological control methods (e.g., fish which consume large quantities of aquatic weeds) are routinely employed but can be counter-productive. One example is the grass carp, which can consume and control vegetation [9]; however, their body waste compromises water quality, killing sensitive plant and animal species, and fertilizing additional weed growth [3,10]. Biological approaches also introduce the risk of invasive animal species [11]. Seven species of carp native to Asia, introduced to control invasive plant species, have recently spread up the Mississippi River system and have crowded out native fish populations.
While not without its drawbacks, chemical control (the application of herbicides) is considered the most attractive aquatic weed management method in terms of minimizing cost, time, and collateral damage [5]. Efficient chemical-based vegetation control in aquatic bodies requires proper identification and quantification of the extent to which vegetation is present, both for optimal chemical selection and treatment strategies [3,4]. The approach of manually identifying weed-infested regions within a water body and applying herbicides at the target location(s) is labor intensive, time-consuming, and involves risk of herbicide exposure to personnel [12]. Spraying herbicide throughout the water body leads to much higher usage, resulting in economic loss, longer decomposition times, and subsequent environmental hazards.
Two unmanned watercraft, the TORMADA (Lake Restoration Inc., Rogers, MN) and WATER STRIDER (Yamaha Motor Company, Iwata, Shizuoka, Japan), are commercially available for herbicide dispersal in water bodies. These boats offer remote control by a human operator but have limited tank capacity (3.8 and 8.0 L for the TORMADA and WATER STRIDER, respectively) and lack autonomous navigation and weed identification functionality. Hänggi outlined the design, fabrication, and testing of an autonomous boat for monitoring algae blooms in Lake Geneva and included a summary of multiple autonomous watercraft developed by other authors [13]. These prototypes are intended for a variety of measurement and mapping tasks, but none are purposed for aquatic vegetation identification and treatment.



1.2. Image-Based Machine Learning: Application to Aquatic Weed Identification

Automated plant identification, which relies on photographic images, is a well-developed methodology for agricultural applications. With recent advances in artificial intelligence, several machine learning algorithms have been developed for image classification. Current identification/classification methods involve image preprocessing followed by feature extraction. The extracted features, which typically include the color, shape, and texture of leaves, Histograms of Oriented Gradients (HOG), etc., are then used to train classifiers. This approach was applied to identify certain Ficus plant species by acquiring photographic images of leaves, performing the image preprocessing operations shown in Figure 2, and extracting features from the modified images [14]. These authors separately implemented a support vector machine (SVM) and an artificial neural network (ANN) as classifiers. A similar approach was adopted by another group of researchers to identify aquatic weeds growing on the water surface, namely Eichhornia crassipes, Pistia stratiotes, and Salvinia auriculata [15]. Visible light images captured from unmanned aerial vehicles (UAVs) were utilized to train multiple classifiers. In addition to SVM, the optimum-path forest classifier (OPF) and Bayesian classifier were investigated.


Figure 2. Feature extraction from an image of a leaf (adapted from [4]): (a) original photograph; (b) conversion from RGB to HSV format to serve as a guide for subsequent edge detection; (c) grayscale conversion to improve image contrast; and (d) feature extraction after identifying leaf boundaries.
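Preprocessing steps of this kind can be sketched compactly. In the sketch below, the BT.601 luminance weights and the finite-difference gradient are generic image-processing choices used for illustration, not the specific operations of the cited study:

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an H x W x 3 RGB array to grayscale (ITU-R BT.601 weights)."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def gradient_magnitude(gray):
    """Crude edge strength from horizontal/vertical finite differences."""
    gy, gx = np.gradient(gray)
    return np.sqrt(gx**2 + gy**2)

rng = np.random.default_rng(1)
rgb = rng.random((32, 32, 3))        # stand-in for a leaf photograph
gray = to_grayscale(rgb)             # grayscale conversion, as in step (c)
edges = gradient_magnitude(gray)     # guide for identifying leaf boundaries
print(gray.shape, edges.shape)
```

In a real pipeline, statistics of `gray` and `edges` (or HOG descriptors computed from them) would form the feature vector passed to the SVM or ANN classifier.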

In similar research, feature learning on high resolution weed images (water hyacinth, tropical soda apple, and serrated tussock) was performed using a filter bank, followed by image classification via the texton approach (K-means clustering) [16]. Feature extraction and OPF classifiers were successfully implemented in another study to identify invasive yellow flag iris plants [17].

Despite the documented success of these feature extraction-based plant classification methods, they cannot be directly applied to subsurface plant identification due to limited visibility and the associated lack of photographic clarity. Despite lacking the resolution and clarity of photography, hydroacoustic imaging is an emerging option for the quantification of underwater vegetation. Currently, hydroacoustic data files can be uploaded and processed offline via a web-based service (Navico BioBase), which generates biomass-concentration maps. However, the limited resolution of hydroacoustic data impedes feature extraction and typical machine learning approaches to species classification.
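The texton approach mentioned above groups filter-bank responses with K-means clustering. A minimal K-means loop is sketched below; the toy 2-D points standing in for filter responses, the cluster count, and the iteration cap are all illustrative choices:

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Plain K-means: assign points to nearest centroid, then recenter."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # distance from every point to every centroid: (n_points, k)
        d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):          # skip empty clusters
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

rng = np.random.default_rng(2)
# toy "filter responses": three well-separated blobs in 2-D
pts = np.vstack([rng.normal(c, 0.1, size=(30, 2)) for c in (0.0, 2.0, 4.0)])
centroids, labels = kmeans(pts, k=3)
print(centroids.shape, labels.shape)
```

In the texton method, the learned centroids become the "texton dictionary," and each image is described by a histogram of which centroid its filter responses fall nearest to.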
Deep learning is a more advanced machine learning technique that could overcome these limitations [18]. It employs deep neural networks (DNNs) to simultaneously accomplish feature extraction and classification tasks [19]. The fundamental component of a DNN is an artificial neural network (ANN). As shown in Figure 3, a standard feedforward ANN consists of multiple fully connected layers (input, hidden, and output layers); each layer consists of a varying number of nodes (artificial neurons) depending on the complexity of the network.

The input layer X = [x1, x2, ..., xn]^T takes n extracted features, which are multiplied by an adaptable weight matrix W1 = [w1_11, w1_12, ..., w1_1h; ...; w1_n1, ..., w1_nh], where h is the number of neurons in the hidden layer. These values are multiplied by another adaptable weight matrix W2 = [w2_11, w2_12, ..., w2_1m; ...; w2_h1, ..., w2_hm], where m is the number of ANN outputs B2. Each ANN output corresponds to a specific classification (m classes in this case). By incorporating non-linear activation functions in the output layer neurons (e.g., softmax or sigmoid), the output layer values can be transformed to predict the classification for an input image. ANN weights are randomly initialized and optimized through a repetitive training process.
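As a concrete illustration of this forward pass, the sketch below maps n extracted features through one hidden layer of h neurons to m softmax class probabilities. The layer sizes, the tanh hidden activation, and the random weights are illustrative assumptions, not the authors' trained network:

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical stability.
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(x, W1, b1, W2, b2):
    """One forward pass: n inputs -> h hidden neurons -> m class scores."""
    hidden = np.tanh(W1 @ x + b1)      # hidden-layer activations
    return softmax(W2 @ hidden + b2)   # class probabilities (sum to 1)

rng = np.random.default_rng(0)
n, h, m = 8, 5, 3                      # features, hidden neurons, classes
W1, b1 = rng.normal(size=(h, n)), np.zeros(h)   # random initialization
W2, b2 = rng.normal(size=(m, h)), np.zeros(m)

x = rng.normal(size=n)                 # one extracted feature vector
p = forward(x, W1, b1, W2, b2)
print(p)                               # m probabilities summing to 1
```

Training then adjusts W1 and W2 repeatedly (e.g., by gradient descent on a classification loss) until the predicted probabilities match the labeled classes.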



Figure 3. Architecture of a feedforward artificial neural network.

Unlike standard ANNs, DNNs integrate feature extraction by incorporating several additional layers [20]. Convolutional neural networks (CNNs, Figure 4), one of the most popular subtypes of DNNs, replace matrix multiplication with convolution in the initial layers. Instead of multiplying all inputs (pixels of an image) with different weight sets, a small window of inputs is multiplied by a weight set and the window is iteratively shifted to cover the entire input matrix (image) while using the same weights. This decreases the number of weights and reduces network complexity and associated preprocessing [21].
However, with significantly deeper structures (as compared to ANNs), CNNs and other DNNs have a significantly larger number of learnable parameters. Consequently, DNN training is computationally intensive and has only recently become viable due to advances in computer processing. After training, however, DNNs can be utilized for real-time image classification with minimal processor capabilities.
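The weight sharing that distinguishes a convolutional layer can be seen in a minimal NumPy convolution: a single 3 × 3 kernel (9 weights) is swept across the whole image, whereas a fully connected layer producing the same 8 × 8 output from a 10 × 10 input would need 6400 weights. All sizes here are toy values chosen for illustration:

```python
import numpy as np

def conv2d(image, kernel):
    """Slide one shared kernel over the image (valid padding, stride 1)."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

image = np.arange(100.0).reshape(10, 10)   # toy 10 x 10 "sonar" image
kernel = np.ones((3, 3)) / 9.0             # one shared 3 x 3 weight set

feature_map = conv2d(image, kernel)        # 8 x 8 feature map

# Weight sharing: this layer has kernel.size = 9 learnable weights, versus
# image.size * feature_map.size = 6400 for an equivalent dense layer.
print(feature_map.shape, kernel.size, image.size * feature_map.size)
```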

Figure 4. Architecture of a convolutional neural network (CNN) [22].

1.3. Research Objective

To address current deficiencies in aquatic weed management, this research seeks to develop and demonstrate a small fleet of fully autonomous boats capable of subsurface hydroacoustic imaging (to scan aquatic vegetation), machine learning (for automated weed identification), and herbicide deployment (for vegetation control). These capabilities aim to minimize manual labor and provide more efficient, safe (reduced chemical exposure to personnel), and timely weed management.


2. Methods

2.1. Autonomous Boat Development

The first phase of research and development was the design and fabrication of water vehicles which provided the following functionality: fully autonomous navigation, coordination between at least two vehicles, hydroacoustic data collection, variable rate herbicide application, and at least two hours of continuous operation without refuel/recharge (Figure 5). For practical purposes, a surface vehicle, rather than a submarine platform, was chosen. Other design choices included the use of a 15-gallon (56.8 L) herbicide tank, a capacity recommended by the herbicide manufacturer sponsoring this research for the applications detailed here; electric propulsion; and battery-based energy storage. Additional design considerations and details are presented below.

Figure 5. Autonomous boat prototype for identification and chemical treatment of invasive aquatic plant species.

2.1.1. Hull Design and Fabrication

A multi-hull, V-shaped design was selected for each vehicle to provide optimal stability and drag characteristics in the low-speed operating range (0.5–1.5 m/s) best suited to hydroacoustic mapping. With the use of hydrostatic analysis, pontoon dimensions were computed to support the geometry and weight of fully loaded payload components (batteries, electronics, herbicide tank filled to 15-gallon capacity, propulsion system, etc.) with 50% or less pontoon submersion [23]. The resulting geometry of each pontoon was 244 × 26 × 30 cm (96.0 × 10.2 × 11.9 in), not including keel dimensions.

Production of fiberglass pontoons was a multi-step process involving the fabrication of polystyrene plugs (Figure 6a,b) and fiberglass molds (Figure 6c) for the left and right halves of each pontoon, bolting the mold halves together (Figure 6d), and forming each pontoon within the mold (Figure 6e). Pontoon caps were fabricated using a separate mold (Figure 6f). A keel was integrated into the base of
A keel was integrated into the base of pontoon, bolting the mold halves together (Figure 6d), and forming each pontoon within the mold the pontoons for improved tracking performance. Multiple coats of polyester resin and liquid rubber (Figure 6e). Pontoon caps were fabricated using a separate mold (Figure 6f). A keel was integrated were applied to the surface of the pontoons and caps for enhanced waterproofing and cosmetics. into the base of the pontoons for improved tracking performance. Multiple coats of polyester resin and liquid rubber were applied to the surface of the pontoons and caps for enhanced waterproofing and cosmetics. Internal and external struts were waterjet-cut from aluminum sheets, bent to final geometry, and bolted between and within the pontoons (Figure 7). The struts provided enhanced lateral support and load-bearing capabilities. Low-density (32 kg/m3 or 2 lbf/ft3), high- urethane foam was added to the interior of the pontoons both to cradle batteries and electrical components and prevent sinking following capsizing or hull penetration. Hatch doors with water-resistant draw latches were installed above this region (Figure 5) to provide access to the batteries and electronics for battery recharging and electronics troubleshooting.



Figure 6. Fabrication of fiberglass pontoons: (a) cutting, sanding, and bonding polystyrene plug sections; (b) epoxy coating, sanding, and gel-coating of plug; (c) fiberglass lay-up of mold over polystyrene plug; (d) assembly of mold halves (shown with keel upward); (e) fiberglass pontoon after lay-up inside mold and application of multiple finish coats; and (f) separately cast pontoon cap installed.

2.1.2. Propulsion and Steering

An off-the-shelf electric trolling motor (Figure 8a, Minn Kota Powerdrive 45, Johnson Outdoors, Racine, WI, 200.2 N thrust rating) was selected for propulsion, owing to its relatively high efficiency and thrust capability and its quiet operation. For remote steering, the factory-installed steering motor was interfaced to a motor controller with remote control (RC) input and feedback capabilities (Pololu Jrk 12v12, Pololu Corporation, Las Vegas, NV).
Measurement of the shaft position for closed-loop control was implemented through a custom-mounted potentiometer rotationally coupled to the shaft via mechanical gears on each of these components (Figure 8b). A translating, spring-loaded frame and oversized gear teeth accommodated bow-to-stern shaft wobble, inherent in the system, to maintain gear meshing and avoid damage to the potentiometer. An airboat propulsion system, consisting of a direct current (DC) motor-driven propeller on an RC-servo-actuated rotating platform, was also evaluated. Despite its impressive steering capabilities (0.2 m turning radius) and low

entanglement risk, this system was thrust-limited (22.7 N max thrust), inefficient with regard to power consumption, noisy, and highly sensitive to wind and wave disturbances.
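The potentiometer-based closed-loop steering can be sketched as a simple proportional loop. The gear ratio, ADC resolution, gain, and saturation limit below are placeholders, and the actual Pololu Jrk controller implements its own configurable feedback algorithm, so this is illustrative only:

```python
def pot_to_angle(adc_counts, counts_per_rev=1024.0):
    """Convert potentiometer ADC counts to shaft angle in degrees.
    Assumes a hypothetical 1:1 gear coupling and a 10-bit ADC."""
    return adc_counts / counts_per_rev * 360.0

def steering_command(target_deg, measured_deg, kp=2.0, limit=100.0):
    """Proportional controller: motor effort from angle error, saturated."""
    error = target_deg - measured_deg
    effort = kp * error
    return max(-limit, min(limit, effort))

# Example: shaft potentiometer reads 512 counts (180 deg), target is 200 deg
angle = pot_to_angle(512)
cmd = steering_command(200.0, angle)
print(angle, cmd)   # 180.0 40.0
```

The saturation step mirrors the bounded duty cycle a real motor controller would enforce; a practical loop would also add integral/derivative terms and deadband handling.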

Figure 7. Aluminum struts bolted within the pontoons for enhanced lateral support: (a) before foam filling; (b) urethane foam filling process.


Figure 8. Propulsion and steering systems: (a) Minn Kota marine propulsion unit and (b) rotational potentiometer used for shaft position feedback.

Appl. Sci. 2019, 9, 2410 8 of 21

2.1.3. Navigation and Control Unit

At a minimum, autonomous navigation requires a GPS receiver and magnetic compass for localization (sensing position and heading) and a processor for maintaining real-time closed-loop control of steering and propulsion systems. A wireless transmitter/receiver with reasonable transmission range is also needed for communication with a base station and other autonomous vehicles in the fleet. The Pixhawk autopilot module, an open-hardware device originally manufactured by 3DR (3DR, Berkeley, CA), was selected from the available alternatives for its widespread usage and support community and its compatibility with a wide range of sensors and software (Figure 9a). Mission Planner open-source software, installed on a standard notebook computer (Dell Latitude 5550, Windows 7 OS), coupled with USB-based telemetry radios, served as the primary on-shore interface between the human operator and the onboard autopilot module (Figure 9b). User-configurable remote-control transmitters (FrSky Taranis X9D Plus, FrSky Electronic Co., Limited, Jiangsu, China), each paired with an X8R receiver aboard each vehicle, could be used for direct control of a vehicle as desired (launching, object avoidance, etc.). Toggling a transmitter switch alternated between autonomous and manual control modes for the corresponding vehicle.


Figure 9. Features utilized for autonomous vehicle control: (a) Pixhawk autopilot module as installed within a prototype (receivers and other electronics in background); (b) Mission Planner interface showing generation of transects.

Hydroacoustic mapping for weed quantification typically involves navigation of a watercraft through a series of parallel linear trajectories, commonly known as transects, spread throughout the target region (Figure 9b). Such navigation requires accurate trajectory tracking. Transects, compiled within Mission Planner, were loaded to the module via telemetry communication. From the Mission Planner interface, the operator can also remotely adjust autonomous navigational control parameters, as well as monitor navigational performance in real-time (actual vs. desired trajectories, velocity, heading, etc.; Figure 10). The autopilot module implements a version of L1 trajectory tracking [24,25]. In this method, a reference point "L1_ref" on the desired trajectory is calculated, and the vehicle is directed (through steering inputs) towards that point (Figure 10). The L1 tracking period, a key user-defined parameter, determines the aggressiveness of the vehicle in reaching the L1_ref point. The reference point slides continuously along the desired trajectory. Due to trade-offs in parameter tuning (e.g., increasing gains can lead to oversteering and weaving), the parameters (primarily the L1 tracking period and proportional, integral, and derivative (PID) controller gains) were fine-tuned with intensive experimentation.

Figure 10. L1 tracking schematic showing computation of reference point (L1_ref) at various vehicle positions with respect to the desired path.
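To make the guidance geometry concrete, the reference-point computation can be sketched as follows. This is a simplified illustration only; the lookahead relation, clamping, and all names are assumptions, not the actual ArduPilot L1 implementation. The vehicle's position is projected onto the transect, and the reference point is slid a speed-scaled lookahead distance ahead of that projection.

```python
import math

def l1_reference(pos, wp_a, wp_b, speed, l1_period):
    """Hypothetical L1-style reference point on the segment wp_a -> wp_b.

    The lookahead distance grows with vehicle speed and the user-defined
    L1 tracking period, so a larger period gives a gentler approach.
    """
    l1_dist = speed * l1_period                  # assumed lookahead relation
    ax, ay = wp_a
    bx, by = wp_b
    px, py = pos
    dx, dy = bx - ax, by - ay
    seg_len = math.hypot(dx, dy)
    ux, uy = dx / seg_len, dy / seg_len          # unit vector along the path
    # Scalar projection of the vehicle position onto the transect
    s = (px - ax) * ux + (py - ay) * uy
    # Slide the reference point ahead along the path (clamped to the segment)
    s_ref = min(max(s + l1_dist, 0.0), seg_len)
    ref = (ax + s_ref * ux, ay + s_ref * uy)
    # Desired heading: steer toward the reference point
    heading = math.atan2(ref[1] - py, ref[0] - px)
    return ref, heading
```

For a vehicle 5 m left of an x-axis transect, the reference point lands on the path ahead of the vehicle, and the commanded heading points back toward the line.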


2.1.4. Herbicide Dispersal System

For design of the herbicide dispersal system, Hydrilla was chosen as the targeted aquatic weed and Aquathol-K, a liquid-based chemical effective in the treatment of Hydrilla and other aquatic weeds, was selected as the treatment herbicide. A downstream chemical dilution system, with a drop hose outlet and an RC-controlled (via electronic speed controller, ESC) variable-speed pump, was utilized for dispersal. With this design, herbicide concentrate is stored within an onboard tank, diluted with water drawn from the lake, and dispersed through a hose submerged below the water surface. Dilution occurs at an injector downstream of the pump, which passively combines the concentrate with lake water via the Venturi effect. Fluid analysis of the dispersal system and a series of tests were conducted for pump selection (8000 series SHURflo industrial pump, 100 psi rating) and to characterize chemical dispersal rates as a function of RC input (Figure 17b). These relationships allowed optimal RC pump control inputs to be determined as a function of transect spacing, boat velocity, desired application rate, and lake depth. Dispersal system design and analysis is further detailed in [23].

Design optimization methodologies, incorporating Stevin's law of fluid statics, were used to optimize the placement of onboard components (batteries, electronics bin, chemical tank, propulsion system, dispersal pump, etc.). The objective of the algorithm, implemented with Excel Solver, was to minimize the pitch angle of the boat throughout all levels of tank payload (empty to full); a key constraint was avoidance of forward pitching (bow pitched downward). The methods and results, as detailed in [23], were adapted for later prototype generations as component dimensions, weights, and placement options (e.g., component storage within the pontoons) were altered.
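The dependence of pump rate on mission parameters can be illustrated with a back-of-the-envelope calculation. The sketch below is hypothetical and does not reproduce the fluid analysis of [23]; the function and all values are illustrative. It computes the concentrate flow rate required for a boat sweeping one transect to deliver a target dose per acre-foot of water.

```python
# One acre-foot of water, in cubic feet
ACRE_FT_IN_FT3 = 43_560.0

def concentrate_rate_gpm(app_rate_gal_per_acre_ft, transect_spacing_ft,
                         boat_speed_ft_s, depth_ft):
    """Required concentrate pump rate (gal/min) for a given application rate.

    Treated water volume per second is swath width x speed x depth; the
    application rate then scales that volume into a concentrate flow.
    """
    treated_ft3_per_s = transect_spacing_ft * boat_speed_ft_s * depth_ft
    acre_ft_per_min = treated_ft3_per_s * 60.0 / ACRE_FT_IN_FT3
    return app_rate_gal_per_acre_ft * acre_ft_per_min
```

For example, applying 1.4 gal/A-ft over a 20 ft swath at 5 ft/s in 6 ft of water requires roughly 1.16 gal/min of concentrate, within the range a small industrial pump can supply.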

2.1.5. Hydroacoustic Imaging

An off-the-shelf hydroacoustic imaging system (Lowrance Elite 4 Chirp, Lowrance Electronics, Tulsa, OK) was purchased and integrated within each boat. Its transducer was silicone-mounted within the base of a plastic container at the fore region of the boat. The unit allowed geo-tagged transducer data to be stored on a micro secure digital (microSD) card in .sl2 file format.

2.2. Machine Learning for Aquatic Vegetation Classification

The second phase of the project involved the development of automated methods for classifying and quantifying aquatic weeds, and subsequently utilizing these data to determine locations for targeted herbicide treatment.

2.2.1. Data Preprocessing

The acquired .sl2 data, being in a binary format with no manufacturer-provided tools or guidelines for opening and reading the contents, could not be directly used by machine learning classification algorithms. To convert this .sl2 data into a series of standard digital images (.jpg format) with geotagging, a two-step approach was used. First, the .sl2 file was opened using the Reefmaster Sonar Viewer (ReefMaster Software Ltd., West Sussex, U.K.), a software program that enables the display of sonar data (hydroacoustic imagery and corresponding geographical location) in a continuous video form. Figure 11 is a screenshot of hydroacoustic data recorded at Lake Raleigh viewed using the Reefmaster Sonar Viewer software. Primary and DownScan™ images are displayed along with a map indicating the scanning location. Both images show the ground surface with aquatic plants. DownScan™ (bottom right in Figure 11) was selected since it provides a relatively clearer view of the ground and vegetation compared to the Primary scan. Imagery display options, including contrast and brightness, horizontal scaling, and color scheme, were selected within Reefmaster for maximum visual clarity.

Figure 11. Screenshot from Reefmaster Sonar Viewer software illustrating hydroacoustic imagery acquired on Lake Raleigh: left—map location corresponding with imagery (indicated by boat icon) and traversed path (with increasing depth, path color changes from red to blue); top right—Primary scan sonar; bottom right—DownScan™ sonar.

Next, still hydroacoustic images were acquired at fixed time intervals using the MATLAB Toolbox 'Screen Record' by Nassim Khaled [26], which captures PC monitor contents in real-time. Playback of imagery at nine times the original speed expedited the capture process. The image capture rate was selected to achieve 20-30% spatial overlap between consecutive images. The toolbox code was modified to incorporate clock time instead of CPU time, creating images with consistent overlap by eliminating dependence on CPU load. Images were digitally masked and cropped to remove unnecessary information and facilitate the machine learning process (Figure 12). The top left portion of the image contains the GPS coordinates, while DownScan™ covers the remaining portion.

Figure 12. Geo-tagged DownScan™ image with GPS coordinates in the top left corner.
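The overlap-driven capture-rate selection can be expressed as a small helper. The window duration and example values below are assumptions for illustration; the text specifies only the 9x playback speed and the 20-30% overlap target.

```python
def capture_interval_s(window_duration_s, overlap, playback_speed=9.0):
    """Wall-clock screenshot interval yielding a desired image overlap.

    window_duration_s: seconds of recorded sonar visible on screen (assumed)
    overlap: desired fractional overlap between consecutive images (0.2-0.3)
    playback_speed: playback multiplier (9x in the text)
    """
    # Consecutive frames must advance by (1 - overlap) of a screen width;
    # at accelerated playback that recorded span elapses playback_speed x faster.
    return window_duration_s * (1.0 - overlap) / playback_speed
```

If one screen width covers 45 s of recording, a 25% overlap at 9x playback implies a screenshot every 3.75 s of wall-clock time.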


2.2.2. Hardware and Software Configuration

As noted earlier, DNNs, being computationally intensive, require powerful processing units for training in a reasonable timeframe. DNNs have millions of learnable parameters which undergo hundreds or thousands of optimization cycles during training. This precludes the usage of standard CPUs, which have a limited number of cores (eight in typical desktop computers), for DNN training. However, modern graphics processing units (GPUs), developed for gaming purposes, have thousands of cores which facilitate parallel computation via the Nvidia CUDA platform. For this research, a single Dell Precision T7500 workstation was configured with the following hardware: Intel Xeon CPU (2.13 GHz, 2 processors), 12 GB RAM, and an Nvidia GTX 1070 Ti GPU with 8 GB memory. The Nvidia CUDA deep neural network (cuDNN) library and other supporting software were installed to enable MATLAB, Python, and Google Colab to take advantage of the GPU.

2.2.3. DNN Training

"Alexnet," an advanced CNN supported by MATLAB's neural network toolbox, was used in the algorithm for plant species classification. Figure 13 shows the layer-wise structure of Alexnet. Due to the limited availability of preliminary data, transfer learning was implemented on the pretrained CNN. Initial algorithm training focused on only two training classes: the target species (Hydrilla: 466 images; Figure 14a) and non-target species (other: 1751 images, including some with no vegetation; Figure 14b). Images of each class were randomly divided into training and validation sets (with 420 and 46 images, respectively). The DNN achieved 100% training accuracy, indicating sufficient network complexity. However, significant differences between training and validation accuracies indicated overfitting.

Figure 13. Layer-wise structure of Alexnet from MATLAB neural network toolbox.

2.2.4. Reducing Overfitting

The primary causes of overfitting include insufficient training data for generalization [27] or excessive model complexity. Since the network structure already contained two dropout layers, data augmentation and model training with additional data were implemented to reduce overfitting [28,29]. Additional data was collected, and Hydrilla images were generated with an increased overlap of approximately 50%, expanding the training and validation sets to 720 and 89 images, respectively.

To implement data augmentation, artificial training data was generated by modifying original images via horizontal reflection, translation, and scaling. Furthermore, iterative parameter tuning was performed on optimization functions, learning rates, learning rate drop schedules, batch sizes, and data augmentation parameters to increase classification accuracy.
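As an illustration of two of the augmentation transforms named above, an image stored as a 2D list of pixel values can be randomly reflected and translated. This stdlib-only sketch is not the authors' code (the actual work used MATLAB's augmentation tooling), and the padding and shift limits are assumptions.

```python
import random

def augment(img, max_shift=2, rng=None):
    """Return a randomly reflected and horizontally translated copy of img.

    img is a rectangular 2D list; vacated columns are zero-padded.
    """
    rng = rng or random.Random()
    w = len(img[0])
    out = [row[:] for row in img]               # copy so the source is untouched
    if rng.random() < 0.5:                      # horizontal reflection
        out = [row[::-1] for row in out]
    shift = rng.randint(-max_shift, max_shift)  # horizontal translation
    pad = [0] * abs(shift)
    if shift > 0:
        out = [pad + row[:w - shift] for row in out]
    elif shift < 0:
        out = [row[-shift:] + pad for row in out]
    return out
```

Each call yields a slightly different training sample of the same size, which is the property data augmentation relies on.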

2.2.5. Generalizing Over Multiple Species

Although image classification based on two classes (Hydrilla and Other) achieved satisfactory accuracy, classification across multiple plant species or classes was pursued to produce even better results with increased utility. To enable treatment of multiple species and enhance model generalization, hydroacoustic imagery of Cabomba (Figure 14c) and Coontail (Figure 14d, Ceratophyllum demersum), two common aquatic plant varieties, was collected and the DNN was trained on four classes. The data set was divided into training, validation, and test sets with 657, 80, and 80 images, respectively, of each class. Equal numbers of images in each class during training prevented sample bias; any significant class imbalance was found to affect the classification probabilities. Cross-verification on the test set following parameter tuning confirmed adequate model generalization.
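The per-class partitioning described above can be sketched as follows. Only the 657/80/80 split sizes come from the text; the helper, file names, and seeding are illustrative.

```python
import random

def split_class(image_paths, n_train=657, n_val=80, n_test=80, seed=0):
    """Randomly partition one class's images into train/val/test sets.

    Applying this per class yields equal class counts in each set,
    avoiding the sample bias noted in the text.
    """
    assert len(image_paths) >= n_train + n_val + n_test
    rng = random.Random(seed)                 # fixed seed: reproducible split
    paths = list(image_paths)
    rng.shuffle(paths)
    train = paths[:n_train]
    val = paths[n_train:n_train + n_val]
    test = paths[n_train + n_val:n_train + n_val + n_test]
    return train, val, test
```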

2.2.6. Extracting GPS Coordinates from Images Post-Classification In conjunction with image classification, precise location of identified plant species is necessary for efficient treatment and recordkeeping. To create the required location database, the GPS coordinates

superimposed( onc) each hydroacoustic image (Figure 12) were extracted(d) using the optical character recognition (OCR) functionality(c) of MATLAB. Image preprocessing techniques,(d) namely cropping, Figureresizing, 14. Example gray scaling,hydroacoustic and binarizing,images of each enabled of the extraction weed classes: of ( accuratea) Hydrilla GPS, (b) Other, coordinates (c) from the Cabomba, and Figure(d) Coontail. 14. Example hydroacoustic images of each of the weed classes: (a) Hydrilla, (b) Other, (c) classifiedCabomba images, and (Figure (d) Coontail. 15).
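Downstream of the OCR step, the recognized coordinate string must be parsed into numeric form for the location database. The parser below is a hypothetical sketch: the paper uses MATLAB's OCR output, and the assumed input format here is "hemisphere degrees decimal-minutes" text, converted to signed decimal degrees.

```python
import re

# One hemisphere letter, integer degrees, then decimal minutes
COORD_RE = re.compile(r"([NSEW])\s*(\d{1,3})\D+(\d{1,2}\.\d+)")

def to_decimal_degrees(ocr_text):
    """Parse coordinates like 'N035 46.033 W078 40.734' to decimal degrees."""
    coords = []
    for hemi, deg, minutes in COORD_RE.findall(ocr_text):
        value = int(deg) + float(minutes) / 60.0
        if hemi in "SW":              # south/west hemispheres are negative
            value = -value
        coords.append(round(value, 6))
    return tuple(coords)
```

A tolerant regular expression is useful here because binarized OCR output can insert stray spaces or punctuation between the degree and minute fields.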


Figure 15. GPS coordinate extraction with image preprocessing: (a) raw/unprocessed hydroacoustic image and (b) preprocessed image optimized for optical character recognition (OCR). Generated output: N035°46.033′ W078°40.734′.

3. Results

3.1. Autonomous Vehicle Performance

Autonomous water vehicles were deployed into multiple water bodies for a variety of weather conditions, spanning seasons from mid-summer to early winter. The majority of trials were conducted on Lake Raleigh, a 75-acre lake located on North Carolina State University's Centennial Campus, to evaluate the boats' manual and autonomous navigational capabilities and confirm the herbicide dispersal system functionality.


3.1.1. Autonomous Navigation

With the marine propulsion system, prototypes achieved speeds up to 2.3 m/s and were able to navigate in the presence of moderate wind (4.5 m/s, 10 mph), light rain, and choppy surface water conditions. An iterative controller tuning process was completed to ensure high-performance tracking in autonomous navigation mode. Figure 16 shows typical autonomous tracking performance. No degradation in tracking performance was observed due to wind gusts or high fluid levels in the herbicide tank, though the maximum speed decreased with heavier herbicide payloads.
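The deviations plotted in Figure 16b can be computed as signed cross-track errors: the perpendicular distance from each logged position to the desired transect line. The helper below is an illustrative sketch, assuming local planar coordinates in meters; it is not the authors' analysis code.

```python
import math

def cross_track_error(pos, wp_a, wp_b):
    """Signed perpendicular distance (m) from pos to the line wp_a -> wp_b.

    For travel along the segment direction, points to the right of the
    path return positive values and points to the left return negative.
    """
    ax, ay = wp_a
    bx, by = wp_b
    px, py = pos
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    # 2D cross product gives twice the signed triangle area;
    # dividing by the segment length yields the signed distance.
    return ((px - ax) * dy - (py - ay) * dx) / length
```

Summarizing these errors (e.g., mean and maximum magnitude per transect, with the 180° turns excluded) gives the kind of deviation statistics shown in the figure.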


Figure 16. Tracking performance of a prototype at Lake Raleigh: (a) desired and actual trajectories as displayed from Mission Planner software and (b) corresponding boat deviations from intended trajectories (180° turns excluded).

3.1.2. Hydraulic Stability and Operational Depth

Since the required submersion depth of the trolling motor exceeds the pontoon draft, the boat can be operated at any depth that can accommodate marine props. Nonetheless, with the prop motor positioned at minimal depth for shallow operation, angular travel of the vertical shaft (for steering purposes) must be software-limited to avoid collision of the blades with the pontoons during steering maneuvers. Even with this angular limitation, adequate navigation was achieved. Bow-to-stern inclination of the boat was minimal at all tank fill levels; the boat exhibited a slight, almost indiscernible, upward pitch (bow upward) at full tank capacity.

3.1.3. Battery Life

Prototypes were powered by two 12-volt lithium iron phosphate (LFP) batteries (Bioenno Power, Santa Ana, CA, USA) wired in parallel, each rated at a 60 amp-hour capacity. While the batteries were not operated to depletion, data from multiple tests of over three hours suggests up to eight hours of operation time can be achieved on a single charge without falling below safe (80% discharge) levels. Battery discharge monitoring (Figure 17a) revealed a noticeable decrease in battery voltage over approximately the first 15 min, followed by a relatively steady voltage over the remaining operation period.

3.1.4. Herbicide Dispersal System

Laboratory testing, completed before lake trials, revealed that the dispersal system could provide moderate application rates of Aquathol-K (1.0-1.8 gal/A-ft) for typical pond depths, boat velocities, and transect widths. Figure 17b shows the range of chemical dispersal rates (Aquathol concentrate) achievable by varying the RC input. Functionality of the dispersal system with chemical dilution modality was also demonstrated in Lake Raleigh; water was substituted for the herbicide to avoid unnecessary release of chemicals into the water body. Herbicide application was completed in a small (approximately 0.29 acre) private pond with a watermeal infestation (Figure 18). Due to the small surface area and volume of the pond, a premixed herbicide/water solution was dispersed directly from the tank, and manual control, rather than autonomous navigation, was utilized.
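The eight-hour endurance reported for the battery configuration above is consistent with simple capacity arithmetic, sketched here under an assumed average current draw (the text does not report one; 12 A is purely illustrative).

```python
def runtime_hours(capacity_ah=120.0, max_depth_of_discharge=0.8,
                  avg_current_a=12.0):
    """Estimated runtime before reaching the safe discharge limit.

    capacity_ah: two 60 Ah packs in parallel = 120 Ah total
    max_depth_of_discharge: 80% safe-discharge limit from the text
    avg_current_a: assumed average propulsion/electronics draw
    """
    usable_ah = capacity_ah * max_depth_of_discharge
    return usable_ah / avg_current_a
```

At heavier average draws the estimate shortens proportionally, matching the observation that endurance depends on power supplied to the propulsion system.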



Figure 17. (a) Battery voltage measurements for multiple test runs and (b) characterization of chemical dispersal rates as a function of RC input. Voltage fluctuations were a function of power supplied to the propulsion system.


Figure 18. Treatment of watermeal infestation in a small pond using manual navigation and minimal prop submersion.



3.2. Machine Learning Algorithm

3.2.1. Vegetation Classification

The deep learning algorithm, in conjunction with data augmentation, successfully extracted features and accurately classified underwater vegetation using hydroacoustic imagery. Figure 19 shows classification accuracy as a function of training iterations for both training and validation sets; it clearly illustrates how data augmentation reduced overfitting in the validation sets.
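Augmentation of this kind expands the training set with label-preserving transformations of each image. A minimal sketch of such a pipeline is shown below; the specific transformations (horizontal flip, small translation) and parameters are illustrative assumptions, not the exact augmentation settings used in the study.

```python
import numpy as np

def augment(image, rng):
    """Apply simple random, label-preserving augmentations to one image:
    a horizontal flip and a small horizontal translation. These are
    example transformations; the study's exact settings may differ."""
    img = image
    if rng.random() < 0.5:
        img = img[:, ::-1, :]      # horizontal flip
    dx = int(rng.integers(-5, 6))  # shift up to 5 px left or right
    img = np.roll(img, dx, axis=1)
    return img

rng = np.random.default_rng(0)
batch = np.zeros((8, 64, 64, 3), dtype=np.float32)  # placeholder images
augmented = np.stack([augment(im, rng) for im in batch])
```

Because each epoch sees slightly different versions of the same images, the network cannot memorize pixel-level detail, which is one mechanism by which augmentation reduces the overfitting visible in Figure 19a.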


Figure 19. Training progress (a) before and (b) following data augmentation. Overfitting in (a) is evident through high variation between training and validation accuracy.

Initial parameter tuning readily improved the validation accuracy to approximately 97%; further parameter tuning produced only marginal gains. The optimizers 'sgdm', 'rmsprop', and 'adam' gave similar performance, with 'sgdm' and 'adam' producing marginally better accuracy. Increased smoothness of the training curve and improved accuracy were observed for all optimizers by reducing the learning rate. A learning rate drop schedule further improved the accuracy as compared to a constant learning rate. The "MiniBatchSize" parameter contributed heavily to runtimes and smoothness of the training curve. Greater batch sizes smoothed the curve and reduced runtimes, but increased memory requirements. Limiting batch sizes to 256 or less helped avoid the "sharp minimizers" that tend to impede generalization [30].

Following parameter tuning, the DNN achieved a classification accuracy of 99.06% for both the validation and test data sets, clearly indicating excellent generalization (Cabomba: precision = 1, recall = 1; Coontail: precision = 1, recall = 1; Hydrilla: precision = 0.9873, recall = 0.975; Other: precision = 0.9753, recall = 0.9875). Analysis of the confusion matrices for different tuning parameters revealed more misclassifications within the Hydrilla and Other classes (and between the two) than within the Cabomba and Coontail classes. Figure 20 shows the confusion matrices for the optimal parameter set.
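Per-class precision and recall follow directly from the confusion matrix: precision is the column-wise fraction of correct predictions, recall the row-wise fraction. The sketch below uses a hypothetical matrix constructed to be consistent with the reported test-set metrics (assuming 80 test images per class, an inference not stated here); it is not the matrix of Figure 20.

```python
# Per-class precision and recall from a confusion matrix whose rows
# are true classes and columns are predicted classes.
# The matrix below is a hypothetical example consistent with the
# reported metrics, not the actual matrix of Figure 20.

def precision_recall(cm, classes):
    stats = {}
    for i, name in enumerate(classes):
        tp = cm[i][i]
        predicted = sum(row[i] for row in cm)   # column sum
        actual = sum(cm[i])                     # row sum
        stats[name] = (tp / predicted if predicted else 0.0,
                       tp / actual if actual else 0.0)
    return stats

classes = ["Cabomba", "Coontail", "Hydrilla", "Other"]
cm = [[80,  0,  0,  0],    # Cabomba: all correct
      [ 0, 80,  0,  0],    # Coontail: all correct
      [ 0,  0, 78,  2],    # Hydrilla: 2 confused with Other
      [ 0,  0,  1, 79]]    # Other: 1 confused with Hydrilla
stats = precision_recall(cm, classes)
accuracy = sum(cm[i][i] for i in range(4)) / sum(map(sum, cm))
```

Under this assumed matrix, Hydrilla precision is 78/79 ≈ 0.9873 and recall 78/80 = 0.975, and overall accuracy is 317/320 ≈ 99.06%, mirroring the reported pattern of Hydrilla/Other confusion.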


Figure 20. Confusion matrices of classified vegetation species for (a) validation and (b) test images.

Subsequent analysis revealed that misclassified Hydrilla images tended to be associated with limited plant growth (Figure 21a). The majority of images containing Cabomba and Coontail were associated with more mature plant growth, which perhaps led to higher classification accuracies.


This suggests that timing of data capture within the growing season might play an important role in classification accuracy. Data for Hydrilla was collected from June to November, while Cabomba and Coontail were scanned during December, a time of year associated with more mature growth. Other instances of misclassification included images containing floating vegetation or schools of fish (Figure 21b), and vegetation being only partially contained within the image (Figure 21c).


Figure 21. Misclassified images due to (a) limited vegetation growth, (b) floating vegetation or schools of fish, and (c) vegetation partially contained within the image.


During DNN training, weights of all layers undergo continuous optimization. Successful feature learning creates smooth filters with uniform weight gradients. Figure 22 visualizes the 96 filters (each with 11 × 11 × 3 weights) of the first convolutional layer for two separate cases. Parameter tuning generated smoother, blended (less discretized) gradient patterns, which are characteristics of effective training (Figure 22b) [31], while filters in Figure 22a indicate lack of sufficient training.


Figure 22. Deep neural network (DNN) weight visualization for the first convolutional layer (a) before parameter optimization and (b) after training with parameter optimization.
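Filter grids like those in Figure 22 are typically produced by rescaling each filter's raw weights into the displayable [0, 1] range and tiling them into a single image. A sketch of this step, assuming AlexNet-style 11 × 11 × 3 first-layer filters (the weights here are random placeholders):

```python
import numpy as np

def filters_to_grid(weights, cols=12):
    """Normalize first-layer conv filters (h, w, 3, n) to [0, 1] and
    tile them into one image grid, separated by 1-px white gutters."""
    h, w, c, n = weights.shape
    lo, hi = weights.min(), weights.max()
    norm = (weights - lo) / (hi - lo + 1e-12)   # rescale to [0, 1]
    rows = int(np.ceil(n / cols))
    grid = np.ones((rows * (h + 1) - 1, cols * (w + 1) - 1, c))
    for k in range(n):
        r, col = divmod(k, cols)
        grid[r*(h+1):r*(h+1)+h, col*(w+1):col*(w+1)+w, :] = norm[..., k]
    return grid

rng = np.random.default_rng(1)
w = rng.standard_normal((11, 11, 3, 96))   # placeholder for 96 filters
grid = filters_to_grid(w)                  # ready for imshow/imwrite
```

Because all filters share one normalization, relative contrast between filters is preserved, which is what makes the "smooth vs. noisy" distinction between Figure 22a and 22b visible.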

4. Discussion

This research was successful in developing the necessary hardware and software for automated identification and treatment of submerged aquatic weeds. A small fleet (two watercraft) with the required systems and capabilities (self-navigation, hydroacoustic data collection, and herbicide dispersal) was developed over three prototype generations. Hydrilla, being a very invasive weed species with severe impacts, was chosen as the targeted aquatic weed. The deep learning model was later extended to multiple species, namely Hydrilla, Cabomba, and Coontail (four classes in total, including "Other"). The high classification accuracy of 99.06% on both the validation and test sets indicates excellent generalization. Training on four classes with a higher volume of data improved the algorithm accuracy as compared to training on two classes. Geotagged information from classified images was reliably extracted for treatment of the target areas with herbicides.
While the integration of autonomous scanning, identification, and treatment of invasive plant species was to some extent limited by software and hardware capabilities, the current practice of manually scanning and treating entire water bodies can be eliminated using this approach. The technologies detailed here clearly have the potential to reduce the cost and increase the effectiveness of aquatic plant management.

4.1. Future Work

Because plant maturity was found to have an impact on classification accuracy, it may be advisable to collect data of each plant variety throughout the growing season and at multiple locations to help further generalize the model. Future research will likely focus on improved hardware and software integration, which will enable synchronous execution of all system tasks: hydroacoustic data collection, weed classification and distribution mapping, and targeted herbicide treatment.


The integrated process would involve automated weed classification during hydroacoustic scanning, which could be accomplished through one of several methods. The first option involves performing all classification and location extraction tasks on a central computer, located with the operator. In this option, the hydroacoustic data could be transferred to the central computer using existing hydroacoustic wireless technology (e.g., the Navico GoFree WIFI-1 wireless module; Navico Marine Electronics, Egersund, Norway) or a Wi-Fi enabled SD card. To overcome limited transmission range, transects could be configured to periodically direct boats close to the operator for wireless data transmission. Following weed classification and target location identification by the computer, an operator could generate transects covering these locations and transmit them to the vehicles for herbicide application. Multiple boats could be employed simultaneously for weed detection and treatment with a single operator (based on the shore or in another watercraft, e.g., a canoe or motorized craft). The second option involves the use of onboard computer processing systems (e.g., Raspberry Pi units or tablets) for real-time image classification. Live data could be transferred from the terminals of the fish finder to the processor using Wireshark software [32]. The herbicide application system could be triggered immediately upon detection of targeted weed species. Applying this methodology, scanning and treatment could be completed in a single traversal of the transects. In spite of its benefits, this method would require extensive research to successfully configure real-time communication between software (Wireshark, Reefmaster, MATLAB, Mission Planner) and hardware components (hydroacoustic scanner, autopilot module, etc.).
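The operator-generated transects described in the first option amount to coverage-path planning over the classified target locations. A minimal sketch of boustrophedon ("lawnmower") waypoint generation over the bounding box of the targets is shown below; the coordinates and lane spacing are hypothetical, and in practice transect planning was done with Mission Planner rather than custom code.

```python
def lawnmower_transects(targets, spacing):
    """Generate back-and-forth waypoints covering the bounding box of
    target (x, y) locations, with the given lane spacing (same units
    as the coordinates)."""
    xs = [p[0] for p in targets]
    ys = [p[1] for p in targets]
    xmin, xmax = min(xs), max(xs)
    ymin, ymax = min(ys), max(ys)
    waypoints, lane, y = [], 0, ymin
    while y <= ymax:
        ends = [(xmin, y), (xmax, y)]
        if lane % 2:          # reverse every other lane for a
            ends.reverse()    # continuous back-and-forth path
        waypoints.extend(ends)
        lane += 1
        y += spacing
    return waypoints

# Hypothetical local-frame target locations (meters)
targets = [(0.0, 0.0), (30.0, 10.0), (15.0, 20.0)]
wps = lawnmower_transects(targets, spacing=10.0)
```

The lane spacing would be chosen to match the swath of the hydroacoustic scanner or the dispersal boom, so that adjacent passes overlap slightly and no targets are missed.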

Author Contributions: Conceptualization, G.B., S.F. and R.R.; methodology, G.B., S.J., S.F. and M.P.; software, S.J. and M.P.; validation, S.J. and M.P.; data collection and processing, R.R., M.P. and S.J.; writing—original draft preparation, S.J. and M.P.; writing—review and editing, G.B. and M.P.; visualization and supervision, G.B., S.F. and R.R.; project administration, G.B. and S.F.; funding acquisition, R.R. and G.B.

Funding: This research was funded by United Phosphorous, Inc. and the North Carolina Policy Collaboratory, grant numbers 2016-1503 and 2019-0236, respectively.

Acknowledgments: The authors would like to acknowledge the generous technical support of Stephen Hoyle and Andrew Howell (Dept. of Crop and Soil Sciences, N.C. State University), Justin Nawrocki and Gerald Adrian (United Phosphorus, Inc.), and Eric Stewart, Gordon Beverley and Esther Lee (Dept. of Mechanical and Aerospace Engineering, N.C. State University).

Conflicts of Interest: The authors declare no conflicts of interest.

References

1. Pimentel, D.; Zuniga, R.; Morrison, D. Update on the environmental and economic costs associated with alien-invasive species in the United States. Ecol. Econ. 2005, 52, 273–288. [CrossRef]
2. Rockwell, H.W. Summary of a Survey of the Literature on the Economic Impact of Aquatic Weeds; Aquatic Ecosystem Restoration Foundation: Flint, MI, USA, 2003.
3. Gettys, L.A.; Haller, W.T.; Petty, D.G. Biology and Control of Aquatic Plants: A Best Management Practices Handbook; Aquatic Ecosystem Restoration Foundation: Marietta, GA, USA, 2019.
4. McComas, S. Lake and Pond Management Guidebook; CRC Press: Boca Raton, FL, USA, 2003.
5. Lembi, C.A. Identifying and Managing Aquatic Vegetation; Formerly Purdue Extension Publication WS-21-W; Purdue University Cooperative Extension Service: West Lafayette, IN, USA, 2009.
6. Bell, F.W.; Bonn, M.A. Economic Sectors at Risk from Invasive Aquatic Weeds at Lake Istokpoga, Florida. 2004. Available online: http://www.aquatics.org/pubs/economics.htm (accessed on 7 June 2019).
7. Buck, B. UF/IFAS Researchers Try to Cut Costs to Control Aquatic Invasive Plants in Florida; University of Florida Institute of Food and Agricultural Sciences IFAS Blogs: Gainesville, FL, USA, 2016.
8. Langeland, K.A. Hydrilla verticillata (L. F.) Royle (Hydrocharitaceae), The Perfect Aquatic Weed. South. Appalach. Bot. Soc. 1996, 61, 293–304.
9. Langeland, K.A.; Enloe, S.F.; Gettys, L. Hydrilla Management in Florida Lakes; U.S. Department of Agriculture UF/IFAS Extension: Gainesville, FL, USA, 2012.
10. Bain, M.B. Assessing impacts of introduced aquatic species: Grass carp in large systems. Environ. Manag. 1993, 17, 211–224. [CrossRef]

11. Helfrich, L.; Neves, R.; Libey, G.; Newcomb, T. Control Methods for Aquatic Plants in Ponds and Lakes. Available online: https://vtechworks.lib.vt.edu/handle/10919/48945 (accessed on 1 April 2019).
12. Blanco, A.; Qu, J.J.; Roper, W.E. Spectral signatures of hydrilla from a tank and field setting. Front. Earth Sci. 2012, 6, 453–460. [CrossRef]
13. Hänggi, T. Design of an Autonomous Sampling Boat for the Study of Algae Bloom in Lake Zurich. Master's Thesis, Swiss Federal Institute of Technology Zurich, Zurich, Switzerland, 2009.
14. Kho, S.J.; Manickam, S.; Malek, S.; Mosleh, M.; Dhillon, S.K. Automated plant identification using artificial neural network and support vector machine. Front. Life Sci. 2017, 10, 98–107. [CrossRef]
15. Pereira, L.A.M.; Nakamura, R.Y.M.; de Souza, G.F.S.; Martins, D.; Papa, J.P. Aquatic weed automatic classification using machine learning techniques. Comput. Electron. Agric. 2012, 87, 56–63. [CrossRef]
16. Hung, C.; Xu, Z.; Sukkarieh, S. Feature learning based approach for weed classification using high resolution aerial images from a digital camera mounted on a UAV. Remote Sens. 2014, 6, 12037–12054. [CrossRef]
17. Baron, J.; Hill, D.J.; Elmiligi, H. Combining image processing and machine learning to identify invasive plants in high-resolution images. Int. J. Remote Sens. 2018, 39, 5099–5118. [CrossRef]
18. Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016.
19. Nielsen, M. Neural Networks and Deep Learning; Determination Press: San Francisco, CA, USA, 2015.
20. Sun, Y.; Liu, Y.; Wang, G.; Zhang, H. Deep Learning for Plant Identification in Natural Environment. Comput. Intell. Neurosci. 2017, 2017, 7361042. [CrossRef] [PubMed]
21. Liu, W.; Wang, Z.; Liu, X.; Zeng, N.; Liu, Y.; Alsaadi, F.E. A survey of deep neural network architectures and their applications. Neurocomputing 2017, 234, 11–26. [CrossRef]
22. Convolutional Neural Network: 3 Things You Need to Know. Available online: https://www.mathworks.com/solutions/deep-learning/convolutional-neural-network.html (accessed on 6 June 2019).
23. Beverly, G.T. Development and Experimentation of an Herbicide Dispersal System for an Autonomous Aquatic Weed Management System. Master's Thesis, North Carolina State University, Raleigh, NC, USA, 2017.
24. Park, S.; Deyst, J.; How, J. A New Nonlinear Guidance Logic for Trajectory Tracking. In Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, Providence, RI, USA, 16–19 August 2004; pp. 1–16.
25. Jones, B. Plane: L1 Control for Straight and Curved Path Following. 2013. Available online: https://github.com/ArduPilot/ardupilot/pull/101 (accessed on 7 June 2019).
26. Khaled, N. Screen Record. MathWorks File Exchange; 2008. Available online: https://www.mathworks.com/matlabcentral/fileexchange/21216-screen-record (accessed on 8 November 2018).
27. Domingos, P. A Few Useful Things to Know About Machine Learning. Commun. ACM 2012, 55, 78–87. [CrossRef]
28. Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A Simple Way to Prevent Neural Networks from Overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958.
29. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1–9. [CrossRef]
30. Keskar, N.; Mudigere, D.; Nocedal, J.; Smelyanskiy, M.; Tang, P. On large-batch training for deep learning: Generalization gap and sharp minima. In Proceedings of the International Conference on Learning Representations, Toulon, France, 24–26 April 2017; pp. 1–16.
31. Understanding and Visualizing Convolutional Neural Networks. Available online: http://cs231n.github.io/understanding-cnn/ (accessed on 7 June 2019).
32. Dabrowski, A.; Stelzer, R. A Digital Interface for Imagery and Control of a Navico/Lowrance Broadband Radar. In Breizh Spirit, a Reliable Boat for Crossing the Atlantic Ocean; Springer: Berlin/Heidelberg, Germany, 2011; pp. 169–181.

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).