
Article

Development of UAV Tracing and Coordinate Detection Method Using a Dual-Axis Rotary Platform for an Anti-UAV System

Bor-Horng Sheu 1, Chih-Cheng Chiu 2, Wei-Ting Lu 1, Chu-I Huang 1 and Wen-Ping Chen 1,3,*

1 Department of Electrical Engineering, National Kaohsiung University of Science and Technology, Kaohsiung City 80778, Taiwan
2 Engineering Science and Technology, College of Engineering, National Kaohsiung University of Science and Technology, Kaohsiung City 80778, Taiwan
3 Ph.D. Program in Biomedical Engineering, Kaohsiung Medical University, Kaohsiung City 80778, Taiwan
* Correspondence: [email protected]

 Received: 22 May 2019; Accepted: 21 June 2019; Published: 26 June 2019 

Abstract: The rapid development of unmanned aerial vehicles (UAVs) has led to many security problems. In order to prevent UAVs from invading restricted areas or famous buildings, the anti-UAV defense system (AUDS) has been developed and has become a research topic of interest. Research in this area includes electromagnetic interference guns for UAVs, high-energy laser guns, the US military's net warheads, and AUDSs with net guns. However, these AUDSs use either manual aiming or expensive radar to trace drones. This research proposes a dual-axis mechanism that traces UAVs automatically. The tracing platform uses visual image processing technology to trace and lock the dynamic displacement of a drone. When a target UAV is locked, the system uses a nine-axis attitude meter and laser rangers to measure its flight altitude and calculates its longitude and latitude coordinates through sphere coordinates to provide drone monitoring for further defense or attack missions. Tracing tests were carried out with a DJI Mavic UAV flying at heights of 30 m to 100 m, with drone images captured and visually identified for tracing under various weather conditions by a thermal imaging camera and a full-color camera, respectively. When there is no cloud during the daytime, the images acquired by the thermal imaging camera and the full-color camera both provide high-quality identification results. Under dark, overcast weather, however, black clouds emit radiant energy that seriously affects image capture by the thermal imaging camera. When there is no cloud at night, the thermal imaging camera performs well in drone image capture. When the drone is traced and locked, the system can effectively obtain its flight altitude and its longitude and latitude coordinates.

Keywords: unmanned aerial vehicles (UAVs); dynamic coordinate tracing; computer vision; anti-UAV defense system

1. Introduction

Since Dà-Jiāng Innovations (DJI) released the DJI XP3.1 flight control system [1,2], its multi-rotor drones have been characterized by hovering, route planning, a comparatively low price, and ease of use [3,4]. These factors have led to the rapid development of drones in the consumer selfie drone market, and they are now adopted all over the world for environmental aerial photography and surveillance. However, with drones flying outside proper flight control, there have been many drone mishaps, both at home and abroad. News events have included intrusions into navigation areas, impacts on famous buildings, and threats to political figures, and the biggest worry is that people or terrorists may modify drones to make them carry dangerous bombs,

or use the vehicles themselves as extremely dangerous attack devices. The potential hazards of drones are an urgent problem [5,6], so countries have begun to impose strict regulations on drone traffic management and even to implement no-fly rules. The Federal Aviation Administration (FAA) and the similar authorities of China and Taiwan have all drawn up drone system usage regulations. In addition, for the sake of preventing and controlling illicit invasions by drones, research into AUDSs has progressively become an international research topic of interest [7–9]. There are many approaches to AUDS, such as spoofing attacks, tracing and early warning, detection, defeating, and signal interfering [10,11]. Examples include the scalable effects net warhead invented by the US military, which launches warheads that throw nets over enemy drones, and other anti-UAV technologies developed by scientists. Among them, devices such as radar, satellites, and thermal imaging cameras are used for early warning and detection, detecting and tracing a drone from the ground. These technologies carry out immediate detection, warning, and tracing of an intruding drone and provide real-time messages and information support [12–15]. Destroying intruding drones is another kind of approach, which first acquires the location of the drones and then defeats them using missiles, lasers, or other weapons. An example of this type of approach is a defense system using a high-energy laser developed by the Turkish company ASELSAN. Although the attack method has the effect of destruction and intimidation, there is a risk of accidental injury to nearby people if the UAV is shot down. Interfering with the communications of an intruding drone by electromagnetic interference methods is effective; however, it also affects the surrounding communications. Thus, the shortcoming is clear: the wireless communications around that area are affected, and the quality of life is also influenced. The scalable effects net warhead is a technique for quickly and accurately delivering a net to a remote target using a projectile. It includes a mesh-containing launcher, an ejection spring, and a releasable ogive. This net warhead can be launched from a suitable barrel using a propellant or compressed gas. The speed and rotation of the warhead are determined by the diameter of the barrel. When the warhead approaches the target, an electronic control board activates a servo motor, which pulls the central locking plunger to release the ogive. The spring then pushes the net out to spread over the target, resulting in its capture. There have been many reports of security issues caused by drone incidents, including terrorists around the world using drones to attack targets. Thus, there is an increasing market and an insistent need for AUDSs, which has led to the development of AUDS technologies. However, with limited resources, how to develop AUDS detection technologies for civil purposes is a very important research topic. This research proposes a dual-axis rotary tracing mechanism that adopts a thermal imaging camera and a full-color camera. With an image identification algorithm, the device traces and locks a drone by tracing its dynamic coordinates while the drone is within the range of visual detection.
A nine-axis attitude meter and laser rangers in the proposed device sense and then calculate the flight altitude of the drone, and spherical coordinates are used to obtain its longitude and latitude. By continuously locking the dynamic coordinates of the drone, a ground control station can arrange an attack drone to defeat the target drone with a net gun, laser gun, or other weapons such as rifles. This stage requires real-time transmission of the dynamic coordinates to the attack drone for a fast chase. This paper proposes a different approach.

2. System Architecture

The research structure of this paper consists of three units: the dual-axis mechanism, the drone identification and tracing approach, and the drone three-dimensional coordinate calculation method. A detailed description of each unit follows.

2.1. The Dual-Axis Rotary Platform Mechanism

The external design of the proposed dual-axis rotary device is shown in Figure 1. There are two axes: the lateral axis is the x-axis (yaw) and the longitudinal axis is the y-axis (pitch). The rotation of yaw and pitch drives the movement of the image (the picture acquired by the thermal imaging or full-color camera). The image provides flying drone detection with a viewing angle range of 360° for yaw rotation and 180° for pitch rotation. In order to improve the detection capability of the drone, there is a mechanism frame inside the dual-axis mechanism, which is equipped with a fine-tuning device (three sets of laser rangers), a nine-axis attitude meter, a thermal imaging camera, and a full-color camera for drone image capture and position detection. The fine-tuning mechanism is used to provide drone tracing, and the laser ranger is adopted for focusing precision adjustment. In drone image focus and distance measurement, because the image acquisition lens and the laser rangers are not on the same line, distance measuring of a drone in high-altitude flight needs the fine-tuning mechanism for the focus. First, the photographic lens is adopted as the focus point of the center of the image frame; then the three laser rangers, set close to the lens, are adjusted by the fine-tuning mechanism to the focus distance of the drone to be acquired by the tracing device, so that the hitting spot of the laser ranger on the drone overlaps with the center of the image. The GPS and LoRa antennas are placed outside the tracing device to avoid metal disturbance and to prevent signal shielding by the thick casing of the tracing device.

Figure 1. Design of the dual-axis mechanism (labeled components: infrared thermal imaging camera, full-color camera, laser ranger sets, pitch rotation, slip ring, yaw rotation tracking platform).

2.2. Drone Visual Image Tracing Method

The drone visual image tracing process identifies a drone in motion using images acquired by a thermal imaging camera or a full-color camera. These cameras inside the dual-axis mechanism were adopted for image acquisition in this study. After an image is acquired, vision image processing is performed, and drone tracing is then carried out through motor control of the above platform. The dual-axis tracing device adopts image processing techniques to deal with images of a dynamic object. An image capture card in the device acquires pictures at a speed of 30 frames per second and then performs image processing, as shown in the flowchart in Figure 2. The acquisition of a dynamic drone contour [15,16] is carried out by the frame difference method with image pre-processing.

Figure 2. Drone image tracing flowchart: start → UAV image capture → UAV grayscale → UAV image filtering → threshold processing → dilation processing → frame difference → detecting UAV moving objects; if a moving object is found, calculate the center point of the UAV, track the UAV, calculate the amount of displacement, and transmit the amount of displacement; otherwise return to image capture.

First, the original image, as shown in Figure 3a, is converted to a grayscale image Sgray by RGB conversion [15], as shown in Figure 3b. The formula is:

Sgray = 0.299R + 0.587G + 0.114B (1)

where R, G, and B are the three primary color pixel values of the original image. Gaussian and median filters are then applied to the grayscale image to remove noise and smooth the image, avoiding misjudgment of subsequent images due to noise, as shown in Figure 3c. Processing Figure 3c as a histogram reveals a highest value in the histogram, which is taken as the threshold value T. After threshold processing, a binary image can be obtained, as shown in Figure 3d.

After threshold processing is performed, the dilation method from morphology can be used to segment independent image elements of the UAV so that the UAV is highlighted in the picture for better tracking by the system. The dilation operation is shown in Equation (2), where A is the original image and B is the mask. The core operation is that if there is an object at a neighboring point, the position is set as an object:

A ⊕ B = {z | (B̂)z ∩ A ≠ ∅} (2)

Figure 3e shows the image after dilation (morphology), which fills the gaps in the broken UAV image. The frame difference method allows extraction of a motion region [17,18] by performing differential operations on two adjacent images. It detects the movement of anomalous objects [19] in the picture through the movement of the target and the movement of the camera, which produce a significant difference between adjacent frames. For real-time tracing, due to the fast drone flight speed, a small amount of immediate calculation is needed to trace the drone steadily. An advantage of the frame difference method is that the background accumulation problem can almost be ignored; its renewal speed is quick, and little calculation is required for tracing a drone immediately. The result is shown in Figure 3f. The equation is as follows:

D(x, y) = fm+1(x, y) − fm(x, y) (3)

where D(x, y) is the result of the differential operation between the mth frame and the (m+1)th frame.

Figure 3. Drone image acquisition procedures: (a) full-color image; (b) grayscale acquisition; (c) image filtering; (d) threshold processing; (e) dilation processing; (f) frame difference method processing.
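The steps in Figure 3 map directly onto standard OpenCV calls, which the authors state they used (see Section 3.1). The following is a minimal sketch of that pipeline under stated assumptions: the histogram-peak threshold T is approximated here with Otsu's method, and the function name and kernel sizes are illustrative choices, not taken from the paper.

```python
import cv2
import numpy as np

def detect_moving_drone(prev_frame, curr_frame):
    """Return the binary motion mask and the centroid of the largest moving blob."""
    grays = []
    for frame in (prev_frame, curr_frame):
        # Equation (1): Sgray = 0.299R + 0.587G + 0.114B (BT.601 weights)
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Gaussian and median filtering to remove noise before differencing
        gray = cv2.GaussianBlur(gray, (5, 5), 0)
        grays.append(cv2.medianBlur(gray, 5))
    # Frame difference method, Equation (3): D(x, y) = f_{m+1}(x, y) - f_m(x, y)
    diff = cv2.absdiff(grays[1], grays[0])
    # Threshold at T (Otsu's method stands in for the histogram-peak rule)
    _, binary = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Equation (2): dilation fills gaps in the broken UAV silhouette
    binary = cv2.dilate(binary, np.ones((3, 3), np.uint8), iterations=2)
    # The centroid of the largest contour approximates the drone center (x2, y2)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return binary, None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return binary, None
    return binary, (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
```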

The drone tracing mechanism of the dual-axis tracing device converts a two-dimensional variation of the pixels of a captured image (∆x, ∆y) into a step-level control amount for the two-axis step motor inside the device. A movement amount is calculated from this conversion of acquired images, which is applied to complete the drone tracing operation. Suppose the image's resolution is set as 2P × 2Q pixels, as shown in Figure 4. The acquired screen image's origin point is set as (x0, y0), and its pixel coordinate is (−P, −Q). The screen's center point is (x1, y1), which is the concentric point of the tracing screen, and its pixel coordinate is defined as (0, 0). Therefore, the pixels of the screen image range from −P~+P on the x-axis and from −Q~+Q on the y-axis. When the target drone's image enters the edge of the rectangular frame of the tracing device, it is acquired by the visual identification of the system. The tracing system instantly estimates the pixel difference (p, q) along the x-axis and the y-axis between the drone's center position (x2, y2) and the concentric point (x1, y1): (x2 − x1, y2 − y1). When the p value is positive, the center point of the target drone is on the right side of the center point of the screen image, and vice versa for the left side. Furthermore, when the q value is positive, the center point of the target drone is on the lower side of the center point of the screen image, and vice versa.

Figure 4. Diagram of the target drone displacement vector.

In order to trace the invading drone's dynamic position at any time, three-dimensional coordinate parameters such as the flight altitude of the drone and its longitude and latitude are calculated, and these parameters are immediately transmitted to the ground monitoring center for a direct attack on, or an indirect hunt for, the drone. The control requirements for the two-axis step motor in the dual-axis tracing device are as follows:

Suppose the position of the concentric point of the screen image is (x1, y1) and the position of the drone is (x2, y2), as shown in Figure 4. The system traces the drone in real time, and the required motor control speed of the dual-axis device is:

Vyaw rotation = −p/∆t and Vpitch rotation = −q/∆t (4)

where ∆t is the duration for the image capture card to acquire images, which is normally 30 frames per second; that is, ∆t = 1/30 s. The smaller the value of ∆t, the more accurately the tracing system can grasp the drone's movement trajectory, so that the concentric point of the tracing device's alignment screen can lock the drone as soon as possible, but this relatively requires faster software computing. The system has to convert the pixel difference (p, q) of the image into the tracing step angle of the two-axis step motor of the tracing device; that is, the image coordinates are converted from pixel values (p, q) to the numbers of steps (Sp, Sq) that drive the two-axis step motor, where Sp and Sq represent the step numbers of the step motor for the yaw rotation and pitch rotation, respectively. Since the step motor rotates 360° in one turn, assuming that SR is the total number of steps in one circle of the step motor, the step angle θm of each step motor is:

θm = 360°/SR (5)

Because the necessary θm value of the step motor in the tracing device is related to the pixel value acquired by the image capture card, it is important to find the correlation between these two quantities. First, suppose the full-color (or thermal imaging) camera acquires a picture in a certain direction, as shown in Figure 5a. Then, suppose the driver of the step motor in the tracing device sends a square wave signal with the number Sp to the step motor of the x-axis, which drives that motor to rotate Sp steps. This makes the dual-axis device carry out a horizontal rightward movement, after which it acquires the immediate picture, as shown in Figure 5b. Finally, by superimposing the images of Figure 5a,b and calculating the increment of the p-value between these two images along the x-axis after the step motor of the x-axis rotates Sp steps, as shown in Figure 5c, the tracing system obtains the number of steps pS of the step motor necessary for each pixel in the yaw rotation direction:

pS = Sp/p (6)

Figure 5. Schematic diagram of the relationship between the step angle and the image: (a) previous image; (b) image after displacement (the motor right-shifted by Sp steps); (c) the two images overlapping, with p pixels of image shift corresponding to the Sp steps of the step motor's right shift.

Furthermore, the number of steps of the step motor necessary for each pixel in the pitch rotation direction is qS:

qS = Sq/q (7)
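Equations (4)–(7) amount to a small amount of arithmetic per frame. The sketch below is a hypothetical illustration rather than the authors' firmware; it assumes the calibration constants pS and qS have already been measured by the Figure 5 procedure, and the names are ours:

```python
DT = 1.0 / 30.0               # capture interval dt: the card delivers 30 frames/s
S_R = 10_000                  # total steps per motor revolution (see Section 3.2)
STEP_ANGLE_DEG = 360.0 / S_R  # Equation (5): theta_m = 360 deg / S_R = 0.036 deg

def pixel_error(center, drone):
    """(p, q) = (x2 - x1, y2 - y1): the drone's offset from the concentric point."""
    return drone[0] - center[0], drone[1] - center[1]

def steps_to_drive(p, q, ps, qs):
    """Steps per axis from Equations (6)-(7): pS = Sp/p, so Sp = p*pS; Sq = q*qS."""
    return round(p * ps), round(q * qs)

def required_speeds(p, q, dt=DT):
    """Equation (4): speeds needed to null the pixel offset within one frame time."""
    return -p / dt, -q / dt
```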

2.3. Drone Three-Dimensional Coordinate Calculation

(1) Detection method for drone height: The rotation direction and angle judgment mode of the dual-axis tracing device are based on the values sensed by the nine-axis attitude meter. Therefore, there is a close relationship between the placement position of the attitude meter and the accuracy of the system data. The starting point of orientation determination by an attitude sensing component is based on the preset position of its own module. The drone movement orientation on the screen image is based on the movement of the lens, so the sensing component of the attitude meter should be placed as close as possible to the camera lens, so that the angular deviation will be smaller. After many tests, we found that it has the best effect when placed under the lens.

When the tracing device finds that the drone has entered the visual image of the screen, as in Figure 6, the system immediately traces the drone image in the screen through visual identification and computes the relative relationship between the drone image and the center point of the visual image screen. Then, it performs target drone tracing and position locking. The drone image is brought to the center point (x1, y1) of the visual image screen of the tracing device, and the laser ranger can measure the target drone once it enters the laser ranging range. The distance value (r) from the target is obtained, and the pitch angle (θ) of the rotating stage at this time is obtained by the nine-axis attitude meter in the apparatus. As shown in Equation (8), the flight altitude h of the drone can then be converted:

h = r × sinθ = √((x2 − x1)² + (y2 − y1)²) × sinθ (8)

Figure 6. Schematic diagram of the calculation of the height of the drone on the image concentric point by the attitude meter and the laser ranger (pitch rotation through pitch angle θ, laser ranger sets aimed from (x1, y1) toward the drone at (x2, y2), attitude indicator below).

(2) Detection method for drone longitude and latitude coordinates as well as azimuth: When the tracing system locks the target drone to the center point position (x1, y1) of the image screen through tracing, the target drone has entered the laser ranging range. The values of the distance and the horizontal elevation angle of the target drone from the test site can be obtained by the laser rangers and the attitude meter of the dual-axis device, and then the longitude and latitude, and the azimuth angle (relative to north), of the drone can be analyzed. In this paper, the drone longitude and latitude coordinates [20,21] are obtained by the target coordinate conversion equation. A schematic diagram of the longitude and latitude coordinate algorithm of the target drone is shown in Figure 7. This algorithm is applicable to the distance calculation between any two points on a spherical surface, where O is the center point of the sphere, and A and B are any two points on the sphere. Point A represents the longitude and latitude coordinates (WA, JA) obtained by the GPS in the dual-axis tracing platform, and point B represents the unknown longitude and latitude coordinates (WB, JB) of the drone traced by the dual-axis tracing platform. The longitude and latitude coordinates of point B can be obtained by the sphere longitude and latitude coordinate conversion [21]. The other parameters are as follows:

Point G: the true north position of the sphere; point O: the center of the sphere; R: the average radius of the earth (~6371 km);
(WA, JA): the values of the longitude and latitude of point A obtained by the GPS of the dual-axis device;
ψ (azimuth angle of points A, B, and G): obtained by the magnetometer and gyroscope of the nine-axis attitude meter in the dual-axis device;
∠a: the angle between the two points B and G and the ground connection GO;
∠b: the angle between the two points A and G and the ground connection GO;
∠c: the angle between the two points A and B and the ground connection GO;
ÂB: the spherical distance between point A (dual-axis device) and point B (drone aerial location); that is, the length of the minor arc ÂB in the arc generated by the intersection of the sphere with the plane passing through the three points A, O, and B.

∠c as a metric angle is:

∠c = (ÂB/R) × (180°/π) (9)

Figure 7. Schematic diagram of the longitude and latitude sphere coordinate algorithm for the drone.

Since the spherical distance between point A (dual-axis device) and point B (drone aerial location) is much smaller than the length of R, the spherical distance ÂB between the two points can be approximated as the linear distance AB between them, so Equation (9) can be corrected to:

∠c = (AB/R) × (180°/π) (10)

∠a and ∠θ can be converted by the sphere longitude and latitude coordinates [21], and Equations (11) and (12) can be obtained as follows:

∠a = cos⁻¹(sinWA × cos∠c + cosWA × sin∠c × cosψ) (11)

∠θ = sin⁻¹((sin∠c × sinψ)/sin∠a) (12)

Since the drone flight altitude at point B is within a few kilometers above the ground, the lengths between the two points A and B and the point O at the center of the earth can be regarded as the same, and ∠θ can be regarded as the longitude difference between the two points A and B. Thus, the longitude and latitude of point B (WB, JB) are the longitude and latitude coordinates of the drone, where:

JB = JA + ∠θ (13)

As described above, the longitude and latitude values of the drone flying in the air, which is locked by the dual-axis device, and the flight altitude h can be calculated and acquired, then transmitted wirelessly to the ground control center for the attack tasks of direct attack or tracing and entrapping.
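Equations (10)–(13) can be collected into one short routine. The sketch below is our reading of the paper's algorithm, not its code; in particular, recovering WB as 90° − ∠a follows from ∠a being the colatitude of point B and is an inference we make explicit here:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # average Earth radius R (~6371 km)

def drone_lat_lon(lat_a_deg, lon_a_deg, ground_dist_m, azimuth_deg):
    """Project point A's GPS fix (WA, JA) along azimuth psi by distance AB
    to the drone's coordinates (WB, JB), per Equations (10)-(13)."""
    wa = math.radians(lat_a_deg)
    psi = math.radians(azimuth_deg)
    c = ground_dist_m / EARTH_RADIUS_M                  # Eq. (10), in radians
    # Eq. (11): colatitude of point B on the sphere
    a = math.acos(math.sin(wa) * math.cos(c)
                  + math.cos(wa) * math.sin(c) * math.cos(psi))
    # Eq. (12): longitude difference between points A and B
    theta = math.asin(math.sin(c) * math.sin(psi) / math.sin(a))
    wb = 90.0 - math.degrees(a)           # latitude of B (inferred from angle a)
    jb = lon_a_deg + math.degrees(theta)  # Eq. (13): JB = JA + angle theta
    return wb, jb
```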

3. Experimental Results and Discussion

The drone tracing test results of the dual-axis device, software, and hardware used in the study are as follows:

3.1. Development Environment of the Software and Hardware

(1) Visual identification software development: This study uses OpenCV (the open-source computer vision library), a library for computer vision and machine learning. It was originally initiated and developed by the Intel Corporation and is distributed under the BSD (Berkeley Software Distribution) license, so it can be used free of charge in business and research. OpenCV includes many image processing functions, machine learning algorithms, and libraries for computer vision applications.

(2) Sensors and imaging equipment:

• Infrared thermal imaging: As shown in Figure 8, this paper uses an intermediate-level infrared thermal imaging camera (TE–EQ1), which is equipped with a 50 mm fixed-focus lens with 384 × 288 pixels. It can be used for mobile tracing tests with a drone flying altitude of about 100 m. The TE–EQ1 is capable of observing the distribution of heat sources in drones, supporting drone tracing tasks through computer vision technology.
• Full-color single-lens reflex camera: This study uses a Sony α7SII (Japan) with a Sony SEL24240 lens, as shown in Figure 9. The Sony α7SII features ultra-high sensitivity and an ultra-wide dynamic range. It has 35 mm full-frame 12.2 megapixel image quality, with a wide dynamic range of ISO 50 to 409,600 and a BIONZ X processor, which optimizes the performance of the sensor, highlights details, and reduces noise, and it records 4K (QFHD: 3840 × 2160) movies in full-frame read-out without image pixel merging, effectively suppressing image edge aliasing and moiré. The main reason for choosing this device and lens is that the Sony α7SII output can be transmitted via HDMI, achieving high-quality instant image transmission with less camera transmission delay than others of a similar price, allowing instant acquisition and processing of images. Additionally, the Sony SEL24240 telephoto lens has a focal length of 240 mm, allowing drones at short distances (<100 m) to be clearly displayed on the tracing screen.
• Thermal image acquisition: The UPG311 UVC image acquisition device is used to read the infrared thermal imaging images. The infrared thermal imaging output interface is NTSC or PAL. It is compatible with multiple systems and supports plug-and-play and UVC (USB video device class). The protocol supports an image resolution of up to 640 × 480 pixels.
• Frame grabber for the full-color camera: We used the Magewell USB Capture HDMI Gen2 capture card, which supports simultaneous connection of multiple units to a single machine and is compatible with Windows, Linux, and OS X operating systems with no need to install drivers (true Plug and Play). It supports a completely standard development interface. In addition, the input and output interfaces use HDMI and USB 3.0, respectively; it supports an input image resolution of up to 2048 × 2160 and a frame rate of up to 120 fps, and it automatically selects the aspect ratio that is most appropriate for the picture.
• Laser ranger: This article uses the LRF 28-2000 semiconductor laser ranger, as shown in Figure 10. The main reason for using this laser ranger is that it uses a 900~908 nm wavelength band to protect the human eye, with a range of 3.5 m to as long as 2.0 km. Its measurement resolution and measurement accuracy are 0.1 m and 1.0 m, respectively. Its specifications are suitable for measuring the flight altitude of the short-range drones in this study, and its significant distance measurement accuracy can be used as a basis for testing the drone flight altitude.
• Multirotor: At present, the market share of multirotors is dominated by DJI Dajiang Innovation. With its drones being lightweight, portable, small in volume, and sold at a friendly price, the brand is very popular among people generally. Additionally, the news media have reported that most drone events have involved drones from DJI Dajiang Innovation, so this study uses the DJI Mavic Pro multirotor as the main tracing target of the experiment.

Figure 8. TE–EQ1 infrared thermal imaging device.

Figure 9. Single-lens reflex camera: (a) body: Sony α7SII; (b) lens: Sony SEL24240.

Figure 10. Laser Ranger LRF 28-2000B.

3.2. Development of Hardware for the Dual-Axis Device

To make it shockproof, robust, and waterproof, the outer casing of the dual-axis device in this paper is made of Teflon. The device is equipped with two sets of step motors (each including its step motor driver). These two step motors respectively control the up-and-down elevation angle and the left-and-right rotation of the dual-axis device. The full-color image used in this device is 640 × 480 pixels; that is, P = 640 and Q = 480, and a two-phase step motor with SR = 10,000 is used; that is, θm = 0.036°. The test height of drone tracing is 10~100 m, and the radius of the dual-axis device is a = 0.25 m. The maximum flight speed of the drone is 20 km/h. As a result, the maximum rotation speed of the dual-axis device is Vyaw,max = (s/Rmin) × Vuav ≈ 0.139 (m/s).
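Reading s as the platform radius a = 0.25 m and Rmin as the minimum tracing height of 10 m (our interpretation of the symbols, which the extracted text leaves ambiguous), the quoted figure is reproduced:

```latex
V_{yaw,\max} = \frac{a}{R_{\min}} \times V_{uav}
             = \frac{0.25\ \mathrm{m}}{10\ \mathrm{m}} \times \frac{20\ \mathrm{km/h}}{3.6}
             \approx 0.025 \times 5.56\ \mathrm{m/s} \approx 0.139\ \mathrm{m/s}
```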

Figure 11a shows the dual-axis tracing device, Figure 11b shows the hardware structure inside the dual-axis tracing device, and Figure 11c shows the focus fine-tuning structure for the laser ranger.

Figure 11. The inner hardware structure of the dual-axis tracing device: (a) the dual-axis tracing device; (b) appearance of the front view of the image lens/laser ranger (laser ranger sets, infrared thermal imager, full-color camera, camera focus motor, control board, and image output/input and system power contacts); (c) the focus fine-tuning configuration of the laser ranger (screws adjust lens pitch and yaw).

The main control board of the dual-axis tracing device is shown in Figure 12a and the motor control unit is shown in Figure 12b.

Figure 12. 3D simulation and physical circuit picture of the main control unit and motor control unit: (a) main control board; (b) motor control board.

3.3. Drone Image Identification and Tracing Test in Different Weather Environments

This paper uses the DJI Mavic Pro four-axis drone for tracing tests of drone flight in all kinds of weather. The results of tracing tests on sunny days at flight altitudes of about 70–105 m are shown in Figure 13. At more than 102 m, drone movement cannot be determined (restricted by the resolution of the full-color camera). The test results for various weather conditions (sunny, cloudy, and rainy) are shown in Figures 14–16. The detailed test description is as follows:

Figure 13. Tests of drone flight altitudes (on a sunny day): (a) the distance is 72 m; (b) the distance is 86 m; (c) the distance is 94 m; (d) the distance is 102 m.

(1) Sunny environment: As shown in Figure 14, the acquisition of the drone's image on sunny days varies with the brightness and shading conditions. In the case of large changes in ambient brightness, the effect of light on the lens and tracing is quite large. In addition, the size of the drone is small, which can cause the target drone to be lost during image processing and then recovered.


Figure 14. Results of the tracing test on sunny days (drone flight altitude is 105 m).

(2) Cloudy environment: Figure 15 shows the acquisition effects of the drone's image on cloudy days. The cloudy background is clean and free of other interference. In addition, the difference in grayscale presented by the sky is not obvious, making the tracing target more significant. The tracing effect was good, and the target drone was not easy to lose.

Figure 15. Results of the tracing test on cloudy days (drone flight altitude is 130 m).

(3) Rainy environment: Figure 16 shows the acquisition effects of the drone’s image on rainy days. In general, no drone appears on rainy days (a normal drone is usually not waterproof); drone players usually choose to take off in clear weather. In the case of rain, the lens is susceptible to raindrops. Coupled with poor vision, it is impossible to clearly present the tracing image, and it is difficult to judge the target, which leads to a poor tracing effect.


Figure 16. Results of the tracing test on rainy days (drone flight altitude is 80 m).

3.4. Test of Drone Tracing and Longitude and Latitude Coordinates

Figure 17 shows the tracing process through a full-color image. A target drone was successfully hit with the laser rangers in this study. The sensed values of the nine-axis attitude meter and the known longitude and latitude coordinates (WA, JA) = (120.328766, 22.648851) from the GPS of the dual-axis device were converted through the longitude and latitude sphere coordinate algorithm. It can be calculated that when the drone is locked by the tracing system, its current longitude and latitude coordinates (WB, JB) (based on Equations (12) and (13)) are (120.328766, 22.648394), and the flying altitude is 58.7 m.
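Equations (12) and (13) themselves appear earlier in the paper and are not reproduced in this section. As a rough guide to the computation they perform, the following sketch estimates the target's coordinates and altitude from the observer's GPS fix, the platform's azimuth and elevation (from the nine-axis attitude meter), and the laser-ranged slant distance, using standard spherical-Earth dead-reckoning formulas as an assumption; the locate_target helper and its variable names are illustrative, not the authors' code.

```python
import math

EARTH_R = 6_371_000.0  # mean Earth radius (m); spherical-Earth assumption

def locate_target(lat_a, lon_a, azimuth_deg, elevation_deg, slant_m,
                  device_alt_m=0.0):
    """Estimate the target's (latitude, longitude, altitude) from the
    observer's GPS fix, the platform pointing angles, and the laser range."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)

    # Split the slant range into ground distance and height gain.
    ground_m = slant_m * math.cos(el)
    target_alt_m = device_alt_m + slant_m * math.sin(el)

    # Great-circle dead reckoning from point A along the azimuth.
    phi1, lam1 = math.radians(lat_a), math.radians(lon_a)
    delta = ground_m / EARTH_R  # angular ground distance
    phi2 = math.asin(math.sin(phi1) * math.cos(delta)
                     + math.cos(phi1) * math.sin(delta) * math.cos(az))
    lam2 = lam1 + math.atan2(math.sin(az) * math.sin(delta) * math.cos(phi1),
                             math.cos(delta) - math.sin(phi1) * math.sin(phi2))
    return math.degrees(phi2), math.degrees(lam2), target_alt_m
```

For example, with the device at latitude 22.648851 and longitude 120.328766, a due-south azimuth of 180°, an elevation of about 49°, and a slant range of roughly 78 m (pointing values back-calculated here purely for illustration, since the paper reports only the results), locate_target returns values close to the reported (WB, JB) = (120.328766, 22.648394) and the 58.7 m altitude. Note that the paper writes coordinate pairs in (longitude, latitude) order.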


Figure 17. Test results of the full-color tracing interface.

Figure 18 shows the tracing process through a thermal image at the same place. A target drone was successfully hit with the laser rangers of the dual-axis device. The sensed values of the nine-axis attitude meter and the known longitude and latitude coordinates (WA, JA) = (120.328766, 22.648851) from the GPS of the device were converted through the longitude and latitude sphere coordinate algorithm. It can be calculated that when the drone is locked by the tracing system, its current longitude and latitude coordinates (WB, JB) (based on Equations (12) and (13)) are (120.328697, 22.648524), and the flying altitude is 51.9 m.

Figure 18. Test results of the thermal imaging tracing interface.

4. Conclusions

As the number of drones increases, so does the risk to flight safety. It is only a matter of time before a drone is used to attack aircraft and cause damage. Governments are worried that the increasing number of amateur drones will lead to more flight accidents. Therefore, many governments now regulate drone registration, pilot licensing, and so forth, and also limit flight range and maximum flight altitude. In addition, various anti-UAV systems have been proposed, but these AUDSs either use manual aiming or expensive radar for tracing drones. This study uses two sets of step motors with a thermal imaging camera/full-color camera and a sensing module for the dynamic tracing of drones flying at high altitudes and the measurement of their three-dimensional coordinates, which makes it a low-cost and practical drone tracing device for anti-UAV systems. Future research will combine an attack drone and a net gun with a throwing function to provide a safe anti-UAV system with tracing and capturing of drones.

Based on the results of this study in UAV positioning and dynamic locking of the moving coordinates of the UAV, high-energy laser guns or electromagnetic interference guns will be applied for direct targeting attacks in future work. In addition, real-time transmission of the target UAV's dynamic coordinate information to an attack UAV will be implemented for fast pursuit. For tracking of stealthy, hidden UAVs, we will try to locate the UAV's controller remotely by three-point positioning to eliminate the potential threat of the hidden UAV. The future aim of the project is to make the anti-UAV defense system more practical.

Author Contributions: Conceptualization, W.-P.C. and B.-H.S.; methodology, W.-P.C. and C.-C.C.; software, C.-I.H.; validation, W.-P.C., B.-H.S. and W.-T.L.; formal analysis, W.-P.C. and B.-H.S.; investigation, C.-I.H.; resources, C.-C.C.; data curation, C.-I.H.; writing—original draft preparation, B.-H.S.; writing—review and editing, W.-P.C.; visualization, W.-T.L.; supervision, C.-I.H.; project administration, W.-P.C.; funding acquisition, B.-H.S. and C.-C.C.

Funding: This research was funded by the Ministry of Science and Technology and the Southern Taiwan Science Park Bureau of Taiwan, grant numbers MOST 107-2622-E-992-006-CC2 and 107B19.

Conflicts of Interest: The authors declare no conflict of interest.

References


1. Boh, W.F.; Lim, W.K.; Zeng, Y. Da Jiang Innovations (DJI): The rise of the drones. In The Asian Business Case Centre; Nanyang Technological University: Singapore, 2017; pp. 1–28.
2. Chin, K.S.H.; Siu, A.C.Y.; Ying, S.Y.K.; Zhang, Y. Da Jiang Innovation, DJI: The Future of Possible. Acad. Asian Bus. Rev. 2017, 3, 83–109. [CrossRef]
3. Diaz, T.J. Lights, drone... action. IEEE Spectr. 2015, 52, 36–41. [CrossRef]
4. Lort, M.; Aguasca, A.; López-Martínez, C.; Marín, T.M. Initial Evaluation of SAR Capabilities in UAV Multicopter Platforms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 127–140. [CrossRef]
5. Dempsey, P. View from Washington [News Comment]. Eng. Technol. 2015, 10, 14. [CrossRef]
6. Dempsey, P. View from Washington [News Briefing]. Eng. Technol. 2014, 9, 16. [CrossRef]
7. Kratky, M.; Minarik, V. The non-destructive methods of fight against UAVs. In Proceedings of the 2017 International Conference on Military Technologies (ICMT), Brno, Czech Republic, 31 May–2 June 2017; pp. 690–694.
8. Foreign Counter-Unmanned Aerial Systems: Developments in the International Arms Markets. Available online: https://www.ida.org/research-and-publications/publications/all/f/fo/foreign-counter-unmanned-aerial-systems-developments-in-the-international-arms-markets (accessed on 21 May 2019).
9. Ding, G.; Wu, Q.; Zhang, L.; Lin, Y.; Theodoros, T.A.; Yao, Y. An Amateur Drone Surveillance System Based on Cognitive Internet of Things. IEEE Commun. Mag. 2018, 56, 29–35.
10. About the Center for the Study of the Drone. Available online: https://dronecenter.bard.edu/publications/counter-drone-systems/ (accessed on 21 May 2019).
11. Sebastain, P.; Andrei, L. Considerations Regarding Detection and Combat System for UAV's. Recent 2017, 18, 49–55.
12. Ramos, D.B.; Loubach, D.S.; da Cunha, A.M. Developing a distributed real-time monitoring system to track UAVs. IEEE Aerosp. Electron. Syst. Mag. 2010, 25, 18–25. [CrossRef]
13. Xu, Z.; Wei, R.; Zhao, X.; Wang, S. Coordinated Standoff Target Tracking Guidance Method for UAVs. IEEE Access 2018, 6, 59853–59859. [CrossRef]
14. Stary, V.; Krivanek, V.; Stefek, A. Optical detection methods for laser guided unmanned devices. J. Commun. Netw. 2018, 20, 464–472. [CrossRef]
15. Sheu, B.H.; Chiu, C.C.; Lu, W.T.; Lien, C.C.; Liu, T.K.; Chen, W.P. Dual-axis Rotary Platform with UAV Image Recognition and Tracking. Microelectron. Reliab. 2019, 95, 8–17. [CrossRef]

16. Honkavaara, E.; Eskelinen, M.A.; Pölönen, I.; Saari, H.; Ojanen, H.; Mannila, R.; Holmlund, C.; Hakala, T.; Litkey, P.; Rosnell, T.; et al. Remote Sensing of 3-D Geometry and Surface Moisture of a Production Area Using Hyperspectral Frame Cameras in Visible to Short-Wave Infrared Spectral Ranges Onboard a Small Unmanned Airborne Vehicle (UAV). IEEE Trans. Geosci. Remote Sens. 2016, 54, 5440–5454. [CrossRef]
17. Yin, H.; Chai, Y.; Yang, S.X.; Yang, X. Fast-moving target tracking based on mean shift and frame-difference methods. J. Syst. Eng. Electron. 2011, 22, 587–592. [CrossRef]
18. Du, B.; Sun, Y.; Cai, S.; Wu, C.; Du, Q. Object Tracking in Satellite Videos by Fusing the Kernel Correlation Filter and the Three-Frame-Difference Algorithm. IEEE Geosci. Remote Sens. Lett. 2018, 15, 168–172. [CrossRef]
19. Chen, P.; Dang, Y.; Liang, R.; Zhu, W.; He, X. Real-Time Object Tracking on a Drone with Multi-Inertial Sensing Data. IEEE Trans. Intell. Transp. Syst. 2018, 19, 131–139. [CrossRef]
20. Chang, X.; Haiyan, X. Notice of Retraction: Using sphere parameters to detect construction quality of spherical buildings. In Proceedings of the 2010 2nd International Conference on Advanced Computer Control, Shenyang, China, 27–29 March 2010; Volume 4, p. 17.
21. Oraizi, H.; Soleimani, H. Optimum pattern synthesis of non-uniform spherical arrays using the Euler rotation. IET Microw. Antennas Propag. 2015, 9, 898–904. [CrossRef]

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).