sensors
Article

Computationally Efficient Wildfire Detection Method Using a Deep Convolutional Network Pruned via Fourier Analysis

Hongyi Pan *, Diaa Badawi and Ahmet Enis Cetin *

Department of Electrical and Computer Engineering, University of Illinois at Chicago, Chicago, IL 60607, USA; [email protected]
* Correspondence: [email protected] (H.P.);
[email protected] (A.E.C.)

Received: 10 April 2020; Accepted: 15 May 2020; Published: 20 May 2020

Abstract: In this paper, we propose a deep convolutional neural network for camera-based wildfire detection. We train the network via transfer learning and use a window-based analysis strategy to increase the fire detection rate. To achieve computational efficiency, we compute the frequency responses of the kernels in the convolutional and dense layers and eliminate the filters whose impulse responses have low energy. Moreover, to reduce storage requirements on edge devices, we compare the convolutional kernels in the Fourier domain and discard similar filters using a cosine similarity measure in the frequency domain. We test the performance of the network on a variety of wildfire video clips; the pruned system performs as well as the full network in daytime wildfire detection, and it also works well on some nighttime wildfire video clips.

Keywords: wildfire detection; block-based analysis; transfer learning; Fourier analysis; pruning and slimming

1. Introduction

Early wildfire detection is of utmost importance to combat the unprecedented scale of wildfires happening all over the world. Recently, there has been notable interest in developing real-time algorithms to detect wildfires using regular video-based surveillance systems [1–20]. Video-based forest fire detection can replace traditional point-sensor detectors because a single pan-tilt-zoom camera can monitor a wide area and detect forest fire and smoke immediately after the start of a wildfire, as long as the smoke is within the viewing range of the camera.
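To make the two Fourier-domain pruning criteria summarized in the abstract concrete, the sketch below illustrates one possible implementation: first discarding filters whose frequency responses carry little energy, then removing filters whose frequency responses are nearly identical under a cosine similarity measure. This is a minimal illustration assuming NumPy; the kernel tensor layout, the FFT size, and both thresholds (energy_thresh, sim_thresh) are hypothetical choices for exposition and are not the values or the exact procedure used in this paper.

```python
import numpy as np

def prune_filters_by_fourier(kernels, energy_thresh=0.1, sim_thresh=0.95, fft_size=8):
    """Illustrative Fourier-domain pruning of convolutional kernels.

    kernels: array of shape (out_channels, in_channels, k, k).
    Returns the indices of filters to keep. Thresholds are hypothetical,
    not the values reported in the paper.
    """
    n_filters = kernels.shape[0]
    # Zero-padded 2-D FFT of each filter, summed over input channels.
    responses = np.fft.fft2(kernels, s=(fft_size, fft_size)).sum(axis=1)
    flat = responses.reshape(n_filters, -1)

    # Step 1: discard filters whose frequency response carries little energy.
    energies = np.sum(np.abs(flat) ** 2, axis=1)
    keep = [i for i in range(n_filters)
            if energies[i] >= energy_thresh * energies.max()]

    # Step 2: among the survivors, drop filters whose frequency responses
    # are nearly identical (cosine similarity close to 1).
    selected = []
    for i in keep:
        duplicate = False
        for j in selected:
            num = np.abs(np.vdot(flat[j], flat[i]))
            den = np.linalg.norm(flat[j]) * np.linalg.norm(flat[i]) + 1e-12
            if num / den > sim_thresh:
                duplicate = True
                break
        if not duplicate:
            selected.append(i)
    return selected
```

In this sketch the energy threshold is relative to the strongest filter in the layer, and duplicate removal keeps the first of each group of similar filters; other choices (absolute thresholds, keeping the higher-energy member of a similar pair) are equally possible under the same general idea.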