
Sådhanå (2018) 43:103
https://doi.org/10.1007/s12046-018-0888-9
© Indian Academy of Sciences

Single-solution Simulated Kalman Filter algorithm for global optimisation problems

NOR HIDAYATI ABDUL AZIZ1,2,*, ZUWAIRIE IBRAHIM3, NOR AZLINA AB AZIZ1, MOHD SABERI MOHAMAD4 and JUNZO WATADA5

1 Faculty of Engineering and Technology, Multimedia University, 75450 Bukit Beruang, Melaka, Malaysia
2 Faculty of Electrical and Electronics Engineering, Universiti Malaysia Pahang, 26600 Pekan, Pahang, Malaysia
3 Faculty of Manufacturing Engineering, Universiti Malaysia Pahang, 26600 Pekan, Pahang, Malaysia
4 Faculty of Bioengineering and Technology, Universiti Malaysia Kelantan, Jeli Campus, Locked Bag 100, 17600 Jeli, Kelantan, Malaysia
5 Department of Computer and Information Sciences, Universiti Teknologi PETRONAS, Seri Iskandar, 32610 Tronoh, Perak, Malaysia
*For correspondence; e-mail: [email protected]

MS received 26 January 2017; revised 28 September 2017; accepted 6 October 2017; published online 18 June 2018

Abstract. This paper introduces the single-solution Simulated Kalman Filter (ssSKF), a new single-agent optimisation algorithm inspired by the Kalman Filter, for solving real-valued numerical optimisation problems. The proposed ssSKF algorithm improves upon the original population-based Simulated Kalman Filter (SKF) algorithm by operating with only a single agent and by having fewer parameters to be tuned. In the proposed ssSKF algorithm, the initialisation parameters are not constants; instead, they are produced by random numbers drawn from a normal distribution and confined to the range [0, 1], which removes the need to tune them. To balance exploration and exploitation, the ssSKF algorithm uses an adaptive neighbourhood mechanism during its prediction step. The proposed ssSKF algorithm is tested using the 30 benchmark functions of CEC 2014, and its performance is compared with that of the original SKF algorithm, the Black Hole (BH) algorithm, the Particle Swarm Optimisation (PSO) algorithm, the Grey Wolf Optimiser (GWO) algorithm and the Genetic Algorithm (GA). The results show that the proposed ssSKF algorithm is a promising approach that is able to significantly outperform the GWO and GA algorithms.

Keywords. Single-solution; adaptive neighbourhood; SKF; Kalman; optimisation; metaheuristics.

1. Introduction

Heuristic optimisation methods are becoming more relevant in today's world. The complexity of many real-world optimisation problems has turned scientists and engineers towards heuristic methods, where optimality is traded off for near-optimal solutions that can be achieved within a reasonable computational time. Metaheuristic optimisation algorithms characterise a group of general-purpose heuristic optimisation methods that are governed by a higher-level strategy that guides the search [1]. Derivation of mathematical models of the optimisation problems is not required when using metaheuristic methods, as the problems are treated like black boxes [2]. Genetic Algorithm (GA) [3] and Particle Swarm Optimisation (PSO) [4] are some well-known examples of metaheuristic algorithms. The search for the best global optimisation algorithm still continues despite the introduction of the "No Free Lunch (NFL) theorems for optimisation" by Wolpert and Macready in 1997 [5]. The NFL theorems suggest that, on average, all optimisation algorithms perform equally when all types of optimisation problems are taken into consideration. However, instead of hampering the field, these NFL theorems have inspired researchers to keep improving and proposing new optimisation algorithms in search of the best optimisation algorithm that works for most problems, even if not for all. Thus, several new global optimisation algorithms, mostly nature-inspired, have been developed over the last few decades [6–8]. The Black Hole (BH) [9], Grey Wolf Optimiser (GWO) [10] and Simulated Kalman Filter (SKF) [11] algorithms are a few examples of recently proposed metaheuristics.

Based on the number of agents used, a metaheuristic algorithm can be classified as either single-solution-based or population-based. Single-solution-based metaheuristics make use of only a single agent, which is improved from one iteration to the next. Simulated Annealing (SA) [12], Tabu Search (TS) [13] and Variable Neighbourhood Search (VNS) [14] are examples of algorithms in this category. Population-based metaheuristics adopt a number of agents to explore the search space in order to solve an optimisation problem. Besides GA, PSO, BH, GWO and SKF, examples of population-based metaheuristics include Ant Colony Optimisation (ACO) [15], the Firefly Algorithm (FA) [16] and Cuckoo Search (CS) [17]. Population-based algorithms are said to perform better because they employ a number of agents (normally many) that share information about the search space to avoid stagnation in local optima [18]. Owing to this strength, many population-based algorithms are employed to solve challenging optimisation problems.

Previously, the original SKF algorithm was introduced as a population-based metaheuristic algorithm and has been used to solve various types of benchmark optimisation problems [19]. SKF makes use of a population of agents that operate within a standard Kalman Filter framework to solve optimisation problems. Each agent in SKF estimates the optimum based on a simulated measurement process that is guided by the best-so-far solution. The Kalman Filter, named after its founder, is a renowned state estimation algorithm based on the minimum mean square error method [20]. The Kalman Filter is considered an optimal estimator for a linear system, especially when the noises are Gaussian in nature. While multiple sequential measurements are normally required to produce a good estimate of a system's state, the Kalman Filter needs only the last estimated state and a new measurement to produce a better estimate. The capability of the Kalman Filter to make good estimates, supported by information sharing between agents, makes SKF a good global optimiser and a competitive algorithm compared with existing metaheuristic algorithms.
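To make this recursion concrete, the following minimal sketch (written in Python, not taken from the paper) shows one predict-then-update cycle of a textbook one-dimensional Kalman Filter under an assumed identity state-transition model. It illustrates only the underlying estimator that inspired SKF, not the SKF or ssSKF algorithms themselves; all function names and noise values here are illustrative.

import numpy as np

def kalman_step(x_prev, p_prev, z, q, r):
    """One predict-then-update cycle of a one-dimensional Kalman Filter.

    x_prev : previous state estimate
    p_prev : previous estimation-error covariance
    z      : new measurement
    q, r   : process-noise and measurement-noise variances
    """
    # Prediction step (identity state-transition model assumed)
    x_pred = x_prev
    p_pred = p_prev + q

    # Measurement-update step
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # corrected estimate
    p_new = (1.0 - k) * p_pred         # updated error covariance
    return x_new, p_new

# Example: estimate a constant true value of 5.0 from noisy measurements.
rng = np.random.default_rng(0)
x_est, p_est = 0.0, 1.0
for _ in range(50):
    z = 5.0 + rng.normal(0.0, 0.5)     # simulated noisy measurement
    x_est, p_est = kalman_step(x_est, p_est, z, q=0.01, r=0.25)
print(x_est)                           # approaches 5.0

At every iteration only the previous estimate, its error covariance and the newest measurement are needed; the Kalman gain weighs the predicted uncertainty against the measurement noise to decide how strongly the estimate is corrected.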
Since its introduction, SKF has undergone various adaptations and has found many applications. These include extensions of the SKF algorithm to deal with combinatorial optimisation problems [21–23], and hybridisations of the SKF algorithm with the PSO algorithm [24] and the GSA algorithm [25]. The population-based SKF algorithm has been applied to find the optimal path in a 14-hole drill path optimisation problem [26] and to Airport Gate Allocation Problems (AGAP) [27]. The discrete variant of the SKF algorithm has been used to solve the AGAP [28] and the feature selection problem for EEG peak detection [29].

Despite its good performance, SKF is not a parameter-less algorithm. In the [...] algorithms that have many setting parameters. Parameters in GA include the probability of mutation, the probability of crossover and the selection procedure. PSO, on the other hand, despite being easy to understand, has three parameters to be tuned. Some classical algorithms, such as TS and SA, have at least one or two parameters that require tuning. Using such algorithms requires some preliminary tuning of the parameters before they can be applied to solve an optimisation problem. One alternative is to offer default values for the parameters. Covariance Matrix Adaptation Evolution Strategy (CMA-ES) [30] is an example of an algorithm that offers default parameter values to users; these values are claimed to be applicable to any optimisation problem at hand. Self-tuning parameters, as introduced in Differential Evolution (DE) [31], are another alternative. Ultimately, parameter-free algorithms such as BH and Symbiotic Organisms Search (SOS) [32] are desirable.

Further investigation of the Kalman Filter's estimation capability as a source of inspiration for metaheuristic optimisation algorithms suggests that such an algorithm can be realised using only a single Kalman Filter estimator. Thus, in this paper, a single-agent version of the SKF algorithm is proposed to solve single-objective, real-parameter optimisation problems. The proposed algorithm, named the single-solution Simulated Kalman Filter (ssSKF), requires only one agent, which iteratively improves its estimate according to the standard Kalman Filter framework with the help of an adaptive neighbourhood method during its prediction step. The problem of parameter tuning is reduced by drawing normally distributed random numbers for all three parameters P(0), Q and R whenever they are needed [33, 34]. The normally distributed random numbers are scaled and shifted so that they lie in the range [0, 1], defined by N(μ, σ²) = N(0.5, 0.1). However, a new parameter α is introduced, for which a fixed value is suggested.
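As a rough illustration of this parameter handling, the sketch below (Python, not from the paper) draws a fresh value from a normal distribution centred at 0.5 each time one of P(0), Q or R is needed and keeps it inside [0, 1]. The exact scale-and-shift rule, and whether 0.1 denotes the variance or the standard deviation, are not specified in this excerpt, so the clipping step and the sigma value used here are assumptions made purely for illustration.

import numpy as np

rng = np.random.default_rng()

def sample_parameter(mu=0.5, sigma=0.1):
    # Draw one value from a normal distribution and confine it to [0, 1].
    # Clipping is an assumed way of keeping out-of-range draws in [0, 1].
    return float(np.clip(rng.normal(mu, sigma), 0.0, 1.0))

# Fresh values for P(0), Q and R whenever the algorithm needs them.
p0 = sample_parameter()
q = sample_parameter()
r = sample_parameter()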
In this study, the ssSKF algorithm is tested using all 30 benchmark functions of the CEC 2014 benchmark suite [35], and is compared against some existing metaheuristic optimisation algorithms, including the state-of-the-art PSO and GA.

The remaining part of the paper is organised as follows. Section 2 gives a brief description of the Kalman Filter framework. In section 3, a detailed description of the proposed single-solution-based SKF algorithm is given. Section 4 describes the experimental set-up. Next, the ...