
VARIABLE LEARNING RATE BASED MODIFICATION IN BACKPROPAGATION ALGORITHM (MBPA) OF ARTIFICIAL NEURAL NETWORK FOR DATA CLASSIFICATION

Qamar Abbas1, Qamar Abbas2, Farooq Ahmad3 and Muhammad Imran4
1,2 Computer Department, Iqra University, Islamabad, 44000, Pakistan
3 Mathematics Department, Govt. Islamia College Civil Lines, Lahore, Pakistan
3 Presently: Mathematics Department, College of Science, Majmaah University, Azzulfi, KSA
4 Computer Department, Szabist University, Islamabad, 44000, Pakistan
[email protected], [email protected], [email protected], [email protected]
Corresponding author: Dr. Qamar Abbas: [email protected]

ABSTRACT: The learning rate is an important parameter of the Backpropagation algorithm (BPA) used to train feedforward artificial neural networks, and it has a great impact on the training of the Backpropagation neural network (BPNN). This research work introduces some novel variable learning rate schemes for the BP algorithm. The performance of these new variations is tested on four commonly used, standard benchmark datasets taken from the UCI machine learning repository. The algorithm focuses on the classification of non-linear datasets. Most researchers have used a constant value of the learning rate, yet a change in the learning rate changes the training behaviour of the BPNN algorithm. The learning rate schemes of different characteristics introduced in this research work will prove to be a valuable addition to the BP algorithm; the proposed schemes are helpful in improving the convergence speed and testing accuracy of the BPNN algorithm.

Keywords: Backpropagation, Learning Rate, Linear Increasing, Linear Decreasing, Chaotic, Oscillating, Random

1. INTRODUCTION
Artificial neural networks (ANN) consist of parallel processing units that can store experimental knowledge and make it available for use. ANN is a new form of computing, inspired by biological models; it is a branch of artificial intelligence in which the architecture is modelled as software that works in a way similar to the human brain [1]. An ANN is made up of interconnected artificial neurons, normally called nodes. ANN can be used to solve problems that require intelligence, or to gain an understanding of the biological nervous system without necessarily creating a model of a real biological system [2]. The working of an artificial network follows the working of the human neural system [3]. Figure 1 shows the structure of the human nervous system, which consists of neurons.

Figure 1. Biological model of a neuron [3]

The dendrites act as the input units that receive external signals for the neuron, and the axon acts as the output unit. The soma (cell body) sums the incoming signals. Signals sent by other neurons are received by the dendrites and are transmitted across the synaptic gap by means of a chemical process; the incoming signals are modified by the action of the chemical transmitter. This process is similar to the weight-updating process in an ANN. When sufficient input is received, the signal is transmitted to other cells over the axon; this process is known as firing, and it is often supposed that a cell either fires or does not. Axons are the data transmission paths, and a similar summation-and-firing mechanism is applied in neural networks [3].

The Backpropagation algorithm is one of the well-known algorithms of artificial neural networks. The pattern learning of an ANN is carried out by the weights used in its architecture, and the main issues with the backpropagation algorithm are its slow convergence speed and the local optima problem [4,5,6,7,8]. The update of these weights is important; it involves the old value of the weight, the current pattern, the network error and the learning rate. The old value of the weight, the current pattern and the network error have fixed values, while the value of the learning rate is user specified. Most researchers use a constant value of the learning rate, which may be of little value for a variety of pattern recognition applications. A variable learning rate is helpful in escaping from the local optima problem and in improving the convergence speed of the BP algorithm.
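As a point of reference for the update just described, the following is a minimal sketch of the standard delta-rule weight change for a single connection, combining exactly those four quantities: the old weight, the current input pattern, the network error at the receiving unit, and the learning rate. The names eta, delta and x are illustrative and not taken from the paper.

```python
def delta_rule_update(w_old, eta, delta, x):
    """Standard BP weight update for one connection:
    new weight = old weight + learning rate * error term * input activation.
    eta is the (possibly variable) learning rate, delta the error information
    term of the unit the connection feeds into, and x the activation carried
    by the connection for the current training pattern."""
    return w_old + eta * delta * x

# Example: with eta = 0.5, delta = 0.2 and x = 1.0 the weight grows by 0.1.
w_new = delta_rule_update(w_old=0.3, eta=0.5, delta=0.2, x=1.0)  # -> 0.4
```

Everything in this update is determined by the data and the current state of the network except eta, which is the quantity this research work varies.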
The learning rate controls the step size of the BPNN algorithm [1]. A small step size results in slow convergence, or it can leave the learning process stuck in the local optima problem, while a large value of the learning rate may skip over the optimum value and result in poor learning or overtraining of the BPNN algorithm [8]. The main objective of this research work is to introduce variable learning rate schemes that use a combination, or some ordering, of small values, large values and values in between. The combination of various learning rate values will prove helpful in increasing the convergence speed and the learning accuracy of the BPNN algorithm.

This research work introduces five variable learning rate schemes in the BPNN algorithm. These variable learning rates take their values from a specific interval of the learning rate. The schemes include linear increasing, linear decreasing, chaotic, oscillating and random learning rates.
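The exact formulas for the five schemes are not given in this part of the paper. The following is a minimal sketch, under assumed choices (an interval [eta_min, eta_max], a total epoch count T, a logistic-map recurrence for the chaotic scheme and a sinusoidal form for the oscillating scheme), of how such iteration-dependent learning rates could be generated; all function names and constants are illustrative rather than the paper's own.

```python
import math
import random

# Illustrative generators for a variable learning rate eta(t), where t is the
# current epoch, T the total number of epochs, and [eta_min, eta_max] an
# assumed interval of permitted learning rate values.

def lr_linear_increasing(t, T, eta_min=0.1, eta_max=0.9):
    # Grows linearly from eta_min at t = 0 to eta_max at t = T.
    return eta_min + (eta_max - eta_min) * t / T

def lr_linear_decreasing(t, T, eta_min=0.1, eta_max=0.9):
    # Shrinks linearly from eta_max at t = 0 to eta_min at t = T.
    return eta_max - (eta_max - eta_min) * t / T

def lr_chaotic(eta_prev, eta_min=0.1, eta_max=0.9):
    # Logistic map z -> 4z(1 - z) rescaled to [eta_min, eta_max]; the seed
    # should avoid values such as 0, 0.25, 0.5, 0.75 and 1, which collapse
    # to fixed points of the map.
    z = (eta_prev - eta_min) / (eta_max - eta_min)
    z = 4.0 * z * (1.0 - z)
    return eta_min + (eta_max - eta_min) * z

def lr_oscillating(t, T, eta_min=0.1, eta_max=0.9, cycles=5):
    # Swings back and forth between the small and the large value.
    mid = (eta_max + eta_min) / 2.0
    amp = (eta_max - eta_min) / 2.0
    return mid + amp * math.sin(2.0 * math.pi * cycles * t / T)

def lr_random(eta_min=0.1, eta_max=0.9):
    # Draws a fresh value uniformly from the interval at every epoch.
    return random.uniform(eta_min, eta_max)
```

Whichever scheme is selected, its value simply replaces the constant learning rate in the delta-rule update sketched earlier, so small, large and intermediate step sizes all occur during a single training run.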
2. Literature Review of Back Propagation Algorithm
The Backpropagation algorithm is one of the most commonly used algorithms of artificial neural networks [10,11,12,13], and it is applied to solve a variety of problems [14,15,16]. The algorithm used in this research is the Backpropagation algorithm, which is considered to be the most suitable and efficient algorithm for multilayer architectures. When an unusual training pair is presented, a small learning rate is required so that any major disruption can be avoided [17,18].

Ojha et al. have applied the BPNN algorithm for detecting the proportion of different gas components present in a manhole gas mixture [19]. They used a sensor array made up of semiconductors to detect the gas components and achieved an SSE below 0.005 in 450 iterations. Dai and Liu have introduced the concept of competitive learning into the BPNN algorithm for data classification [9]. They used the concept of a bucket of weight matrices in their research work; the weight matrix buckets are then used in a competition to select the optimal set of weights, although the extra memory used by the bucket costs more resources. The BPNN algorithm is used by Nagathan, Manimozhi and Mungara [20] for a content-based image retrieval (CBIR) system, which fetches similar types of images from any database of images. Image features are given as input to train the neural network for this application, and the results show 88% precision and 78% recall for the image retrieval system over various categories of images such as food, buildings, beaches, elephants and buses. Khan, Alin and Hussain [21] have used the BPNN algorithm for price prediction of the share market, employing BPNN to forecast the stock market; although the prediction results of BPNN are close to the actual results, only a few datasets are used in the experimentation. A three-term backpropagation algorithm is proposed by Zweiri et al. [4], who use a learning rate, a momentum term and a proportional factor, the proportional factor being added to improve the performance of the BP algorithm.

Goyal and Goyal have introduced a cascade BPNN algorithm [22] for a shelf-life detection application for processed cheese. They conducted their research by observing the mean square error and other assessment parameters of the BPNN algorithm for this application, using the actual sensory score (ASS) and the BPNN predicted sensory score (PSS) in their research work. Rubio, Angelov and Pacheco [23] have used a uniform backpropagation algorithm in their research work, based on a uniform stability theorem for discrete-time systems; the proposed variation is helpful for online identification and small-zone convergence. Sapna, Tamilarasi and Kumar have introduced a Backpropagation learning algorithm based on the Levenberg-Marquardt algorithm [24]; they used BPNN for predicting diabetes from data collected from experts and patients. Borucki, Boczar and Cichoń [25] have applied the resilient Backpropagation algorithm to signal recognition; their results show that the adopted neuron classifier recognises the signals at a level exceeding 90%. Reynaldi, Lukas and Margaretha [26] have introduced a finite-element-based neural network and used BPNN for differential equations and the inverse problem of differential equations; the BPNN algorithm successfully performs the inverse matrix calculation needed to solve both the differential equation and the inverse differential problem. Audhkhasi, Osoba and Kosko [27] have introduced the concept of noise into the BP algorithm to improve convergence speed; they added the noise to the training data of the BP algorithm and used the MNIST digit classification application to assess its performance. Koscak, Jaksa and Sincak [28] have used the BPNN algorithm for prediction of the daily temperature profile, with a stochastic weight update mechanism that uses the memory of the previously stored step.

Step 6: In this step each output unit receives a target pattern corresponding to the input training pattern and then computes its error information term. The formula is

\delta_k = (t_k - y_k) \, f'(y_{in_k}) ..........(3)
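To make equation (3) concrete, the snippet below computes the error information terms of the output layer for one training pattern. The choice of the binary sigmoid f(z) = 1/(1 + e^(-z)) as the activation function, and all variable names, are assumptions for illustration; this excerpt does not specify the activation used.

```python
import numpy as np

def sigmoid(z):
    # Binary sigmoid activation f(z) = 1 / (1 + exp(-z)).
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative f'(z) = f(z) * (1 - f(z)).
    s = sigmoid(z)
    return s * (1.0 - s)

def output_error_terms(t, y_in):
    """Equation (3): delta_k = (t_k - y_k) * f'(y_in_k), where y_in_k is the
    net input to output unit k, y_k = f(y_in_k) its actual output and t_k the
    target value from the training pattern."""
    y = sigmoid(y_in)
    return (t - y) * sigmoid_prime(y_in)

# Example: targets and output-layer net inputs for one training pattern.
t = np.array([1.0, 0.0])
y_in = np.array([0.3, -0.2])
delta = output_error_terms(t, y_in)  # later used to form the weight corrections
```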