Neural Networks - Module 1
Hari C.V., Assistant Professor, Department of Applied Electronics & Instrumentation Engineering, Rajagiri School of Engineering & Technology, Kakkanad, Kochi
February 15, 2016

Fundamentals of Neural Networks [1]

Neural Networks (NN) are simplified models of the biological nervous system and therefore draw their motivation from the kind of computing performed by the human brain. They are also called Artificial Neural Systems (ANS), Artificial Neural Networks (ANN), or simply Neural Networks (NN): simplified imitations of the central nervous system. An NN, in general, is a highly interconnected network of a large number of processing elements called neurons, in an architecture inspired by the brain.

In machine learning and cognitive science, artificial neural networks (ANNs) are a family of models inspired by biological neural networks (the central nervous systems of animals, in particular the brain) and are used to estimate or approximate functions that can depend on a large number of inputs and are generally unknown [2].

NNs are massively parallel and are therefore said to exhibit parallel distributed processing. They learn by example: the network is trained with known examples of a problem to 'acquire' knowledge. There are two types of learning:
- Supervised Learning
- Unsupervised Learning

Supervised Learning
A 'teacher' is assumed to be present during the learning process, i.e. the network aims to minimize the error between the target (desired) output presented by the 'teacher' and the computed output, to achieve better performance.

Unsupervised Learning
There is no 'teacher' present to hand over the desired output; the network therefore tries to learn by itself, organizing the input instances of the problem.

Classification
- Single Layer Feedforward Networks
- Multi Layer Feedforward Networks
- Recurrent Networks
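The supervised case above can be sketched as an error-correction loop: a single linear neuron nudges its weights to shrink the gap between the 'teacher's' target and its own computed output. This is a minimal illustration only (a delta-rule update); the learning rate, data, and function names below are invented for the sketch and are not from the slides.

```python
# Minimal sketch of supervised learning: a 'teacher' supplies target
# outputs, and the weights are adjusted to reduce the error between
# target and computed output (delta rule). Values are illustrative.

def train_supervised(samples, targets, lr=0.1, epochs=50):
    w = [0.0] * len(samples[0])
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            y = sum(wi * xi for wi, xi in zip(w, x))   # computed output
            err = t - y                                # teacher's correction
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w

# Learn the mapping y = 2*x1 + 1*x2 from four labelled examples
X = [[1, 0], [0, 1], [1, 1], [2, 1]]
T = [2, 1, 3, 5]
w = train_supervised(X, T)
print(w)  # approaches [2.0, 1.0]
```

In the unsupervised case there would be no `T`: the network would have to organize the inputs `X` by itself, e.g. by grouping similar instances.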
Well Known NN Systems
- Backpropagation Network
- Perceptron
- ADALINE (Adaptive Linear Element)
- Associative Memory
- Boltzmann Machine
- Adaptive Resonance Theory
- Self Organizing Map
- Hopfield Network

NN Applications
- Pattern Recognition
- Image Processing
- Data Compression
- Forecasting
- Optimization
... etc.

Human Brain
The human brain is the command center for the human nervous system. It receives input from the sensory organs and sends output to the muscles [3].

Figure 1: Human Brain Anatomy [4].

Four Different Regions or Lobes
- Temporal Lobe
- Frontal Lobe
- Parietal Lobe
- Occipital Lobe
The Sylvian fissure divides the frontal lobe from the temporal lobe. The central sulcus divides the frontal lobe from the parietal lobe.

Different Functions [5]
The frontal lobe concerns itself with the primary tasks of future planning and control of movement. The temporal lobe deals with hearing and is also particularly important for long-term memory storage. The parietal lobe deals with body sensations. The occipital lobe lies at the back of the brain and houses the visual cortex, which is concerned with the reception and interpretation of vision.

Figure 2: Human Brain Cross Section [3].

Different Functions [5]
Not observable externally, but sitting deep within the cerebrum, are many parts such as the pons and the medulla oblongata, which constitute the brain stem. The medulla controls vital automatic functions such as breathing, heart rate and digestion, and the pons is responsible for conveying information about movement from the cerebrum to the cerebellum. The thalamus is a switching center which processes information from the central nervous system before transmitting it to the cerebral cortex, and the hypothalamus regulates the endocrine system and autonomic function.
The corpus callosum is a bundle of fibers that go from one hemisphere to the other, thereby permitting inter-hemisphere communication.

The brain contains about $10^{11}$ basic units called neurons. Each neuron, in turn, is connected to about $10^4$ other neurons. A neuron is a small cell that receives electro-chemical signals from its various sources and in turn responds by transmitting electrical impulses to other neurons. An average brain weighs about 1.5 kg, and an average neuron weighs about $1.5 \times 10^{-9}$ g. Some of the neurons perform input and output operations (referred to as afferent and efferent cells respectively); the remaining form part of an interconnected network of neurons responsible for signal transformation and storage of information.

Structure of a Neuron
A neuron (also known as a neurone or nerve cell) is an electrically excitable cell that processes and transmits information through electrical and chemical signals [6].

Figure 3: Structure of a Neuron [7].

A neuron is composed of a nucleus within a cell body known as the soma. Attached to the soma are long, irregularly shaped filaments called dendrites. The dendrites behave as input channels: all inputs from other neurons arrive through the dendrites. Dendrites look like the branches of a tree during winter. Another type of link attached to the soma is the axon. The axon is electrically active and serves as an output channel.

Axons, which mainly appear on output cells, are non-linear threshold devices which produce a voltage pulse called an Action Potential or Spike that lasts for about a millisecond. If the cumulative inputs received by the soma raise the internal electric potential of the cell, known as the Membrane Potential, above a threshold, then the neuron 'fires' by propagating the action potential down the axon to excite or inhibit other neurons.
The axon terminates in a specialised contact called a synapse or synaptic junction that connects the axon with the dendrites of another neuron.

The synaptic junction, which is a very minute gap at the end of the dendritic link, contains neurotransmitter fluid. This fluid is responsible for accelerating or retarding the electric charges reaching the soma. Each dendritic link can have many synapses acting on it, thus bringing about massive interconnectivity. In general, a single neuron can have many synaptic inputs and synaptic outputs.

Model of an Artificial Neuron
An artificial neuron is a mathematical function conceived as a model of biological neurons. Artificial neurons are the constitutive units in an artificial neural network [8].

Figure 4: Simple Model of an Artificial Neuron.

$x_1, x_2, \ldots, x_n$ are the $n$ inputs to the artificial neuron. $w_1, w_2, \ldots, w_n$ are the weights attached to the input links.

A biological neuron receives all inputs through the dendrites, sums them, and produces an output if the sum is greater than a threshold value. The input signals are passed on to the cell body through the synapse, which may accelerate or retard an arriving signal. This acceleration or retardation of the input signals is modelled as weights. An effective synapse, which transmits a stronger signal, will have a correspondingly larger weight, while a weak synapse will have a smaller weight. Thus, the weights are multiplicative factors of the inputs that account for the strength of the synapse.

The total input $I$ received by the soma of the artificial neuron is

$$I = w_1 x_1 + w_2 x_2 + \cdots + w_n x_n = \sum_{i=1}^{n} w_i x_i \tag{1}$$

To generate the final output $y$, the sum is passed on to a non-linear filter $\phi$, called the Activation Function (or Transfer Function, or Squash Function), which releases the output.
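The weighted sum of equation (1), followed by a threshold test, can be sketched in a few lines (a minimal illustration; the inputs, weights, and threshold value below are made up):

```python
# Sketch of an artificial neuron: total input I = w1*x1 + ... + wn*xn
# (equation (1)), then the neuron fires if I exceeds a threshold theta.
# All numeric values here are illustrative only.

def total_input(x, w):
    # weighted sum of the inputs, modelling synaptic strengths
    return sum(wi * xi for wi, xi in zip(w, x))

def fire(x, w, theta):
    # output 1 if the total input exceeds the threshold, else 0
    return 1 if total_input(x, w) > theta else 0

x = [1.0, 2.0, 3.0]     # inputs x1..x3
w = [0.5, -1.0, 1.0]    # weights w1..w3
print(total_input(x, w))        # 1.5
print(fire(x, w, theta=1.0))    # 1
```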
$$y = \phi(I) \tag{2}$$

Activation Functions
- Thresholding Function
- Signum Function
- Sigmoidal Function
- Hyperbolic Tangent Function

Thresholding Function
A commonly used one. The sum is compared with a threshold value $\theta$, and if $I$ is greater than $\theta$, then the output is 1, else it is 0.

$$y = \phi\left(\sum_{i=1}^{n} w_i x_i - \theta\right) \tag{3}$$

where $\phi$ is the step function, known as the Heaviside function, such that

$$\phi(I) = \begin{cases} 1, & I > 0 \\ 0, & I \leq 0 \end{cases} \tag{4}$$

Figure 5: Thresholding Function.

The output signal is either 1 or 0, resulting in the neuron being on or off.

Signum Function

Figure 6: Signum Function.

Also known as the Quantizer function:

$$\phi(I) = \begin{cases} +1, & I > \theta \\ -1, & I \leq \theta \end{cases} \tag{5}$$

Sigmoidal Function

Figure 7: Sigmoidal Function.

The sigmoidal function is a continuous function that varies gradually between the asymptotic values 0 and 1, or $-1$ and $+1$, and is given by

$$\phi(I) = \frac{1}{1 + e^{-\alpha I}} \tag{6}$$

where $\alpha$ is the slope parameter, which adjusts the abruptness of the function as it changes between the two asymptotic values. Sigmoidal functions are differentiable, which is an important feature of NN theory.

Hyperbolic Tangent Function
The function is given by

$$\phi(I) = \tanh(I) \tag{7}$$

and can produce negative output values.

Neural Network Architectures
An Artificial Neural Network (ANN) is defined as a data processing system consisting of a large number of simple, highly interconnected processing elements (artificial neurons) in an architecture inspired by the structure of the cerebral cortex of the brain. An ANN structure can be represented using a directed graph. A graph G is an ordered 2-tuple (V, E) consisting of a set V of vertices and a set E of edges.
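The four activation functions listed above can be sketched side by side (a minimal illustration; the choices of $\theta = 0$ and $\alpha = 1$ below are illustrative defaults, not prescribed by the slides):

```python
import math

# Sketches of the four activation functions, equations (4)-(7).
# theta and alpha values are illustrative.

def threshold(I):                 # Heaviside step, eq. (4): output 1 or 0
    return 1 if I > 0 else 0

def signum(I, theta=0.0):         # quantizer, eq. (5): output +1 or -1
    return 1 if I > theta else -1

def sigmoid(I, alpha=1.0):        # eq. (6): smooth, differentiable, in (0, 1)
    return 1.0 / (1.0 + math.exp(-alpha * I))

def tanh_act(I):                  # eq. (7): like sigmoid but in (-1, 1)
    return math.tanh(I)

for I in (-2.0, 0.0, 2.0):
    print(I, threshold(I), signum(I),
          round(sigmoid(I), 3), round(tanh_act(I), 3))
```

Note how the differentiable pair (sigmoid, tanh) gives graded outputs between its asymptotes, while the hard pair (threshold, signum) switches abruptly at the threshold; differentiability is what later makes gradient-based training possible.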