Second Order Neural Networks. Sanjoy Das, Louisiana State University and Agricultural & Mechanical College
Louisiana State University, LSU Digital Commons
LSU Historical Dissertations and Theses, Graduate School, 1994

Recommended Citation: Das, Sanjoy, "Second Order Neural Networks." (1994). LSU Historical Dissertations and Theses. 5721. https://digitalcommons.lsu.edu/gradschool_disstheses/5721

SECOND ORDER NEURAL NETWORKS

A Dissertation

Submitted to the Graduate Faculty of the Louisiana State University and Agricultural and Mechanical College in partial fulfillment of the requirements of the degree of Doctor of Philosophy in The Department of Electrical and Computer Engineering

by
Sanjoy Das
B.S., Sambalpur University, 1985
May 1994

Acknowledgements

I am grateful to Prof. S. C. Kak for his expert guidance and for the support and assistance that he has provided me for the past five years. I am also thankful to my family, especially my wife, Indira, for her emotional support over the past two years, my talkative little niece, Soma, for cheering me over the telephone in times of need, and to all my friends, especially Amit, Sundar, Bala and Praveen, for all their help.

Table of Contents

Acknowledgements
Abstract
1. Introduction
   1.1 The Biological Basis of Neural Networks
   1.2 Artificial Neural Networks
2. First Order Neural Networks
   2.1 The Gradient Descent Method
   2.2 The Hopfield Neural Network
      2.2.1 Dynamics of the Hopfield Neural Network
      2.2.2 Energy of the Hopfield Neural Network
   2.3 Other Feedback Neural Networks
      2.3.1 Stochastic Feedback Neural Networks
      2.3.2 Lagrangian Optimization Neural Networks
      2.3.3 Locally Connected Feedback Neural Networks
3. The Polynomial Neural Network
   3.1 Attractors of Dynamical Systems
   3.2 Computing the Roots of a Function
      3.2.1 The Newton-Raphson Method for One Variable
      3.2.2 The Multidimensional Newton-Raphson Method
   3.3 The Polynomial Neural Network
      3.3.1 The Dynamics of the Polynomial Neural Network
      3.3.2 Computation of the Weight Matrix
      3.3.3 Construction of the Memory Function
      3.3.4 Performance of the Polynomial Neural Network
4. Second Order Neural Network for Function Optimization
   4.1 Locating Extremum Points of a Function
      4.1.1 The Newton-Raphson Method
   4.2 Neural Networks for Optimization
      4.2.1 The Basic Model
      4.2.2 The Simplified Model
5. The Grid Connection Problem
   5.1 The Grid Connection Problem
   5.2 A Neural Network Solution to GCP
   5.3 Results
6. Conclusions
   6.1 Discussions
   6.2 Suggestions for Further Research
Bibliography
Vita

Abstract

In this dissertation, a feedback neural network model has been proposed. This network uses a second order method of convergence based on the Newton-Raphson method. This neural network has both discrete and continuous versions. When used as an associative memory, the proposed model has been called the polynomial neural network (PNN). The memories of this network can be located anywhere in an n-dimensional space rather than being confined to the corners of the latter. A method for storing memories has been proposed. This is a single-step method, unlike the currently known computationally intensive iterative methods. An energy function for the polynomial neural network has been suggested. Issues relating to the error-correcting ability of this network have been addressed. Additionally, it has been found that the attractor basins of the memories of this network reveal a curious fractal topology, thereby suggesting a highly complex and often unpredictable nature. The use of the second order neural network as a function optimizer has also been shown. While issues relating to the hardware realization of this network have only been addressed briefly, it has been indicated that such a network would require a large amount of hardware for its realization. This problem can be obviated by using a simplified model, which has also been described. The performance of this simplified model is comparable to that of the basic model while requiring much less hardware.

Chapter 1
Introduction

1.1 Biological Basis of Neural Networks

The brain is one of the most fascinating structures in nature. It is the seat of memory, emotion and learning. It is also the organ from which all intelligent behavior emerges. The brain is a vastly complicated structure consisting of numerous cells called neurons. Investigations have shown that these neurons are essentially two-state cells that can be either firing or quiescent. The neurons are connected to one another by means of dendrites that emerge from each of these neurons. The neuronal connections themselves are referred to as synapses, and it is by means of these synapses that neurons exert excitatory or inhibitory influences on one another. The number of neurons in the human brain is estimated to be of the order of 10^10, with the number of neuron
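[Editor's sketch] The abstract describes an update rule built on a second order, Newton-Raphson-based method of convergence, and Chapter 3 of the table of contents covers the multidimensional Newton-Raphson method. As a point of reference only, the following is a minimal sketch of the generic multidimensional Newton-Raphson iteration x <- x - J(x)^-1 f(x); the function names, tolerance, and the cubic example are illustrative assumptions and are not the dissertation's own network equations or notation.

    import numpy as np

    def newton_raphson(f, jac, x0, tol=1e-10, max_iter=50):
        """Generic multidimensional Newton-Raphson iteration.

        f   : callable mapping an n-vector to an n-vector whose root is sought
        jac : callable returning the n x n Jacobian matrix of f at a point
        x0  : initial guess (n-vector)
        """
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            fx = f(x)
            if np.linalg.norm(fx) < tol:
                break
            # Solve J(x) dx = f(x) rather than forming the matrix inverse.
            dx = np.linalg.solve(jac(x), fx)
            x = x - dx
        return x

    # Illustrative example (not taken from the dissertation): the componentwise
    # cubic g(x) = x^3 - x has roots at -1, 0 and +1, which act as attractors.
    g = lambda x: x**3 - x
    J = lambda x: np.diag(3.0 * x**2 - 1.0)
    print(newton_raphson(g, J, [0.9, -1.2, 0.3]))  # approaches [ 1. -1.  0.]

Each root of the chosen function behaves as an attractor of this iteration, and the basin boundaries of Newton's method for polynomials are well known to be fractal (the classic example being z^3 - 1 in the complex plane), which echoes the fractal attractor basins mentioned in the abstract.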