Iowa State University Capstones, Theses and Dissertations
Retrospective Theses and Dissertations

1996

Adaptive learning methods and their use in flaw classification

Sriram Chavali
Iowa State University

Follow this and additional works at: https://lib.dr.iastate.edu/rtd
Part of the Structures and Materials Commons

Recommended Citation
Chavali, Sriram, "Adaptive learning methods and their use in flaw classification" (1996). Retrospective Theses and Dissertations. 111. https://lib.dr.iastate.edu/rtd/111

This Thesis is brought to you for free and open access by the Iowa State University Capstones, Theses and Dissertations at Iowa State University Digital Repository. It has been accepted for inclusion in Retrospective Theses and Dissertations by an authorized administrator of Iowa State University Digital Repository. For more information, please contact [email protected].

Adaptive learning methods and their use in flaw classification

by

Sriram Chavali

A thesis submitted to the graduate faculty in partial fulfillment of the requirements for the degree of

MASTER OF SCIENCE

Department: Aerospace Engineering and Engineering Mechanics
Major: Aerospace Engineering

Major Professor: Dr. Lester W. Schmerr

Iowa State University
Ames, Iowa
1996

Copyright © Sriram Chavali, 1996. All rights reserved.

Graduate College
Iowa State University

This is to certify that the Master's thesis of Sriram Chavali has met the thesis requirements of Iowa State University.

TABLE OF CONTENTS

ACKNOWLEDGMENTS  viii
ABSTRACT  ix
1 INTRODUCTION  1
1.1 NDE Flaw Classification  1
1.2 Background  3
1.3 Scope of the Thesis  4
2 CLUSTERING  5
2.1 Introduction  5
2.2 Similarity Measures  6
2.3 Hierarchical Clustering  8
2.3.1 Agglomerative hierarchical clustering  9
2.3.2 Implementation of clustering in CLASS  12
2.4 Results  13
3 BACKPROPAGATION NEURAL NETWORKS  17
3.1 Introduction  17
3.2 Overview of Neural Networks  17
3.3 Historical Perspective  18
3.4 Single Layer Perceptron  19
3.5 Multi-Layer Perceptron  23
3.6 Determining a Neuron's Output  25
3.7 Learning  29
3.7.1 Adjusting the weights of the output layer  29
3.7.2 Adjusting the weights of the hidden layers  31
3.8 Momentum  33
3.9 Batching  33
3.10 Advantages and Disadvantages of Backpropagation Learning  35
3.11 Implementation of the backpropagation neural network algorithm in CLASS  35
3.12 Results  37
3.12.1 The parity problem  38
4 FEATURE DOMAINS  43
4.1 Sampling Discrete Time Signals  43
4.2 Frequency Domain  44
4.2.1 Fourier series
4.2.2 The discrete Fourier transform  46
4.2.3 Computation of the discrete Fourier transform  48
4.2.4 The Cooley-Tukey FFT algorithm  49
4.3 Cepstrum Analysis
4.4 Implementation of Frequency Domain and Cepstral Domain in CLASS  54
5 CONCLUSIONS AND FUTURE WORK
APPENDIX  User Manual for CLASS  57
BIBLIOGRAPHY  62

LIST OF TABLES

Table 2.1  Clustering algorithm results indicating the number of correctly classified samples  15
Table 3.1  XOR decision table  23
Table 3.2  Backpropagation neural network training on XOR with α = 0.3 and η = 0.7  37
Table 3.3  Backpropagation neural network training on XOR with α = 0.7 and η = 0.3  38
Table 3.4  Backpropagation neural network training on XOR with α = 0.5 and η = 0.5  39
Table 3.5  Parity problem of size 3  39
Table 3.6  Parity problem of size 4  40
Table 3.7  Backpropagation neural network training on parity problem of size 3 with α = 0.3 and η = 0.7  41
Table 3.8  Backpropagation neural network training on parity problem of size 4 with α = 0.3 and η = 0.7  41

LIST OF FIGURES

Figure 1.1  Flow chart for classification  2
Figure 2.1  Clustering of points in feature space  6
Figure 2.2  The effect of distance threshold on clustering  7
Figure 2.3  A dendrogram for hierarchical clustering  10
Figure 2.4  A flowchart of agglomerative hierarchical clustering  11
Figure 2.5  Clustering algorithm folder  14
Figure 2.6  Clustering algorithm report generated by CLASS  16
Figure 3.1  A graphical representation of an artificial neural network  18
Figure 3.2  A taxonomy of six neural nets  19
Figure 3.3  A simple neuron  20
Figure 3.4  Linear inseparability of the XOR problem  22
Figure 3.5  A three layer perceptron  24
Figure 3.6  A sigmoid function  26
Figure 3.7  Types of decision regions that can be formed by single- and multi-layer perceptrons with one and two layers of hidden units and two inputs. Shading denotes decision regions for class A. All the nodes use nonlinear activation functions.  27
Figure 3.8  An artificial neuron with an activation function  28
Figure 3.9  Training a weight in the output layer. The subscripts p and q refer to a specific neuron. The subscripts j and k refer to a layer.  30
Figure 3.10  Training a weight in a hidden layer. The subscripts m and p refer to a specific neuron. The subscripts i, j and k refer to a layer.  32
Figure 3.11  Flow chart illustrating the backpropagation training process  34
Figure 3.12  Backpropagation neural network folder  36
Figure 3.13  BPNN algorithm report generated by CLASS  42
Figure 4.1  Aliasing in a discrete time signal  44
Figure 4.2  Flow graph of the decimation-in-time decomposition of an N-point DFT into two N/2-point DFTs, for N = 8  49
Figure 4.3  Flow graph of basic butterfly computation  50
Figure 4.4  Flow graph of Equations 4.28 and 4.29  51
Figure 4.5  Tree diagram depicting bit-reversed sorting  52
Figure 4.6  Flow graph of 8-point discrete Fourier transform  53
Figure A.1  CLASS folders  58
Figure A.2  Clustering algorithm folder  59
Figure A.3  Backpropagation neural network folder  61

ACKNOWLEDGMENTS

I would like to thank my advisor, Dr. Lester Schmerr, for his guidance, direction and support of my research. He introduced me to the fields of nondestructive evaluation and artificial intelligence and was always there to help me with my numerous questions and doubts. I would also like to thank Dr. Chien-Ping Chiou for being patient with all my questions during the last year. I would also like to thank all the professors who taught me in their classes during the last two years.

Finally, I would like to thank my LaTeX guru, Dr. John Samsundar. But for John's suggestions and help, I would have had a tough time getting around the idiosyncrasies of LaTeX. I would like to thank my officemates, Dr. Terry Lerch, Raju Vuppala and Dr. Alex Sedov. They were always there to help me in doing the small things which are so important. I would like to thank my brother Srikanth, who helped me stay focused during the last year. I would also like to thank all my friends for their support and help. Finally, I would like to thank my parents for the encouragement, support and ideals that they imparted to me.

ABSTRACT

An important goal of nondestructive evaluation is the detection and classification of flaws in materials. This process of 'flaw classification' involves the transformation of the 'raw' data into other domains, the extraction of features in those domains, and the use of those features in a classification algorithm that determines the class to which the flaw belongs. In this work, we describe a flaw classification software system, CLASS, and the updates made to it.
Both a hierarchical clustering algorithm and a backpropagation neural network algorithm were implemented and integrated into CLASS. A fast Fourier transform routine was also added to CLASS in order to enable the use of frequency domain and cepstral domain features. This extended version of CLASS is very user-friendly software that requires the user to have little knowledge of the underlying learning algorithms. CLASS can easily be extended further, if needed, in the future.

1 INTRODUCTION

1.1 NDE Flaw Classification

Non-destructive evaluation (NDE) methods place various forms of energy into materials and try to examine the materials without harming them or affecting their performance. Examples of NDE methods and the types of energy they use are ultrasound (acoustic energy), eddy currents (electrical energy), and X-rays (penetrating radiation). One important application of these NDE methods is to find, classify and characterize flaws based on their response to the types of energy present. Here we are interested only in the process of classifying flaws, which involves, for example, tasks such as distinguishing cracks from non-crack-like flaws.
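The overall classification pipeline described above (transform the raw signal into another domain, extract features there, and feed those features to a classification algorithm) can be sketched in a few lines. This is a minimal illustrative sketch only, not the actual CLASS implementation: the synthetic signals, the helper names `extract_features` and `nearest_centroid`, and the nearest-centroid decision rule are all assumptions made for this example (CLASS itself uses hierarchical clustering and a backpropagation neural network).

```python
# Illustrative sketch of a flaw-classification pipeline (NOT the CLASS system):
# raw time-domain signal -> frequency-domain features -> class decision.
import numpy as np

def extract_features(signal, n_features=4):
    """Transform a raw time-domain signal into the frequency domain and
    keep the magnitudes of the first few FFT bins as a feature vector."""
    spectrum = np.abs(np.fft.rfft(signal))
    return spectrum[:n_features]

def nearest_centroid(features, centroids):
    """Assign the feature vector to the class whose centroid is closest
    (a deliberately simple stand-in for a real classifier)."""
    distances = {label: np.linalg.norm(features - c)
                 for label, c in centroids.items()}
    return min(distances, key=distances.get)

# Synthetic "flaw responses": here a crack-like flaw echoes at a higher
# frequency than a void-like flaw (purely illustrative assumption).
t = np.linspace(0.0, 1.0, 64, endpoint=False)
crack = np.sin(2 * np.pi * 3 * t)   # energy concentrated in FFT bin 3
void = np.sin(2 * np.pi * 1 * t)    # energy concentrated in FFT bin 1

centroids = {
    "crack": extract_features(crack),
    "void": extract_features(void),
}

# Classify a noisy unknown signal.
rng = np.random.default_rng(0)
unknown = crack + 0.1 * rng.standard_normal(t.size)
print(nearest_centroid(extract_features(unknown), centroids))  # -> crack
```

The two helper stages mirror the flow chart of Figure 1.1: any feature domain (frequency, cepstrum, etc.) can be swapped into `extract_features`, and any learning algorithm into the decision step, without changing the rest of the pipeline.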