
Text Generation Using Different Recurrent Neural Networks

Thesis submitted in partial fulfillment of the requirements for the award of the degree of Master of Engineering in Computer Science and Engineering

Submitted by: Partiksha Taneja (Roll No. 801532041)
Under the supervision of: Dr. Karun Verma, Assistant Professor

COMPUTER SCIENCE AND ENGINEERING DEPARTMENT
THAPAR UNIVERSITY
PATIALA - 147004
JULY 2017

CERTIFICATE

ACKNOWLEDGEMENTS

ABSTRACT

Today, computers have influenced human life to a great extent. To enable communication between computers and humans, natural language techniques have proven to be a very efficient way to exchange information with little manual effort. Generative models reduce the need for acquiring laborious labelling for a dataset. Text generation techniques can be applied to improve language models, machine translation, summarization, and captioning. Text can be generated using Hidden Markov Models and Markov chains, but it is difficult to generate whole sentences with them, so we have used Recurrent Neural Networks (RNNs) and their variants, LSTM and GRU, to develop a language model that can generate whole new text word by word automatically. The research work presented in this thesis focuses on building language models by training different RNNs. The proposed method works in two stages: in the first stage, a simple RNN, an LSTM, and a GRU are trained on different datasets; in the second stage, sampling is done to generate output text. We have considered five different input datasets, and for each dataset all three networks are trained. Finally, all the output texts are compared to conclude which network generates the most realistic text. The variation of training loss with iterations is also examined for all datasets.
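The two-stage method summarized above (train a recurrent network, then sample from it word by word) can be sketched minimally as follows. This is an illustrative sketch only, not the thesis implementation: the thesis uses TensorFlow models of simple RNN, LSTM, and GRU trained on real corpora, whereas here the weights are random, the vocabulary is a hypothetical five-word toy example, and only the sampling stage is shown in full.

```python
import numpy as np

# Hypothetical tiny vocabulary for demonstration.
vocab = ["the", "cat", "sat", "on", "mat"]
word_to_idx = {w: i for i, w in enumerate(vocab)}

rng = np.random.default_rng(0)
hidden_size, vocab_size = 16, len(vocab)

# Randomly initialised weights; in the thesis these would be learned
# in stage one by backpropagation through time on the training corpus.
Wxh = rng.normal(0, 0.1, (hidden_size, vocab_size))
Whh = rng.normal(0, 0.1, (hidden_size, hidden_size))
Why = rng.normal(0, 0.1, (vocab_size, hidden_size))

def step(h, idx):
    """One recurrent step: consume word `idx`, return new state and logits."""
    x = np.zeros(vocab_size)
    x[idx] = 1.0
    h = np.tanh(Wxh @ x + Whh @ h)
    return h, Why @ h

def sample(seed_word, n_words):
    """Stage two: feed a seed word, then repeatedly draw the next word
    from the softmax distribution and feed it back into the network."""
    h = np.zeros(hidden_size)
    idx = word_to_idx[seed_word]
    out = [seed_word]
    for _ in range(n_words):
        h, logits = step(h, idx)
        p = np.exp(logits - logits.max())
        p /= p.sum()
        idx = int(rng.choice(vocab_size, p=p))
        out.append(vocab[idx])
    return out

print(" ".join(sample("the", 5)))
```

With trained weights, the same sampling loop produces text that reflects the statistics of the training corpus; with an LSTM or GRU, only the `step` function changes.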
TABLE OF CONTENTS

CERTIFICATE
ACKNOWLEDGEMENTS
ABSTRACT
LIST OF FIGURES
LIST OF TABLES
ABBREVIATIONS

CHAPTER 1: INTRODUCTION
  Foundation
  Recurrent Neural Networks
    Backpropagation Through Time (BPTT)
    Vanishing Gradient Problem
  Applications of Recurrent Neural Networks
    Language Modeling and Text Generation
    Machine Translation
    Speech Recognition
  Long Short-Term Memory
  Advantages & Limitations of LSTMs
    Advantages
    Limitations
  Gated Recurrent Unit (GRU)
  Tensorflow
    Typical Workflow
    Technologies Used in Tensorflow
    Advantages of Tensorflow
    Limitations of Tensorflow
  Thesis Outline

CHAPTER 2: LITERATURE SURVEY

CHAPTER 3: PROBLEM STATEMENT
  Problem Formulation
  Research Gaps
  Research Objectives

CHAPTER 4: RESEARCH METHODOLOGY
  Tensorflow Installation
  Preprocessing
    Tokenize Text
    Removal of Infrequent Words
    Vocabulary & Inverse Vocabulary Formation
    Building Batches
  Model Building
    Initialization
    Creating Computational Graph
    Forward Pass
    Loss Calculation
    Running Training Session
    Checkpointing
  Sampling
  Output Generation

CHAPTER 5: EXPERIMENTAL RESULTS
  Experiment 1
  Experiment 2
  Experiment 3
  Experiment 4
  Experiment 5

CHAPTER 6: CONCLUSION AND FUTURE SCOPE

REFERENCES
PLAGIARISM REPORT

LIST OF FIGURES

Figure 1.1 Unrolled