Deep Learning for Natural Language Processing: Develop Deep Learning Models for Natural Language in Python

Total Pages: 16

File Type: PDF, Size: 1020 KB

By Jason Brownlee

Disclaimer

The information contained within this eBook is strictly for educational purposes. If you wish to apply ideas contained in this eBook, you are taking full responsibility for your actions. The author has made every effort to ensure that the information within this book was accurate at the time of publication. The author does not assume, and hereby disclaims, any liability to any party for any loss, damage, or disruption caused by errors or omissions, whether such errors or omissions result from accident, negligence, or any other cause. No part of this eBook may be reproduced or transmitted in any form or by any means, electronic or mechanical, including recording or by any information storage and retrieval system, without written permission from the author.

Acknowledgements

Special thanks to my copy editor Sarah Martin and my technical editors Arun Koshy and Andrei Cheremskoy.

Copyright

Deep Learning for Natural Language Processing. Copyright 2017 Jason Brownlee. All Rights Reserved. Edition: v1.1.

Contents

  Copyright
  Contents
  Preface

Part I: Introductions

Welcome
  Who Is This Book For?
  About Your Outcomes
  How to Read This Book
  About the Book Structure
  About Python Code Examples
  About Further Reading
  About Getting Help
  Summary

Part II: Foundations

1. Natural Language Processing
  1.1 Natural Language
  1.2 Challenge of Natural Language
  1.3 From Linguistics to Natural Language Processing
  1.4 Natural Language Processing
  1.5 Further Reading
  1.6 Summary

2. Deep Learning
  2.1 Deep Learning is Large Neural Networks
  2.2 Deep Learning is Hierarchical Feature Learning
  2.3 Deep Learning as Scalable Learning Across Domains
  2.4 Further Reading
  2.5 Summary

3. Promise of Deep Learning for Natural Language
  3.1 Promise of Deep Learning
  3.2 Promise of Drop-in Replacement Models
  3.3 Promise of New NLP Models
  3.4 Promise of Feature Learning
  3.5 Promise of Continued Improvement
  3.6 Promise of End-to-End Models
  3.7 Further Reading
  3.8 Summary

4. How to Develop Deep Learning Models With Keras
  4.1 Keras Model Life-Cycle
  4.2 Keras Functional Models
  4.3 Standard Network Models
  4.4 Further Reading
  4.5 Summary

Part III: Data Preparation

5. How to Clean Text Manually and with NLTK
  5.1 Tutorial Overview
  5.2 Metamorphosis by Franz Kafka
  5.3 Text Cleaning Is Task Specific
  5.4 Manual Tokenization
  5.5 Tokenization and Cleaning with NLTK
  5.6 Additional Text Cleaning Considerations
  5.7 Further Reading
  5.8 Summary

6. How to Prepare Text Data with scikit-learn
  6.1 The Bag-of-Words Model
  6.2 Word Counts with CountVectorizer
  6.3 Word Frequencies with TfidfVectorizer
  6.4 Hashing with HashingVectorizer
  6.5 Further Reading
  6.6 Summary

7. How to Prepare Text Data With Keras
  7.1 Tutorial Overview
  7.2 Split Words with text_to_word_sequence
  7.3 Encoding with one_hot
  7.4 Hash Encoding with hashing_trick
  7.5 Tokenizer API
  7.6 Further Reading
  7.7 Summary

Part IV: Bag-of-Words

8. The Bag-of-Words Model
  8.1 Tutorial Overview
  8.2 The Problem with Text
  8.3 What is a Bag-of-Words?
  8.4 Example of the Bag-of-Words Model
  8.5 Managing Vocabulary
  8.6 Scoring Words
  8.7 Limitations of Bag-of-Words
  8.8 Further Reading
  8.9 Summary

9. How to Prepare Movie Review Data for Sentiment Analysis
  9.1 Tutorial Overview
  9.2 Movie Review Dataset
  9.3 Load Text Data
  9.4 Clean Text Data
  9.5 Develop Vocabulary
  9.6 Save Prepared Data
  9.7 Further Reading
  9.8 Summary

10. Project: Develop a Neural Bag-of-Words Model for Sentiment Analysis
  10.1 Tutorial Overview
  10.2 Movie Review Dataset
  10.3 Data Preparation
  10.4 Bag-of-Words Representation
  10.5 Sentiment Analysis Models
  10.6 Comparing Word Scoring Methods
  10.7 Predicting Sentiment for New Reviews
  10.8 Extensions
  10.9 Further Reading
  10.10 Summary

Part V: Word Embeddings

11. The Word Embedding Model
  11.1 Overview
  11.2 What Are Word Embeddings?
  11.3 Word Embedding Algorithms
  11.4 Using Word Embeddings
  11.5 Further Reading
  11.6 Summary

12. How to Develop Word Embeddings with Gensim
  12.1 Tutorial Overview
  12.2 Word Embeddings
  12.3 Gensim Python Library
  12.4 Develop Word2Vec Embedding
  12.5 Visualize Word Embedding
  12.6 Load Google's Word2Vec Embedding
  12.7 Load Stanford's GloVe Embedding
  12.8 Further Reading
  12.9 Summary

13. How to Learn and Load Word Embeddings in Keras
  13.1 Tutorial Overview
  13.2 Word Embedding
  13.3 Keras Embedding Layer
  13.4 Example of Learning an Embedding
  13.5 Example of Using Pre-Trained GloVe Embedding
  13.6 Tips for Cleaning Text for Word Embedding
  13.7 Further Reading
  13.8 Summary

Part VI: Text Classification

14. Neural Models for Document Classification
  14.1 Overview
  14.2 Word Embeddings + CNN = Text Classification
  14.3 Use a Single Layer CNN Architecture
  14.4 Dial in CNN Hyperparameters
  14.5 Consider Character-Level CNNs
  14.6 Consider Deeper CNNs for Classification
  14.7 Further Reading
  14.8 Summary

15. Project: Develop an Embedding + CNN Model for Sentiment Analysis
  15.1 Tutorial Overview
  15.2 Movie Review Dataset
  15.3 Data Preparation
  15.4 Train CNN With Embedding Layer
  15.5 Evaluate Model
  15.6 Extensions
  15.7 Further Reading
  15.8 Summary

16. Project: Develop an n-gram CNN Model for Sentiment Analysis
  16.1 Tutorial Overview
  16.2 Movie Review Dataset
  16.3 Data Preparation ...
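The chapters above center on preparing text and fitting small Keras models (the model life-cycle in Chapter 4, text preparation in Chapters 5 to 7, and the Embedding layer in Chapter 13). As a rough illustration of that workflow, and not code taken from the book, the sketch below integer-encodes a tiny invented corpus and fits a minimal Embedding-based sentiment classifier; the toy documents, the hand-rolled vocabulary, and the layer sizes are all assumptions chosen for brevity.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, Flatten, Dense

# Tiny invented corpus of "reviews" with a positive (1) / negative (0) label.
docs = ['well done', 'great effort', 'poor work', 'not good at all']
labels = np.array([1, 1, 0, 0])

# Hand-rolled integer encoding: map each word to an index (0 is reserved for
# padding) and pad every document to the same length.
vocab = {w: i + 1 for i, w in enumerate(sorted({w for d in docs for w in d.split()}))}
max_len = max(len(d.split()) for d in docs)
encoded = np.array(
    [[vocab[w] for w in d.split()] + [0] * (max_len - len(d.split())) for d in docs],
    dtype='int32')

# Define a small model: learn 8-dimensional word vectors, flatten them into one
# feature vector per document, and predict sentiment with a sigmoid output.
model = Sequential([
    Input(shape=(max_len,), dtype='int32'),
    Embedding(input_dim=len(vocab) + 1, output_dim=8),
    Flatten(),
    Dense(1, activation='sigmoid'),
])

# Compile, fit, and evaluate on the same toy data (enough to show the life-cycle).
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(encoded, labels, epochs=100, verbose=0)
loss, acc = model.evaluate(encoded, labels, verbose=0)
print('Training accuracy: %.2f' % acc)
```

The larger movie-review projects listed in Chapters 10, 15, and 16 presumably scale this same define, compile, fit, and evaluate sequence up to a real vocabulary and much longer sequences.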