Unsupervised Text Recap Extraction for TV Series

Hongliang Yu and Shikun Zhang and Louis-Philippe Morency
Language Technologies Institute, Carnegie Mellon University, Pittsburgh, PA 15213, USA
{yuhongliang, shikunz, morency}@cs.cmu.edu

Abstract

Sequences found at the beginning of TV shows help the audience absorb the essence of previous episodes, and grab their attention with upcoming plots. In this paper, we propose a novel task, text recap extraction. Compared with conventional summarization, text recap extraction captures the duality of summarization and plot contingency between adjacent episodes. We present a new dataset, TVRecap, for text recap extraction on TV shows. We propose an unsupervised model that identifies text recaps based on plot descriptions. We introduce two contingency factors, concept coverage and sparse reconstruction, that encourage recaps to prompt the upcoming story development. We also propose a multi-view extension of our model which can incorporate dialogues and synopses. We conduct extensive experiments on TVRecap, and conclude that our model outperforms summarization approaches.

1 Introduction

According to a study by FX Networks, the total number of ongoing scripted TV series in the U.S. hit a new high of 409 on broadcast, cable, and streaming in 2015 [1]. Such a large number indicates there are more shows than anyone can realistically watch. To attract prospective audiences as well as help current viewers recall the key plot when airing new episodes, some TV shows add a clip montage, called a recap sequence, at the beginning of new episodes or seasons. Recaps not only help the audience absorb the essence of previous episodes, but also grab people's attention with upcoming plots. However, creating those recaps for every newly aired episode is labor-intensive and time-consuming. To our advantage, there are many textual scripts freely available online which describe the events and actions happening during the TV show episodes [2]. These textual scripts contain plot descriptions of the events, dialogues of the actors, and sometimes also the synopsis summarizing the whole episode.

These abundant textual resources enable us to study a novel, yet challenging task: automatic text recap extraction, illustrated in Figure 1. The goal of text recap extraction is to identify segments from scripts which both summarize the current episode and prompt the story development of the next episode. This unique task brings new technical challenges as it goes beyond summarizing prior TV episodes, by introducing a concept of plot contingency to the upcoming TV episode. It differs from conventional summarization techniques, which do not consider the interconnectivity between neighboring episodes. Text recaps should capture the duality of summarization and plot contingency between neighboring episodes. To our knowledge, no dataset exists to study this research topic.

[Figure 1: Illustration of text recap extraction. The system extracts sentences from the current episode. The text recap sentences in black summarize the current episode, while colored sentences motivate the next episode.]

In this paper, we present an unsupervised model to automatically extrapolate text recaps of TV shows from plot descriptions. Since we assume recaps should cover the main plot of the current episode and also prompt the story development of the next episode, our model jointly optimizes these two objectives. To summarize the current episode, our model exploits coverage-based summarization techniques. To connect to the next episode, we devise two types of plot contingency factors between adjacent episodes. These factors implement the coverage and reconstruction assumptions to the next episode. We also show how our model can be extended to integrate dialogues and synopses when available.

[1] http://tinyurl.com/jugyyu2
[2] http://www.simplyscripts.com/tv_all.html
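To make the two objectives concrete, the sketch below scores a candidate recap as a weighted combination of how well it covers the current episode and how well it anticipates the next one. This is a toy illustration only: the bag-of-words coverage measure, the trade-off weight lam, and the function names are assumptions made here for exposition, not the model developed in Section 4.

```python
# Toy sketch: score a candidate recap by combining a summarization term
# (coverage of the current episode) with a plot-contingency term
# (coverage of the next episode). All scoring choices are illustrative.
from collections import Counter

def coverage(selected, target_counts):
    """Fraction of the target's word occurrences covered by the selection."""
    covered = Counter()
    for sentence in selected:
        covered.update(sentence.lower().split())
    hits = sum(min(covered[w], c) for w, c in target_counts.items())
    return hits / max(1, sum(target_counts.values()))

def recap_score(selected, current_episode, next_episode, lam=0.5):
    """Trade off summarizing the current episode against contingency
    with the next one; lam is an assumed mixing weight."""
    cur = Counter(w for s in current_episode for w in s.lower().split())
    nxt = Counter(w for s in next_episode for w in s.lower().split())
    return (1 - lam) * coverage(selected, cur) + lam * coverage(selected, nxt)
```

Under this kind of objective, a recap that only repeats the current episode's main plot scores lower than one that also mentions events reappearing in the next episode, which is the intuition behind the contingency factors.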
We introduce a new dataset [3], named TVRecap, for text recap extraction, which consists of TV series with textual scripts, including descriptions, dialogues and synopses. The dataset enables us to study whether contingency-based methods, which exploit relationships between adjacent episodes, can improve summarization-based methods.

The rest of this paper is organized as follows. In Section 2, we discuss related work and the motivation for our work. In Section 3, we introduce our new dataset for text recap extraction. Section 4 explains our proposed model for text recap extraction, and Section 5 expands the model by incorporating synopses and dialogues. In Sections 6 and 7, we present our experimental results and analyses, and we finally conclude our work in Section 8.

[3] http://multicomp.cs.cmu.edu

2 Related Work

In this section, we discuss three related research topics. Text summarization is a relevant task that aims to create a summary that retains the most important points of the original document. Then we discuss the evaluation metrics of text summarization. Finally, we discuss video description, which is complementary to our work.

Generic Text Summarization Algorithms

Text summarization is widely explored in the news domain (Hong and Nenkova, 2014; McKeown, 2005). Generally, there are two approaches: extractive and abstractive summarization.

Extractive summarization forms a summary by choosing the most representative sentences from the original corpus. The early system LEAD (Wasson, 1998) was pioneering work: it selected the leading text of a document as the summary, and was applied in news search to help online customers focus their queries on the beginning of news documents. He et al. (2012) assumed that a summary should consist of the sentences that can best reconstruct the original document, and modeled the relationships among sentences as an optimization problem. Moreover, Sipos et al. (2012) and Lin and Bilmes (2010) studied multi-document summarization using coverage-based methods; among them, Lin and Bilmes (2010) proposed to approximate the optimal solution of a class of objective functions by exploiting submodularity, as sketched below.
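As a concrete illustration of the coverage idea, the following minimal sketch selects sentences greedily under a monotone submodular objective. The distinct-word-coverage objective is a deliberately simple stand-in for the feature-rich objectives of Lin and Bilmes (2010); what matters is the greedy rule, which enjoys a (1 - 1/e) approximation guarantee for monotone submodular functions under a cardinality constraint.

```python
# Minimal sketch of greedy extractive summarization with a monotone
# submodular objective (here: number of distinct words covered).
def greedy_summary(sentences, budget):
    selected, covered = [], set()
    candidates = list(sentences)
    for _ in range(budget):
        best, best_gain = None, 0
        for s in candidates:
            gain = len(set(s.lower().split()) - covered)  # marginal gain
            if gain > best_gain:
                best, best_gain = s, gain
        if best is None:  # every remaining sentence adds nothing new
            break
        selected.append(best)
        covered |= set(best.lower().split())
        candidates.remove(best)
    return selected
```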
Abstractive summarization automatically creates new sentences. For example, in contrast to the sentence-level analysis of extractive summarization, Bing et al. (2015) explored fine-grained syntactic units, i.e., noun/verb phrases, to represent concepts in the input documents; the informative phrases were then used to generate sentences.

In this paper, we generalize the idea of text summarization to text recap extraction. Instead of summarizing a given document or collection, our model emphasizes plot contingency with the next episode.

Summarization Applications

Summarization techniques are not restricted to informative resources (e.g. news); applications in broader areas are gaining attention (Aparício et al., 2016). With the prevalence of online forums, Misra et al. (2015) developed tools to recognize arguments from opinionated conversations and group them across discussions. In the entertainment industry, Sang and Xu (2010) proposed a character-based movie summarization approach by incorporating scripts.

3 The TVRecap Dataset

We collected a new dataset, called TVRecap, for text recap extraction on TV series. We gathered and processed scripts, subtitles and synopses from websites [4] as components to build our model upon. We also established ground truth to help future research on this challenging topic. TVRecap includes all seasons of the widely-known show "Lost", with a total of 106 episodes. Statistics of our dataset are shown in Table 1.

              # sent.    avg. # sent.    # words    avg. # w./s.
description    14,686       138.5        140,684        9.57
dialogue       37,714       355.8        284,514        7.54
synopsis          453        4.27          7,868       17.36
recap             619       17.19          5,892        9.52

Table 1: Statistics of TVRecap.
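For reference, a hypothetical helper like the one below could reproduce Table 1-style statistics from a local copy of the data. The one-sentence-per-line file layout and the glob pattern are assumptions about how episode files might be stored, not TVRecap's actual distribution format.

```python
# Hypothetical helper: per-corpus sentence/word counts as in Table 1.
import glob

def corpus_stats(pattern):
    n_files = n_sent = n_words = 0
    for path in glob.glob(pattern):
        n_files += 1
        with open(path, encoding="utf-8") as f:
            for line in f:
                if line.strip():          # one sentence per line (assumed)
                    n_sent += 1
                    n_words += len(line.split())
    return {"# sent.": n_sent,
            "avg. # sent.": n_sent / max(1, n_files),
            "# words": n_words,
            "avg. # w./s.": n_words / max(1, n_sent)}

# e.g. corpus_stats("lost/descriptions/*.txt")  # path is hypothetical
```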