Chinese Poetry Generation with Recurrent Neural Networks

Xingxing Zhang and Mirella Lapata
Institute for Language, Cognition and Computation
School of Informatics, University of Edinburgh
10 Crichton Street, Edinburgh EH8 9AB

Abstract

We propose a model for Chinese poem generation based on recurrent neural networks which we argue is ideally suited to capturing poetic content and form. Our generator jointly performs content selection ("what to say") and surface realization ("how to say") by learning representations of individual characters, and their combinations into one or more lines, as well as how these mutually reinforce and constrain each other. Poem lines are generated incrementally by taking into account the entire history of what has been generated so far rather than the limited horizon imposed by the previous line or lexical n-grams. Experimental results show that our model outperforms competitive Chinese poetry generation systems using both automatic and manual evaluation methods.

相思 Missing You
红豆生南国，(* Z P P Z)
Red berries born in the warm southland.
春来发几枝？(P P Z Z P)
How many branches flush in the spring?
愿君多采撷，(* P P Z Z)
Take home an armful, for my sake,
此物最相思。(* Z Z P P)
As a symbol of our love.

Table 1: An example of a 5-char quatrain exhibiting one of the most popular tonal patterns. The tone of each character is shown at the end of each line (within parentheses); P and Z are shorthands for Ping and Ze tones, respectively; * indicates that the tone is not fixed and can be either. The rhyming characters are 枝 and 思.

1 Introduction

Classical poems are a significant part of China's cultural heritage. Their popularity manifests itself in many aspects of everyday life, e.g., as a means of expressing personal emotion, political views, or communicating messages at festive occasions as well as funerals. Amongst the many different types of classical Chinese poetry, quatrain and regulated verse are perhaps the best-known ones. Both types of poem must meet a set of structural, phonological, and semantic requirements, rendering their composition a formidable task left to the very best scholars.

An example of a quatrain is shown in Table 1. Quatrains have four lines, each five or seven characters long. Characters in turn follow specific phonological patterns, within each line and across lines. For instance, the final characters in the second, fourth and (optionally) first line must rhyme, whereas there are no rhyming constraints for the third line. Moreover, poems must follow a prescribed tonal pattern. In traditional Chinese, every character has one tone, Ping (level tone) or Ze (downward tone). The poem in Table 1 exemplifies one of the most popular tonal patterns (Wang, 2002). Besides adhering to the above formal criteria, poems must exhibit concise and accurate use of language, engage the reader/hearer, stimulate their imagination, and bring out their feelings.

In this paper we are concerned with generating traditional Chinese poems automatically. Although computers are no substitute for poetic creativity, they can analyze very large online text repositories of poems, extract statistical patterns, maintain them in memory, and use them to generate many possible variants. Furthermore, while amateur poets may struggle to remember and apply formal tonal and structural constraints, it is relatively straightforward for the machine to check whether a candidate poem conforms to these requirements.
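Checking these constraints is indeed mechanical: it amounts to comparing each character's tone against a template and testing whether the line-final characters of the second and fourth (and optionally first) lines share a rhyme class. The Python sketch below illustrates this for the quatrain in Table 1; the TONE and RHYME_GROUP dictionaries are toy stand-ins written for this example, not resources used by the authors.

```python
# Minimal sketch of formal constraint checking for a 5-char quatrain.
# TONE and RHYME_GROUP are tiny illustrative tables; a real checker would
# consult a pronunciation lexicon or classical rhyme book instead.

TONE = {"红": "P", "豆": "Z", "生": "P", "南": "P", "国": "Z",
        "春": "P", "来": "P", "发": "Z", "几": "Z", "枝": "P",
        "愿": "Z", "君": "P", "多": "P", "采": "Z", "撷": "Z",
        "此": "Z", "物": "Z", "最": "Z", "相": "P", "思": "P"}

RHYME_GROUP = {"枝": "zhi", "思": "zhi", "国": "guo", "撷": "xie"}

# The tonal pattern of Table 1; '*' means the tone is not fixed.
PATTERN = ["*ZPPZ", "PPZZP", "*PPZZ", "*ZZPP"]

def follows_tones(poem, pattern=PATTERN):
    """Every character's tone must match its slot in the template."""
    for line, template in zip(poem, pattern):
        for char, slot in zip(line, template):
            if slot != "*" and TONE.get(char) != slot:
                return False
    return True

def rhymes(poem):
    """Line-final characters of lines 2 and 4 must share a rhyme class."""
    second = RHYME_GROUP.get(poem[1][-1])
    fourth = RHYME_GROUP.get(poem[3][-1])
    return second is not None and second == fourth

poem = ["红豆生南国", "春来发几枝", "愿君多采撷", "此物最相思"]
print(follows_tones(poem), rhymes(poem))  # True True
```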
Poetry generation has received a fair amount of attention over the past years (see the discussion in Section 2), with dozens of computational systems written to produce poems of varying sophistication. Beyond the long-term goal of building an autonomous intelligent system capable of creating meaningful poems, there are potential short-term applications for computer-generated poetry in the ever-growing industry of electronic entertainment and interactive fiction, as well as in education. An assistive environment for poem composition could allow teachers and students to create poems subject to their requirements, and enhance their writing experience.

We propose a model for Chinese poem generation based on recurrent neural networks. Our generator jointly performs content selection ("what to say") and surface realization ("how to say"). Given a large collection of poems, we learn representations of individual characters, and their combinations into one or more lines, as well as how these mutually reinforce and constrain each other. Our model generates lines in a poem probabilistically: it estimates the probability of the current line given all previously generated lines. We use a recurrent neural network to learn the representations of the lines generated so far, which in turn serve as input to a recurrent language model (Mikolov et al., 2010; Mikolov et al., 2011b; Mikolov et al., 2011a) that generates the current line. In contrast to previous approaches (Greene et al., 2010; Jiang and Zhou, 2008), our generator makes no Markov assumptions about the dependencies of the words within a line and across lines.
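For concreteness, this line-by-line scheme can be written as a chain-rule factorization over lines and characters. The notation below (S_i for the i-th line, w_{i,j} for its j-th character, m for the line length) is our own shorthand for the idea just described, not the paper's exact formulation:

```latex
% Quatrain probability, factorized line by line and character by character.
P(S_1, S_2, S_3, S_4) = P(S_1) \prod_{i=2}^{4} P(S_i \mid S_{1:i-1}),
\qquad
P(S_i \mid S_{1:i-1}) = \prod_{j=1}^{m} P(w_{i,j} \mid w_{i,1:j-1},\, S_{1:i-1})
```

Here the history S_{1:i-1} is summarized by the recurrent network's line representations, which condition the recurrent language model that emits the characters of the current line; because the full history enters every factor, no Markov truncation of the context is required.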
We evaluate our approach on the task of quatrain generation (see Table 1 for a human-written example). Experimental results show that our model outperforms competitive Chinese poetry generation systems using both automatic and manual evaluation methods.

2 Related Work

Automated poetry generation has been a popular research topic over the past decades (see Colton et al. (2012) and the references therein). Most approaches employ templates to construct poems according to a set of constraints (e.g., rhyme, meter, stress, word frequency) in combination with corpus-based and lexicographic resources. For example, the Haiku poem generator presented in Wu et al. (2009) and Tosa et al. (2008) produces poems by expanding user queries with rules extracted from a corpus and additional lexical resources. Netzer et al. (2009) generate Haiku with Word Association Norms, Agirrezabal et al. (2013) compose Basque poems using patterns based on parts of speech and WordNet (Fellbaum, 1998), and Oliveira (2012) presents a generation algorithm for Portuguese which leverages semantic and grammar templates.

A second line of research uses genetic algorithms for poem generation (Manurung, 2003; Manurung et al., 2012; Zhou et al., 2010). Manurung et al. (2012) argue that at a basic level all (machine-generated) poems must satisfy the constraints of grammaticality (i.e., a poem must be syntactically well-formed), meaningfulness (i.e., a poem must convey a message that is meaningful under some interpretation), and poeticness (i.e., a poem must exhibit features that distinguish it from non-poetic text, e.g., metre). Their model generates several candidate poems and then uses stochastic search to find those which are grammatical, meaningful, and poetic.

A third line of research draws inspiration from statistical machine translation (SMT) and related text-generation applications such as summarization. Greene et al. (2010) infer meters (stressed/unstressed syllable sequences) from a corpus of poetic texts which they subsequently use for generation together with a cascade of weighted finite-state transducers interpolated with IBM Model 1. Jiang and Zhou (2008) generate Chinese couplets (two-line poems) using a phrase-based SMT approach which translates the first line into the second line. He et al. (2012) extend this algorithm to generate four-line quatrains by sequentially translating the current line from the previous one. Yan et al. (2013) generate Chinese quatrains based on a query-focused summarization framework. Their system takes a few keywords as input and retrieves the most relevant poems from a corpus collection. The retrieved poems are segmented into their constituent terms, which are then grouped into clusters. Poems are generated by iteratively selecting terms from clusters subject to phonological, structural, and coherence constraints.

Our approach departs from previous work in two important respects. Firstly, we model the tasks of surface realization and content selection jointly using recurrent neural networks. Structural, semantic, and coherence constraints are captured naturally in our framework, through learning the representations of individual characters and their combinations. Secondly, generation proceeds by taking into account multi-sentential context rather than the immediately preceding sentence. Our work joins others in using continuous representations to express the meaning of words and phrases (Socher et al., 2012; Mikolov et al., 2013) and how these may be combined in a language modeling context (Mikolov and Zweig, 2012).

3 The Poem Generator

[Figure 1: Poem generation with keywords spring, lute, and drunk. The keywords are expanded into phrases using a poetic taxonomy. Phrases are then used to generate the first line. Following lines are generated by taking into account the representations of all previously generated lines.]

The user supplies a few keywords (e.g., spring, lute, drunk), which are expanded into candidate phrases using the ShiXueHanYing poetic taxonomy, as illustrated in Figure 1. The generator creates the first line of the poem based on these keywords. Subsequent lines are generated based on all previously generated lines, subject to phonological (e.g., admissible tonal patterns) and structural constraints (e.g., whether the quatrain is five or seven characters long).

To create the first line, we select all phrases corresponding to the user's keywords and generate all possible combinations satisfying the tonal pattern constraints. We use a language model to rank the generated candidates and select the best-ranked one as the first line of the poem.
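As a rough illustration of this first-line procedure, the sketch below expands keywords into phrases from a toy table, enumerates concatenations of the required length, and ranks them with a stand-in scorer. The PHRASES table and score_line function are invented for the example (the actual system draws phrases from the ShiXueHanYing taxonomy and ranks with a trained language model), and a full implementation would also filter candidates by the tonal pattern, as in the earlier sketch.

```python
from itertools import permutations, product

# Toy phrase clusters keyed by keyword; stand-ins for taxonomy entries.
PHRASES = {
    "spring": ["春来", "春风"],
    "lute":   ["琵琶"],
    "drunk":  ["醉", "酒醉"],
}

def candidate_lines(keywords, length=5):
    """Concatenate one phrase per keyword, in every order, and keep the
    combinations whose total length matches the required line length."""
    lines = set()
    for order in permutations(keywords):
        for choice in product(*(PHRASES[k] for k in order)):
            line = "".join(choice)
            if len(line) == length:
                lines.add(line)
    return lines

def score_line(line):
    """Stand-in for a language-model score (higher is better)."""
    return sum(ord(c) for c in line) % 97  # arbitrary but deterministic

def first_line(keywords, length=5):
    """Return the best-scoring candidate line, if any exists."""
    candidates = candidate_lines(keywords, length)
    return max(candidates, key=score_line) if candidates else None

print(first_line(["spring", "lute", "drunk"]))  # prints one 5-character candidate
```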
