Generating Classical Chinese Poems from Vernacular Chinese

Zhichao Yang1,∗ Pengshan Cai1,∗ Yansong Feng2, Fei Li1, Weijiang Feng3, Elena Suet-Ying Chiu1, Hong Yu1
1 University of Massachusetts, MA, USA ({zhichaoyang, [email protected], [email protected], [email protected], hong [email protected])
2 Institute of Computer Science and Technology, Peking University, China ([email protected])
3 College of Computer, National University of Defense Technology, China ([email protected])
∗ Equal contribution

Abstract

Classical Chinese poetry is a jewel in the treasure house of Chinese culture. Previous poem generation models only allow users to employ keywords to interfere with the meaning of generated poems, leaving the dominion of generation to the model. In this paper, we propose a novel task of generating classical Chinese poems from vernacular Chinese, which allows users to have more control over the semantics of generated poems. We adapt the approach of unsupervised machine translation (UMT) to our task. We use segmentation-based padding and reinforcement learning to address under-translation and over-translation, respectively. According to experiments, our approach significantly improves the perplexity and BLEU compared with typical UMT models. Furthermore, we explored guidelines on how to write the input vernacular to generate better poems. Human evaluation showed our approach can generate high-quality poems which are comparable to amateur poems.

1 Introduction

Over thousands of years, millions of classical Chinese poems have been written. They contain ancient poets' emotions, such as their appreciation for nature, desire for freedom and concern for their countries. Among various types of classical poetry, quatrain poems stand out. On the one hand, their aestheticism and terseness exhibit unique elegance. On the other hand, composing such poems is extremely challenging due to their phonological, tonal and structural restrictions.

Most previous models for generating classical Chinese poems (He et al., 2012; Zhang and Lapata, 2014) are based on limited keywords or characters at fixed positions (e.g., acrostic poems). Since users can only influence the semantics of generated poems through a few input words, the model controls the procedure of poem generation. In this paper, we propose a novel model for classical Chinese poem generation. As illustrated in Figure 1, our model generates a classical Chinese poem based on a vernacular Chinese paragraph. Our objective is not only to make the model generate aesthetic and terse poems, but also to keep the rich semantics of the original vernacular paragraph. Therefore, our model gives users more control over the semantics of generated poems: users shape the poem by carefully writing the vernacular paragraph.

Figure 1: An example of the training procedures of our model. Here we depict two procedures, namely back translation and language modeling. Back translation has two paths, namely $E_S \to D_T \to E_T \to D_S$ and $E_T \to D_S \to E_S \to D_T$. Language modeling also has two paths, namely $E_T \to D_T$ and $E_S \to D_S$. Figure 1 shows only the former one for each training procedure.

Although a great number of classical poems and vernacular paragraphs are easily available, there exist only limited human-annotated pairs of poems and their corresponding vernacular translations. Thus, it is unlikely that such a poem generation model can be trained using supervised approaches. Inspired by unsupervised machine translation (UMT) (Lample et al., 2018b), we treat our task as a translation problem, namely translating vernacular paragraphs to classical poems.

However, our work is not just a straightforward application of UMT. In a training example for UMT, the length difference between the source and target languages is usually not large, but this is not true in our task. Classical poems tend to be more concise and abstract, while vernacular text tends to be detailed and lengthy. Based on our observation of gold-standard annotations, vernacular paragraphs usually contain more than twice as many Chinese characters as their corresponding classical poems. Such a discrepancy leads to two main problems in our preliminary experiments: (1) Under-translation: when summarizing vernacular paragraphs to poems, some vernacular sentences are not translated but ignored by our model. Take the last two vernacular sentences in Figure 1 as examples: they are not covered in the generated poem. (2) Over-translation: when expanding poems to vernacular paragraphs, certain words are unnecessarily translated multiple times. For example, the last sentence in the generated poem of Figure 1, "as green as sapphire", is back-translated as "as green as as as sapphire".

Inspired by the phrase segmentation schema in classical poems (Ye, 1984), we propose the method of phrase-segmentation-based padding to handle under-translation. By padding poems based on the phrase segmentation custom of classical poems, our model better aligns poems with their corresponding vernacular paragraphs and meanwhile lowers the risk of under-translation.
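This excerpt does not spell out the padding procedure itself; the sketch below illustrates one plausible reading, assuming the customary 2+3 phrase split of five-character lines and 2+2+3 split of seven-character lines (the segmentation custom Ye (1984) describes), with a hypothetical <pad> token inserted after each phrase so the padded poem aligns more evenly with the longer vernacular paragraph. The token name and split points are illustrative assumptions, not the authors' exact scheme.

```python
# Conventional phrase boundaries by poem-line length (in characters):
# five-character lines read 2+3, seven-character lines read 2+2+3.
SEGMENTS = {5: (2, 3), 7: (2, 2, 3)}
PAD = "<pad>"  # hypothetical padding token

def pad_poem_line(line: str) -> list[str]:
    """Insert a pad token after each phrase segment of one poem line."""
    segments = SEGMENTS.get(len(line))
    if segments is None:            # unexpected line length: leave as characters
        return list(line)
    tokens, start = [], 0
    for seg_len in segments:
        tokens.extend(line[start:start + seg_len])
        tokens.append(PAD)          # pad after every phrase
        start += seg_len
    return tokens

def pad_poem(poem: str) -> list[str]:
    """Pad every line of a poem whose lines end in '，' or '。'."""
    lines = poem.replace("。", "，").rstrip("，").split("，")
    return [tok for line in lines for tok in pad_poem_line(line)]

# '白日依山尽' -> ['白', '日', '<pad>', '依', '山', '尽', '<pad>']
print(pad_poem("白日依山尽，黄河入海流。"))
```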
Inspired by Paulus et al. (2018), we design a reinforcement learning policy to penalize the model if it generates vernacular paragraphs with too many repeated words. Experiments show our method can effectively decrease the possibility of over-translation.
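The excerpt names the penalty but not its exact form; below is a minimal sketch of one way to realize it, assuming a self-critical policy-gradient setup in the spirit of Paulus et al. (2018), where a sampled back-translation is rewarded for its fraction of distinct tokens and the greedy decode serves as the baseline. The reward shape and baseline choice are assumptions for illustration.

```python
import torch

def repetition_reward(tokens: list[str]) -> float:
    """Fraction of distinct tokens, in [0, 1] -- an assumed reward shape.
    A degenerate output like 'as green as as as sapphire' scores low."""
    return len(set(tokens)) / len(tokens) if tokens else 0.0

def rl_loss(log_probs: torch.Tensor, sampled: list[str],
            greedy: list[str]) -> torch.Tensor:
    """Self-critical policy gradient: the greedy decode's reward serves as
    baseline, so samples with more repeated words than the greedy output
    are pushed down. log_probs holds the per-token log-probabilities of
    the sampled sequence under the model."""
    advantage = repetition_reward(sampled) - repetition_reward(greedy)
    return -advantage * log_probs.sum()

# Example: the degenerate sample gets a negative advantage, so minimizing
# rl_loss lowers its probability under the model.
sampled = "as green as as as sapphire".split()
greedy = "as green as sapphire".split()
log_probs = torch.log_softmax(torch.randn(len(sampled)), dim=0)  # stand-ins
print(float(rl_loss(log_probs, sampled, greedy)))
```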
The contributions of our work are threefold: (1) We propose a novel task of unsupervised Chinese poem generation from vernacular text. (2) We propose using phrase-segmentation-based padding and reinforcement learning to address two important problems in this task, namely under-translation and over-translation. (3) Through extensive experiments, we prove the effectiveness of our models and explore how to write the input vernacular to inspire better poems. Human evaluation shows our models are able to generate high-quality poems, which are comparable to amateur poems.

2 Related Works

Classical Chinese Poem Generation. Most previous works on classical Chinese poem generation focus on improving the semantic coherence of generated poems. Based on LSTM, Zhang and Lapata (2014) proposed generating poem lines incrementally by taking into account the history of what has been generated so far. Yan (2016) proposed a polishing generation schema, in which each poem line is generated incrementally and iteratively, refining the lines one by one. Wang et al. (2016) and Yi et al. (2018) proposed models to keep the generated poems coherent and semantically consistent with the user's intent. There is also research focusing on other aspects of poem generation. Yang et al. (2018) explored increasing the diversity of generated poems using an unsupervised approach. Xu et al. (2018) explored generating Chinese poems from images. While most previous works generate poems based on topic words, our work targets a novel task: generating poems from vernacular Chinese paragraphs.

Unsupervised Machine Translation. Compared with supervised machine translation approaches (Cho et al., 2014; Bahdanau et al., 2015), unsupervised machine translation (Lample et al., 2018a,b) does not rely on human-labeled parallel corpora for training. This technique has been proven to greatly improve the performance of translation systems for low-resource languages (e.g., English-Urdu translation). The unsupervised machine translation framework has also been applied to various other tasks, e.g., image captioning (Feng et al., 2019), text style transfer (Zhang et al., 2018), speech-to-text translation (Bansal et al., 2017) and clinical text simplification (Weng et al., 2019). The UMT framework makes it possible to apply neural models to tasks where limited human-labeled data is available. However, in previous tasks that adopt the UMT framework, the abstraction levels of the source and target language are the same. This is not the case for our task.

Under-Translation & Over-Translation. Both are troublesome problems for neural sequence-to-sequence models. Most previous related research adopts the coverage mechanism (Tu et al., 2016; Mi et al., 2016; Sankaran et al., 2016). However, as far as we know, there has been no successful attempt at applying the coverage mechanism to …

3 Model

Our model is trained with three procedures: parameter initialization, language modeling and back-translation. We will give a detailed introduction to each procedure.

Parameter initialization. As both vernacular text and classical poems use Chinese characters, we initialize the character embeddings of both languages in one common space, where the same character in the two languages shares the same embedding. This initialization helps associate characters with their plausible translations in the other language.

Language modeling. It helps the model generate texts that conform to a certain language. A well-trained language model is able to detect and correct minor lexical and syntactic errors. We train the language models for both vernacular text and classical poems by minimizing the following loss:

$$\mathcal{L}^{lm} = \mathbb{E}_{S \in \mathcal{S}}\left[-\log P\left(S \mid D_s(E_s(S_N))\right)\right] + \mathbb{E}_{T \in \mathcal{T}}\left[-\log P\left(T \mid D_t(E_t(T_N))\right)\right], \tag{1}$$

where $S_N$ (or $T_N$) is generated by adding noise (dropping, swapping or blanking a few words) in $S$ (or $T$).

Back-translation. Based on a vernacular paragraph $S$, we generate a poem $T_S$ using $E_s$ and $D_t$; we then translate $T_S$ back into a vernacular paragraph $S_{T_S} = D_s(E_t(T_S))$. Here, $S$ could be used …
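To make the three procedures concrete, here is a compact sketch of one training step under the UMT framework the paper adapts: a character embedding table shared by both languages (parameter initialization), the denoising loss of Eq. (1) (language modeling), and the two back-translation paths of Figure 1. The tiny GRU encoder/decoder, vocabulary size, noise rates and greedy decoding are illustrative assumptions; only the overall procedure follows the text.

```python
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy sizes; the real vocabulary and model widths are not given in the excerpt.
VOCAB, DIM, PAD_ID, BOS_ID = 1000, 64, 0, 1

# Parameter initialization: both languages share one character embedding
# table, so the same character gets the same vector in both languages.
shared_emb = nn.Embedding(VOCAB, DIM, padding_idx=PAD_ID)

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb, self.rnn = shared_emb, nn.GRU(DIM, DIM, batch_first=True)
    def forward(self, x):                        # x: (batch, len)
        _, h = self.rnn(self.emb(x))
        return h                                 # (1, batch, DIM)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb, self.rnn = shared_emb, nn.GRU(DIM, DIM, batch_first=True)
        self.out = nn.Linear(DIM, VOCAB)
    def forward(self, y_in, h):                  # teacher-forced decoding
        o, _ = self.rnn(self.emb(y_in), h)
        return self.out(o)                       # (batch, len, VOCAB)
    @torch.no_grad()
    def greedy(self, h, max_len=20):             # generation for back-translation
        y = torch.full((h.size(1), 1), BOS_ID, dtype=torch.long)
        for _ in range(max_len):
            next_id = self.forward(y, h)[:, -1:].argmax(-1)
            y = torch.cat([y, next_id], dim=1)
        return y

E_s, D_s = Encoder(), Decoder()                  # vernacular side
E_t, D_t = Encoder(), Decoder()                  # classical-poem side

def add_noise(x, drop=0.1, blank=0.1):
    """Make S_N / T_N of Eq. (1) by randomly dropping or blanking
    characters (a swap operation could be added the same way)."""
    rows = [[t for t in row if random.random() > drop] or [PAD_ID]
            for row in x.tolist()]
    rows = [[PAD_ID if random.random() < blank else t for t in row]
            for row in rows]
    width = max(len(r) for r in rows)
    return torch.tensor([r + [PAD_ID] * (width - len(r)) for r in rows])

def translation_loss(enc, dec, src, tgt):
    """-log P(tgt | dec(enc(src))): the building block of both losses."""
    logits = dec(tgt[:, :-1], enc(src))          # predict the next character
    return F.cross_entropy(logits.reshape(-1, VOCAB),
                           tgt[:, 1:].reshape(-1), ignore_index=PAD_ID)

def umt_step(S, T):
    """One step: Eq. (1) on both sides, then the two back-translation
    paths of Figure 1 (E_S -> D_T -> E_T -> D_S and its mirror image)."""
    lm = (translation_loss(E_s, D_s, add_noise(S), S) +
          translation_loss(E_t, D_t, add_noise(T), T))
    T_S = D_t.greedy(E_s(S))                     # vernacular -> pseudo poem
    S_T = D_s.greedy(E_t(T))                     # poem -> pseudo vernacular
    bt = (translation_loss(E_t, D_s, T_S, S) +   # recover S from T_S
          translation_loss(E_s, D_t, S_T, T))    # recover T from S_T
    return lm + bt

# Toy monolingual batches (first token BOS); poems are much shorter.
S = torch.randint(2, VOCAB, (4, 12)); S[:, 0] = BOS_ID
T = torch.randint(2, VOCAB, (4, 6));  T[:, 0] = BOS_ID
loss = umt_step(S, T)
loss.backward()
print(float(loss))
```

Note that generation for the pseudo pairs runs under no_grad, as in standard UMT recipes: gradients flow only through the reconstruction of the original sentence from its pseudo translation.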
