Machine Translation with a Stochastic Grammatical Channel

Dekai Wu and Hongsing Wong
HKUST Human Language Technology Center
Department of Computer Science
University of Science and Technology
Clear Water Bay, Hong Kong
{dekai,wong}@cs.ust.hk

Abstract

We introduce a stochastic grammatical channel model for machine translation that synthesizes several desirable characteristics of both statistical and grammatical machine translation. As with the pure statistical translation model described by Wu (1996), in which a bracketing transduction grammar models the channel, alternative hypotheses compete probabilistically, exhaustive search of the translation hypothesis space can be performed in polynomial time, and robustness heuristics arise naturally from a language-independent inversion-transduction model. However, unlike pure statistical translation models, the generated output string is guaranteed to conform to a given target grammar. The model employs only (1) a translation lexicon, (2) a context-free grammar for the target language, and (3) a bigram language model. The fact that no explicit bilingual translation rules are used makes the model easily portable to a variety of source languages. Initial experiments show that it also achieves significant speed gains over our earlier model.

1 Motivation

Speed of statistical machine translation methods has long been an issue. A step was taken by Wu (1996), who introduced a polynomial-time algorithm for the runtime search for an optimal translation. To achieve this, Wu's method substituted a language-independent stochastic bracketing transduction grammar (SBTG) in place of the simpler word-alignment channel models reviewed in Section 2. The SBTG channel made exhaustive search possible through dynamic programming, instead of previous "stack search" heuristics. Translation accuracy was not compromised, because the SBTG is apparently flexible enough to model word-order variation (between English and Chinese) even though it eliminates large portions of the space of word alignments. The SBTG can be regarded as a model of the language-universal hypothesis that closely related arguments tend to stay together (Wu, 1995a; Wu, 1995b).

In this paper we introduce a generalization of Wu's method with the objectives of

1. increasing translation speed further,
2. improving meaning-preservation accuracy,
3. improving grammaticality of the output, and
4. seeding a natural transition toward transduction rule models,

under the constraint of employing no additional knowledge resources except a grammar for the target language. To achieve these objectives, we:

- replace Wu's SBTG channel with a full stochastic inversion transduction grammar or SITG channel, discussed in Section 3, and
- (mis-)use the target language grammar as a SITG, discussed in Section 4.

In Wu's SBTG method, the burden of generating grammatical output rests mostly on the bigram language model; explicit grammatical knowledge cannot be used. As a result, output grammaticality cannot be guaranteed. The advantage is that language-dependent syntactic knowledge resources are not needed.

We relax those constraints here by assuming a good (monolingual) context-free grammar for the target language. Compared to other knowledge resources (such as transfer rules or semantic ontologies), monolingual syntactic grammars are relatively easy to acquire or construct. We use the grammar in the SITG channel, while retaining the bigram language model. The new model facilitates explicit coding of grammatical knowledge and finer control over channel probabilities. Like Wu's SBTG model, the translation hypothesis space can be exhaustively searched in polynomial time, as shown in Section 5. The experiments discussed in Section 6 show promising results for these directions.

2 Review: Noisy Channel Model

The statistical translation model introduced by IBM (Brown et al., 1990) views translation as a noisy channel process. The underlying generative model contains a stochastic Chinese (input) sentence generator whose output is "corrupted" by the translation channel to produce English (output) sentences. Assume, as we do throughout this paper, that the input language is English and the task is to translate into Chinese. In the IBM system, the language model employs simple n-grams, while the translation model employs several sets of parameters as discussed below. Estimation of the parameters has been described elsewhere (Brown et al., 1993). Translation is performed in the reverse direction from generation, as usual for recognition under generative models. For each English sentence e to be translated, the system attempts to find the Chinese sentence c* such that:

    c* = argmax_c Pr(c|e) = argmax_c Pr(e|c) Pr(c)        (1)

In the IBM model, the search for the optimal c* is performed using a best-first heuristic "stack search" similar to A* methods. One of the primary obstacles to making the statistical translation approach practical is the slow speed of translation, as performed in A* fashion. This price is paid for the robustness that is obtained by using very flexible language and translation models. The language model allows sentences of arbitrary order and the translation model allows arbitrary word-order permutation. No structural constraints or explicit linguistic grammars are imposed by this model.

The translation channel is characterized by two sets of parameters: translation and alignment probabilities.[1] The translation probabilities describe lexical substitution, while alignment probabilities describe word-order permutation. The key problem is that the formulation of alignment probabilities a(i|j, V, T) permits the English word in position j of a length-T sentence to map to any position i of a length-V Chinese sentence. So V^T alignments are possible, yielding an exponential space with correspondingly slow search times.

[1] Various models have been constructed by the IBM team (Brown et al., 1993). This description corresponds to one of the simplest ones, "Model 2"; search costs for the more complex models are correspondingly higher.
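To make the decision rule in Equation (1) concrete, the following sketch scores a small candidate set under a word-for-word channel and a bigram language model. It is purely illustrative, not the paper's algorithm: the probability tables and romanized Chinese tokens are invented, the alignment is assumed monotone for brevity (IBM Model 2 instead sums over all alignments with probabilities a(i|j, V, T)), and a real system searches rather than enumerating candidates.

```python
import math

# Invented toy parameters: CHANNEL holds Pr(e_word | c_word), BIGRAM
# holds Pr(c_j | c_{j-1}); real systems estimate these from parallel
# corpora (Brown et al., 1993). Chinese words are romanized stand-ins.
CHANNEL = {("I", "wo"): 0.9, ("took", "nale"): 0.8,
           ("a", "yiben"): 0.7, ("book", "shu"): 0.9}
BIGRAM = {("<s>", "wo"): 0.4, ("wo", "nale"): 0.3,
          ("nale", "yiben"): 0.5, ("yiben", "shu"): 0.6}

def log_score(e_words, c_words):
    """log [Pr(e|c) Pr(c)] with a word-for-word channel and bigram LM."""
    lp = 0.0
    # Channel term Pr(e|c): a monotone one-to-one alignment is assumed
    # here only to keep the sketch short.
    for e, c in zip(e_words, c_words):
        lp += math.log(CHANNEL.get((e, c), 1e-12))
    # Language-model term Pr(c): bigram chain over the Chinese string.
    for prev, cur in zip(["<s>"] + c_words, c_words):
        lp += math.log(BIGRAM.get((prev, cur), 1e-12))
    return lp

def decode(e_words, candidates):
    """Equation (1): argmax over candidate Chinese sentences."""
    return max(candidates, key=lambda c: log_score(e_words, c))

print(decode(["I", "took", "a", "book"],
             [["wo", "nale", "yiben", "shu"],
              ["shu", "yiben", "nale", "wo"]]))
# -> ['wo', 'nale', 'yiben', 'shu']
```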
3 A SITG Channel Model

The translation channel we propose is based on the recently introduced bilingual language modeling approach. The model employs a stochastic version of an inversion transduction grammar or ITG (Wu, 1995c; Wu, 1995d; Wu, 1997). This formalism was originally developed for the purpose of parallel corpus annotation, with applications for bracketing, alignment, and segmentation. Subsequently, a method was developed to use a special case of the ITG, the aforementioned BTG, for the translation task itself (Wu, 1996). The next few paragraphs briefly review the main properties of ITGs, before we describe the SITG channel.

An ITG consists of context-free productions where terminal symbols come in couples, for example x/y, where x is an English word and y is a Chinese translation of x, with singletons of the form x/ε or ε/y representing function words that are used in only one of the languages. Any parse tree thus generates both English and Chinese strings simultaneously. Thus, the tree:

(1) [I/我 [[took/拿了 [a/一本 book/书]NP ]VP [for/给 you/你]PP ]VP ]S

produces, for example, the mutual translations:

(2) a. [我 [[拿了 [一本书]NP ]VP [给你]PP ]VP ]S
    b. [I [[took [a book]NP ]VP [for you]PP ]VP ]S

An additional mechanism accommodates a conservative degree of word-order variation between the two languages. With each production of the grammar is associated either a straight orientation or an inverted orientation, respectively denoted as follows:

    VP → [VP PP]
    VP → ⟨VP PP⟩

In the case of a production with straight orientation, the right-hand-side symbols are visited left-to-right for both the English and Chinese streams. But for a production with inverted orientation, the right-hand-side symbols are visited left-to-right for English and right-to-left for Chinese. Thus, the tree:

(3) [I/我 ⟨[took/拿了 [a/一本 book/书]NP ]VP [for/给 you/你]PP ⟩VP ]S

produces translations with different word order:

(4) a. [I [[took [a book]NP ]VP [for you]PP ]VP ]S
    b. [我 [[给你]PP [拿了 [一本书]NP ]VP ]VP ]S

The surprising ability of ITGs to accommodate nearly all word-order variation between fixed-word-order languages[2] (English and Chinese in particular) has been analyzed mathematically, linguistically, and experimentally (Wu, 1995b; Wu, 1997). Any ITG can be transformed to an equivalent binary-branching normal form.

[2] With the exception of higher-order phenomena such as neg-raising and wh-movement.

A stochastic ITG associates a probability with each production. It follows that a SITG assigns a probability Pr(e, c, q) to all generable trees q and sentence-pairs. In principle it can be used as the translation channel model by normalizing with Pr(c) and integrating out Pr(q) to give Pr(e|c) in Equation (1). In practice, a strong language model makes this unnecessary, so we can instead optimize the simpler Viterbi approximation

    c* = argmax_{c,q} Pr(e, c, q) Pr(c)        (2)

To complete the picture we add a bigram model g(c_j | c_{j-1}) for the Chinese language model Pr(c).

This approach was used for the SBTG channel (Wu, 1996), using the language-independent bracketing degenerate case of the SITG.
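The orientation mechanism is straightforward to operationalize. The sketch below is our own illustration (not code from the paper): it encodes trees like (1) and (3) with lexical couples and oriented nodes, and reads off both output strings, reversing the Chinese side under inverted nodes. The Chinese words appear as invented romanizations.

```python
from dataclasses import dataclass
from typing import List, Optional, Union

@dataclass
class Couple:
    """A lexical couple x/y; None on one side models singletons x/ε, ε/y."""
    eng: Optional[str]
    chi: Optional[str]

@dataclass
class Node:
    """An ITG constituent: straight ([...]) or inverted (<...>)."""
    children: List[Union["Couple", "Node"]]
    inverted: bool = False

def read_off(t):
    """Return the (English, Chinese) word lists an ITG tree generates."""
    if isinstance(t, Couple):
        return ([t.eng] if t.eng else [], [t.chi] if t.chi else [])
    eng, chi = [], []
    for child in t.children:
        e, c = read_off(child)
        eng += e                                  # English: always left-to-right
        chi = c + chi if t.inverted else chi + c  # Chinese: right-to-left if inverted
    return eng, chi

# Tree (3), with romanized stand-ins for the Chinese words:
# [ I/wo < [took/nale [a/yiben book/shu]NP ]VP [for/gei you/ni]PP >VP ]S
np = Node([Couple("a", "yiben"), Couple("book", "shu")])
vp = Node([Couple("took", "nale"), np])
pp = Node([Couple("for", "gei"), Couple("you", "ni")])
s = Node([Couple("I", "wo"), Node([vp, pp], inverted=True)])

eng, chi = read_off(s)
print(" ".join(eng))  # I took a book for you      -- cf. (4a)
print(" ".join(chi))  # wo gei ni nale yiben shu   -- cf. (4b)
```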
4.1 Production Mirroring

The first step is to convert the monolingual Chinese CFG to a bilingual ITG. The production mirroring tactic simply doubles the number of productions, transforming every monolingual production into two bilingual productions, one straight and one inverted, as for example in Figure 1, where the upper Chinese CFG becomes the lower ITG. The intent of the mirroring is to add enough flexibility to allow parsing of English sentences using the language 1 side of the ITG.

    S  → NP VP Punc
    VP → V NP
    NP → N Mod N | Prn

    S  → [NP VP Punc] | ⟨Punc VP NP⟩
    VP → [V NP] | ⟨NP V⟩
    NP → [N Mod N] | ⟨N Mod N⟩ | [Prn]

Figure 1: An input CFG and its mirrored ITG.
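Production mirroring itself is mechanical, as a short sketch (ours, using a made-up rule representation) shows: every production contributes a straight form, and every production with more than one right-hand-side symbol also contributes an inverted form with the symbols reversed, reproducing the lower grammar of Figure 1.

```python
def mirror(cfg_rules):
    """Mirror a monolingual (Chinese) CFG into an ITG, as in Figure 1.

    Each production yields a straight form; productions with more than
    one right-hand-side symbol also yield an inverted form with the
    symbols reversed, so the Chinese (language 2) reading stays
    identical to the original rule while the English (language 1)
    reading may flip. Unary rules gain nothing from inversion, hence
    NP -> [Prn] has no inverted twin.
    """
    itg = []
    for lhs, rhs in cfg_rules:
        itg.append((lhs, rhs, "[]"))            # straight orientation
        if len(rhs) > 1:
            itg.append((lhs, rhs[::-1], "<>"))  # inverted orientation
    return itg

# The upper (Chinese CFG) half of Figure 1.
cfg = [
    ("S", ("NP", "VP", "Punc")),
    ("VP", ("V", "NP")),
    ("NP", ("N", "Mod", "N")),
    ("NP", ("Prn",)),
]

# Prints the lower (ITG) half of Figure 1, one production per line,
# e.g. "S -> [NP VP Punc]" followed by "S -> <Punc VP NP>".
for lhs, rhs, brackets in mirror(cfg):
    print(f"{lhs} -> {brackets[0]}{' '.join(rhs)}{brackets[1]}")
```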
