A Large Scale Distributed Syntactic, Semantic and Lexical Language Model for Machine Translation

Ming Tan, Wenli Zhou, Lei Zheng, Shaojun Wang
Wright State University


Abstract

This paper presents an attempt at building a large-scale distributed composite language model that simultaneously accounts for local word lexical information, mid-range sentence syntactic structure, and long-span document semantic content under a directed Markov random field paradigm. The composite language model is trained on corpora of up to a billion tokens, and stored on a supercomputer, by a convergent N-best-list approximate EM algorithm with linear time complexity, followed by an EM algorithm that further improves word prediction power. The large-scale distributed composite language model gives a drastic perplexity reduction over n-gram models and achieves significantly better translation quality, measured by BLEU score and "readability", when applied to the task of re-ranking the N-best lists from a state-of-the-art parsing-based machine translation system.
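
To make the re-ranking step concrete, here is a minimal Python sketch, not the paper's implementation: the actual model composes the lexical, syntactic, and semantic components under a directed Markov random field and is trained by the approximate EM procedure described above, whereas this sketch simply combines three hypothetical component log-probabilities log-linearly and sorts the N-best translation hypotheses by the combined score. The three scorer functions and the interpolation weights are illustrative placeholders, not quantities from the paper.

from typing import List

def lexical_logprob(sentence: List[str]) -> float:
    """Hypothetical local lexical (n-gram-style) component score."""
    return -1.5 * len(sentence)

def syntactic_logprob(sentence: List[str]) -> float:
    """Hypothetical mid-range syntactic component score."""
    return -1.0 * len(sentence)

def semantic_logprob(sentence: List[str]) -> float:
    """Hypothetical long-span document-semantic component score."""
    return -0.5 * len(set(sentence))

def composite_score(sentence: List[str],
                    weights=(0.4, 0.3, 0.3)) -> float:
    """Log-linear combination of the three component log-probabilities."""
    components = (lexical_logprob, syntactic_logprob, semantic_logprob)
    return sum(w * f(sentence) for w, f in zip(weights, components))

def rerank(nbest: List[List[str]]) -> List[List[str]]:
    """Sort N-best translation hypotheses by descending composite score."""
    return sorted(nbest, key=composite_score, reverse=True)

if __name__ == "__main__":
    nbest = [
        "the cat sat on the mat".split(),
        "the cat sat the mat on".split(),
        "cat the mat sat on the".split(),
    ]
    for hyp in rerank(nbest):
        print(f"{composite_score(hyp):8.3f}  {' '.join(hyp)}")

Once per-hypothesis scores are available, re-ranking is just the sort above; the hard part, which the sketch omits, is computing the syntactic and semantic components, since those require sentence parsing and document-level semantic inference respectively.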




Full paper: http://www.aclweb.org/anthology/P/P11/P11-1021.pdf