MT and Literary Translation - Andy Way

Post date: May 29, 2019 10:19:18 AM


I am at the Computer-Assisted Literary Translation (CALT) Workshop at Swansea University listening to a talk by Andy Way (Dublin City U) on Machine Translation (MT) and Literary Translation. He just showed an amazing number: 143,280,496,726 words are translated through Google Translate every day! Andy agrees that MT might still need a lot of work, but this number does show that MT is here to stay.

He talked about how Neural Machine Translation (NMT) works much better than Phrase-Based Machine Translation (also called SMT, Statistical Machine Translation). Not only does it do the job better, it also reduces the post-editing needed to brush up the output into natural-sounding language. For SMT one needs a large corpus of ST+TT examples, which forms the translation model, plus a large corpus of books in the target language for the fluency/language model. NMT only needs the ST+TT examples.
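To make the two-model setup concrete, here is a minimal toy sketch (not Andy's system, and with made-up probabilities) of the noisy-channel scoring that phrase-based SMT uses: the chosen translation maximises the translation-model score (learned from the ST+TT pairs) multiplied by the language-model score (learned from monolingual target-language text).

```python
# Toy illustration of noisy-channel scoring in phrase-based SMT.
# All sentences and probabilities below are hypothetical.

translation_model = {          # P(source | candidate target), from ST+TT pairs
    "the house is small": 0.6,
    "the home is little": 0.7,
}
language_model = {             # P(candidate target), target-language fluency
    "the house is small": 0.2,
    "the home is little": 0.05,
}

def best_translation(candidates):
    """Pick the candidate maximising translation-model * language-model score."""
    return max(candidates, key=lambda t: translation_model[t] * language_model[t])

print(best_translation(list(translation_model)))
# "the house is small" wins: its fluency score outweighs the slightly
# lower translation-model score of the competing candidate
```

NMT replaces these two separately trained components with a single neural network trained end-to-end on the parallel data, which is why it needs only the ST+TT examples.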

Andy showed his work on MT and demonstrated a huge difference in success between translating different authors (with James Joyce being the most difficult to translate). The MT output was rated by native speakers, and the translations by human translators, PBSMT and NMT were compared; there is real promise in NMT. Checking the time saved using NMT with post-editing showed that for most professional translators, using NMT could save them a lot of time.

Professional translators who were asked to try NMT in an experiment Andy set up did not like using MT, even though keystroke logs showed that it does save them time.

Thank you Andy for mentioning me and my research several times during your talk.