Presenter Profile

Keisuke Sakaguchi

Associate Professor
Tohoku University, Graduate School of Information Science

Keisuke Sakaguchi is an associate professor at the NLP lab at Tohoku University. His research interests are in natural language processing, with a focus on commonsense knowledge acquisition and reasoning, designing robust algorithms for noisy texts, building reliable resources and evaluation frameworks, and NLP for educational applications. Keisuke received his Ph.D. from Johns Hopkins University and worked as a research scientist at the Allen Institute for AI (AI2) before joining Tohoku University. His work has received the best paper award at AAAI 2020 and an outstanding paper award at ACL 2017.

TALK TITLE
Twist Decoding: Diverse Generators Guide Each Other

KEYWORDS
Language Generation Models, Twist Decoding, Ensemble Generation

ABSTRACT
Many language generation models are now available for a wide range of generation tasks, including machine translation and summarization. Combining such diverse models may lead to further progress, but ensembling generation models is challenging during inference: conventional ensembling methods (e.g., shallow fusion) require that the models share vocabulary/tokenization schemes. We introduce Twist decoding, a simple and general text generation algorithm that benefits from diverse models at inference time. Our method does not assume that the vocabulary, tokenization, or even generation order is shared. Our extensive evaluations on machine translation and scientific paper summarization demonstrate that Twist decoding substantially outperforms each model decoded in isolation across various scenarios, including cases where domain-specific and general-purpose models are both available. Twist decoding also consistently outperforms the popular reranking heuristic, where output candidates from one model are rescored by another. We hope that our work will encourage researchers and practitioners to examine generation models collectively, not just independently, and to seek out models with complementary strengths to the currently available models.
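The reranking heuristic that the abstract uses as a baseline can be sketched as follows. This is an illustrative toy, not the Twist decoding algorithm itself: the two "models" are stand-in stub functions invented here for demonstration, and the scoring rule is deliberately trivial.

```python
# Reranking baseline (illustrative sketch): model A proposes candidate
# outputs, model B rescores them, and the highest-scoring candidate wins.
# Both models below are hypothetical stubs, not real generation models.

def generate_candidates(source: str) -> list[str]:
    """Stub for model A: pretend it produced these hypotheses for `source`."""
    return [
        "the cat sat on the mat",
        "a cat sits on a mat",
        "the cat is on the mat",
    ]

def rescore(candidate: str) -> float:
    """Stub for model B: pretend it assigns a log-probability.
    As a toy rule, shorter strings score higher."""
    return -float(len(candidate))

def rerank(source: str) -> str:
    """Pick the candidate from model A that model B scores highest."""
    return max(generate_candidates(source), key=rescore)

print(rerank("die Katze sass auf der Matte"))
```

Note that in this scheme model B only rescores finished candidates after model A has decoded; in contrast, the talk's method has the models guide each other during inference, without requiring a shared vocabulary or tokenization.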