Insoo Chung (정인수)
<first name>oo.[last name] {at} gmail (dot) com - four o's!
Google Scholar | LinkedIn | Resume
Professional Experience
Machine Learning Engineer @ Meta (June 2024 - Present)
Senior Machine Learning Engineer @ Qualcomm (July 2023 - May 2024)
Intern, Siri @ Apple (May 2022 - August 2022)
Research Engineer, Machine Translation @ Samsung Research (August 2017 - July 2021)
Education
M.C.S., Computer Science at Texas A&M University (May 2023)
B.S., Computer Science at Yonsei University (August 2017)
Interests - efficient transformers!
My name is Insoo Chung, and I am interested in applying efficient neural network methods to real-life problems :)
Currently, I work at Meta on the Monetization & GenAI team.
My research focuses on on-device neural network optimization, including low-bit quantization for on-device transformers. At Samsung, my research contributed to on-device AI features for the Galaxy smartphone series. At Qualcomm, I specialized in transformer optimization for on-device large language models (LLMs) and their evaluation. I have published relevant work at ACL, EMNLP, ICASSP, and WMT.
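To give a feel for what low-bit weight quantization means in practice, here is a minimal PyTorch sketch of symmetric per-channel uniform quantize-dequantize ("fake quantization") applied to a linear-layer weight. This is a generic illustration under my own assumptions (the function name, bit width, and weight shape are made up), not the binary-code-based scheme from the EMNLP 2020 paper below.

```python
import torch

def fake_quantize_per_channel(w: torch.Tensor, num_bits: int = 4) -> torch.Tensor:
    """Symmetric per-output-channel uniform quantize-dequantize of a weight matrix.

    A generic sketch of low-bit weight quantization for illustration only;
    NOT the binary-coding scheme used in the EMNLP 2020 Findings paper.
    """
    qmax = 2 ** (num_bits - 1) - 1                    # e.g. 7 for 4-bit signed range
    scale = w.abs().amax(dim=1, keepdim=True) / qmax  # one scale per output channel
    scale = scale.clamp(min=1e-8)                     # guard against all-zero rows
    q = torch.clamp(torch.round(w / scale), -qmax, qmax)  # integer grid
    return q * scale                                  # dequantize back to float

# Example: quantize a transformer-sized projection weight, measure the error.
w = torch.randn(512, 512)
w_q = fake_quantize_per_channel(w, num_bits=4)
print(f"mean abs error: {(w - w_q).abs().mean():.4f}")
```

Per-channel scales are a common choice here because a single per-tensor scale lets one outlier channel dominate the quantization grid, which hurts accuracy at very low bit widths.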
Publications
Hyo Jung Han*, Seok Chan Ahn*, Yoonjung Choi, Insoo Chung, Sangha Kim, and Kyunghyun Cho. 2021. Monotonic Simultaneous Translation with Chunk-wise Reordering and Refinement. In Proceedings of the Sixth Conference on Machine Translation (WMT2021). [pdf]
Insoo Chung*, Byeongwook Kim*, Yoonjung Choi, Se Jung Kwon, Yongkwon Jeon, Baesung Park, Sangha Kim, and Dongsoo Lee. 2020. Extremely Low Bit Transformer Quantization for On-Device Neural Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2020 (Findings of EMNLP2020). [pdf] [slides]
Sathish Indurthi, Houjeung Han, Nikhil Lakumarapu, Beomseok Lee, Insoo Chung, Sangha Kim, and Chanwoo Kim. 2020. End-end Speech-to-Text Translation with Modality Agnostic Meta-Learning. In ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP2020, short). [pdf]
Sathish Indurthi, Insoo Chung, and Sangha Kim. 2019. Look Harder: A Neural Machine Translation Model with Hard Attention. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL2019, short). [pdf]