<first name>oo.[last name] {at} gmail (dot) com - four o's!
Google Scholar | LinkedIn | Resume
Professional Experience
Machine Learning Engineer @ Meta (June 2024 - Present)
Senior Machine Learning Engineer @ Qualcomm (July 2023 - May 2024)
Intern, Siri @ Apple (May 2022 - Aug 2022)
Research Engineer, Machine Translation @ Samsung Research (Aug 2017 - July 2021)
Education
M.C.S., Computer Science at Texas A&M (May 2023)
B.S., Computer Science at Yonsei University (Aug 2017)
My name is Insoo Chung, and I am interested in applying efficient neural network (NN) methods to address real-life problems :)
Currently, I work at Meta on the Monetization & GenAI team, building LLM-powered tools that help advertisers improve their ad performance across Facebook and Instagram.
I previously focused on on-device neural network optimization, including low-bit quantization for on-device transformers. At Samsung, I worked on research and development of on-device AI products for users of the Galaxy smartphone series; at Qualcomm, I optimized and evaluated on-device inference of large language models (LLMs). I have published relevant work at ACL, EMNLP, ICASSP, and WMT.
Hyo Jung Han*, Seok Chan Ahn*, Yoonjung Choi, Insoo Chung, Sangha Kim, and Kyunghyun Cho. 2021. Monotonic Simultaneous Translation with Chunk-wise Reordering and Refinement. In Proceedings of the Sixth Conference on Machine Translation (WMT2021). [pdf]
Insoo Chung*, Byeongwook Kim*, Yoonjung Choi, Se Jung Kwon, Yongkwon Jeon, Baesung Park, Sangha Kim, and Dongsoo Lee. 2020. Extremely Low Bit Transformer Quantization for On-Device Neural Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2020 (Findings of EMNLP2020). [pdf] [slides]
Sathish Indurthi, Houjeung Han, Nikhil Lakumarapu, Beomseok Lee, Insoo Chung, Sangha Kim, and Chanwoo Kim. 2020. End-end Speech-to-Text Translation with Modality Agnostic Meta-Learning. In ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP2020, short). [pdf]
Sathish Indurthi, Insoo Chung, and Sangha Kim. 2019. Look Harder: A Neural Machine Translation Model with Hard Attention. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL2019, short). [pdf]