Hwichan Kim / 金輝燦
Email: hwichan[at]komachi.live
Hobby: Working out
Interests
Machine Translation
Multilingual NLP
Data efficiency
Pruning
Publications
Journal
Hwichan Kim, Tosho Hirasawa, Sangwhan Moon, Naoaki Okazaki, Mamoru Komachi. North Korean Neural Machine Translation through South Korean Resources. ACM Transactions on Asian and Low-Resource Language Information Processing. Vol. 22, No. 9, September 2023. [paper]
International Conference/Workshop (Refereed)
Hwichan Kim, Shota Sasaki, Sho Hoshino, Ukyo Honda. A Single Linear Layer Yields Task-Adapted Low-Rank Matrices. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024). May 20-25, 2024. [paper]
This work was carried out during a research internship at CyberAgent.
Taisei Enomoto, Tosho Hirasawa, Hwichan Kim, Teruaki Oka, Mamoru Komachi. Simultaneous Domain Adaptation of Tokenization and Machine Translation. In Proceedings of the 37th Pacific Asia Conference on Language, Information and Computation (PACLIC 2023). December 2-5, 2023. [paper]
Hwichan Kim, Mamoru Komachi. Enhancing Few-shot Cross-lingual Transfer with Target Language Peculiar Examples. In Findings of the Association for Computational Linguistics: ACL 2023. July 9-14, 2023. [paper][poster]
Hiroto Tamura, Tosho Hirasawa, Hwichan Kim, Mamoru Komachi. Does Masked Language Model Pre-training with Artificial Data Improve Low-resource Neural Machine Translation? In Findings of the Association for Computational Linguistics: EACL 2023. May 2-4, 2023. [paper]
Hwichan Kim, Sangwhan Moon, Naoaki Okazaki, Mamoru Komachi. Learning How to Translate North Korean through South Korean. In Proceedings of the 13th Conference on Language Resources and Evaluation (LREC 2022). June 20-25, 2022. [paper][poster]
Hwichan Kim, Mamoru Komachi. Can Monolingual Pre-trained Encoder-Decoder Improve NMT for Distant Language Pairs? In Proceedings of the 35th Pacific Asia Conference on Language, Information and Computation (PACLIC 2021). November 5-7, 2021. [paper][slide]
Hwichan Kim, Tosho Hirasawa, Mamoru Komachi. Zero-shot North Korean to English Neural Machine Translation by Character Tokenization and Phoneme Decomposition. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics Student Research Workshop (ACL SRW 2020). July 6-8, 2020. [paper] [slide]
International Conference/Workshop (Non-refereed)
Hwichan Kim, Mamoru Komachi. TMU NMT System with Japanese BART for the Patent Task of WAT 2021. In Proceedings of the 8th Workshop on Asian Translation (WAT 2021). August 6, 2021. [paper][slide]
Hwichan Kim, Tosho Hirasawa, Mamoru Komachi. Korean-to-Japanese Neural Machine Translation System using Hanja Information. In Proceedings of the 7th Workshop on Asian Translation (WAT 2020). December 4, 2020. [paper] [slide]
Domestic Conference/Symposium
木山朔, 金輝燦, 平澤寅庄, 岡照晃, 小町守. Analysis of Decoder-only Machine Translation Models. The 29th Annual Meeting of the Association for Natural Language Processing (NLP2023). March 14-16, 2023. [paper]
榎本大晟, 平澤寅庄, 金輝燦, 岡照晃, 小町守. Domain Adaptation of Tokenizers in Neural Machine Translation. The 29th Annual Meeting of the Association for Natural Language Processing (NLP2023). March 14-16, 2023. [paper]
佐藤郁子, 平澤寅庄, 金輝燦, 岡照晃, 小町守. Analysis of the Correlation between Image-Source Similarity and Translation Quality in Multimodal Machine Translation. The 29th Annual Meeting of the Association for Natural Language Processing (NLP2023). March 14-16, 2023. [paper]
木山朔, 金輝燦, 平澤寅庄, 岡照晃, 小町守. Machine Translation with Causal Language Models. The 17th Symposium of the Young Researchers Association for NLP Studies (YANS2022). August 29, 2022. [poster] (Note: received the Encouragement Award)
榎本大晟, 平澤寅庄, 金輝燦, 岡照晃, 小町守. A Study on Joint Optimization of Tokenization in Machine Translation with Pre-training. The 17th Symposium of the Young Researchers Association for NLP Studies (YANS2022). August 29, 2022. [poster]
中島京太郎, 金輝燦, 平澤寅庄, 岡照晃, 小町守. Feature Analysis of Sentence Selection Methods from Training Data in Machine Translation. The 17th Symposium of the Young Researchers Association for NLP Studies (YANS2022). August 29, 2022. [poster]
佐藤郁子, 平澤寅庄, 金輝燦, 岡照晃, 小町守. Do Images Complementary to the Source Sentence Improve the Translation Quality of MMT Models? The 17th Symposium of the Young Researchers Association for NLP Studies (YANS2022). August 28, 2022. [poster] (Note: received the Encouragement Award)
田村弘人, 平澤寅庄, 金輝燦, 岡照晃, 小町守. Improving Neural Machine Translation by Pre-training on Artificial Data. The 28th Annual Meeting of the Association for Natural Language Processing (NLP2022). March 17, 2022. [paper] [poster]
凌志棟, 相田太一, 金輝燦, 岡照晃, 小林千真, 小町守. Development of a Tool for Domain-wise Analysis of Word Usage Examples Using Japanese BERT. The 28th Annual Meeting of the Association for Natural Language Processing (NLP2022). March 16, 2022. [paper]
金輝燦, 平澤寅庄, 小町守. North Korean Neural Machine Translation by Character Tokenization and Phoneme Decomposition Using South Korean Parallel Data. The 26th Annual Meeting of the Association for Natural Language Processing (NLP2020). March 17, 2020. [paper] [poster]
Professional Experience
Part-time NLP engineer at Rimo Voice (September 2021 - Present)
Information extraction from automatically generated Japanese transcriptions.
Research Assistant at Hitotsubashi University (August 2023 - Present)
Contract Research with the Cabinet Office, Government of Japan (August 2023 - February 2024)
Research Internship at CyberAgent AI Lab (September 2023 - November 2023)
Research Assistant at Tokyo Metropolitan University (April 2022 - March 2023)
Internship at Insight Tech (August 2020 - September 2020)
Classification of Japanese customer reviews.
Grants and Scholarships
"Sotsui" Fellowship Program for Doctoral Students (April 2022 - March 2025)
The Korean Scholarship (April 2020 - March 2022)
Reviewer
Primary
2023: IJCNLP-AACL
Secondary
2022: AACL-IJCNLP, WMT