
Email: jihong.park (at) deakin.edu.au

Brief Bio: Jihong Park is a Lecturer at the School of IT, Deakin University, Australia. He received the B.S. and Ph.D. degrees from Yonsei University, Seoul, Korea, in 2009 and 2016, respectively. He was a Post-Doctoral Researcher at Aalborg University, Denmark, from 2016 to 2017, and at the University of Oulu, Finland, from 2018 to 2019. His recent research focuses on AI-native semantic communication and distributed machine learning for 6G. He has served as a Conference/Workshop Program Committee Member for IEEE GLOBECOM, ICC, and INFOCOM, as well as for NeurIPS, ICML, and IJCAI. He received the 2023 IEEE Communications Society Heinrich Hertz Award, the 2022 FL-IJCAI Best Student Paper Award, the 2014 IEEE GLOBECOM Student Travel Grant, the 2014 IEEE Seoul Section Student Paper Prize, and the 2014 IDIS-ETNEWS Paper Award. Recently, he co-chaired the 2023 IEEE GLOBECOM Symposium on Machine Learning for Communications. Dr. Park is an Editor of IEEE Transactions on Communications, a Senior Member of the IEEE, and is named in Stanford University's list of the World's Top 2% Scientists.

Recent Updates

Updates in 2020-2023

Selected Publications

Wireless Network Intelligence at the Edge

J. Park, S. Samarakoon, M. Bennis, and M. Debbah. Proceedings of the IEEE, Nov. 2019. Cover Article in Nov. 2019; PIEEE Top 50 Popular Articles during Nov. 2019-Jun. 2020; ResearchGate Top 1% Articles Published in 2019.

TLDR: "Learn to Communicate" and "Communicate to Learn" are equally important for B5G/6G.

Blockchained On-Device Federated Learning

H. Kim, J. Park, M. Bennis, and S.-L. Kim. IEEE Communications Letters, Jun. 2020. Top 50 Popular Articles during Jun. 2020-Present; ResearchGate Top 1% Articles Uploaded in 2019.

TLDR: Blockchain lets federated learning exchange model updates robustly while rewarding the contributing devices.
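The idea can be sketched with a toy ledger (this is an illustration of hash-chained update storage with data-size-proportional rewards, not the paper's actual BlockFL protocol; all names and values are hypothetical):

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's serialized contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class UpdateLedger:
    """Toy hash-chained ledger: each block records one round of local
    model updates and the rewards paid to contributing devices."""

    def __init__(self):
        self.chain = [{"round": 0, "updates": {}, "rewards": {}, "prev": "0" * 64}]

    def append_round(self, updates, reward_per_sample):
        """updates: {device_id: (weights, num_samples)}; rewards scale with data size."""
        rewards = {d: n * reward_per_sample for d, (_, n) in updates.items()}
        block = {
            "round": len(self.chain),
            "updates": {d: w for d, (w, _) in updates.items()},
            "rewards": rewards,
            "prev": block_hash(self.chain[-1]),  # link to the previous block
        }
        self.chain.append(block)
        return rewards

    def verify(self):
        """Tampering with any stored update or reward breaks a hash link."""
        return all(self.chain[i]["prev"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = UpdateLedger()
ledger.append_round({"dev1": ([0.1, 0.2], 100), "dev2": ([0.3, 0.1], 50)},
                    reward_per_sample=0.01)
```

Because every block commits to the hash of its predecessor, no single party can silently rewrite a past model update, which is what makes the exchange robust.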

GADMM: Fast and Communication Efficient Distributed Machine Learning Framework

A. Elgabli, J. Park, A. Bedi, V. Aggarwal, and M. Bennis. Journal of Machine Learning Research (JMLR), Apr. 2020. Acceptance rate: 11.9%; ResearchGate Top 3% Articles Uploaded in 2019.

TLDR: Exchanging model updates only with neighbors guarantees the convergence of distributed learning.
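A minimal numerical sketch of the neighbor-only principle (this is plain gossip averaging over a chain topology, not GADMM itself, which is an ADMM-based primal method; the values are illustrative):

```python
import numpy as np

def neighbor_averaging(x, steps=200, step_size=0.3):
    """Each worker repeatedly moves toward its chain neighbors' values.
    The update x <- (I - step_size * L) x uses the graph Laplacian L of the
    chain, which is symmetric, so the global average is preserved and every
    worker converges to it without any all-to-all communication."""
    x = np.array(x, dtype=float)
    n = len(x)
    for _ in range(steps):
        new_x = x.copy()
        for i in range(n):
            neighbors = [j for j in (i - 1, i + 1) if 0 <= j < n]
            # Mix only with immediate neighbors on the chain.
            new_x[i] += step_size * sum(x[j] - x[i] for j in neighbors)
        x = new_x
    return x

params = [1.0, 4.0, 2.0, 9.0]          # one scalar parameter per worker
result = neighbor_averaging(params)     # all entries approach the mean, 4.0
```

Even though no worker ever talks to more than two others, information diffuses along the chain and all workers reach consensus on the global average, which is the intuition behind communication-efficient neighbor-based schemes.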

Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data

E. Jeong, S. Oh, H. Kim, J. Park, M. Bennis, and S.-L. Kim. NeurIPS MLPCD, Dec. 2018.

TLDR: Exchanging model outputs suffices to enable federated learning; Federated GAN training resolves non-IID data problems.
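The output-exchange idea behind federated distillation can be sketched as follows (a simplified illustration with synthetic logits, not the paper's training pipeline; all data and shapes are hypothetical):

```python
import numpy as np

def local_avg_logits(logits, labels, num_classes):
    """Average each device's model outputs per ground-truth class.
    The result is a (num_classes x output_dim) table whose size is
    independent of the model's parameter count."""
    return np.array([logits[labels == c].mean(axis=0) for c in range(num_classes)])

rng = np.random.default_rng(0)
num_classes, dim = 3, 3

# Two devices with synthetic logits: noise plus a bump on the true class.
device_tables = []
for _ in range(2):
    labels = rng.integers(0, num_classes, size=60)
    logits = rng.normal(size=(60, dim)) + np.eye(num_classes)[labels] * 2.0
    device_tables.append(local_avg_logits(logits, labels, num_classes))

# The server aggregates the tiny per-class tables; each device then uses
# the global averages as soft targets (distillation) for local training.
global_targets = np.mean(device_tables, axis=0)  # shape: (num_classes, dim)
```

The communication payload here is a few floats per class rather than millions of weights, which is why exchanging model outputs suffices and scales so well.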

Tractable Resource Management with Uplink Decoupled Millimeter-Wave Overlay in Ultra-Dense Cellular Networks

J. Park, S.-L. Kim, and J. Zander. IEEE Transactions on Wireless Communications, Jun. 2016. ResearchGate Top 1% Articles Uploaded in 2016.

TLDR: DL and UL data rates in an ultra-dense 5G system scale logarithmically with the BS-to-user density ratio; uplink mmWave is the bottleneck, calling for aid from BS densification and the legacy frequency bands.