Haoyi "Bertrand" Xiong 熊昊一, Ph.D.
Principal Applied Scientist
Microsoft
Personal Email: haoyi.xiong.fr@ieee.org
Since May 2024, Dr. Haoyi Xiong has been a Principal Applied Scientist at Microsoft AI APRD. Previously, at the Big Data Laboratory of Baidu Research, he served as a staff engineer (2018--2020), a principal architect (2020--2024), and the laboratory's manager (2022--2024). Before moving to industry, he was a tenure-track assistant professor (2016--2018) in the Department of Computer Science at the Missouri University of Science and Technology (Missouri S&T), Rolla, MO, and a research associate (2015--2016), hosted by Prof. Laura E. Barnes, in the Department of Systems and Information Engineering at the University of Virginia, Charlottesville, VA. He received his Ph.D. in electrical and computer engineering in 2015 from Télécom SudParis (now a grande école of the Institut Polytechnique de Paris) jointly with Université Pierre-et-Marie-Curie, Paris 6 (now the science faculty of Sorbonne Université), Paris, France, where he was supervised by Prof. Daqing Zhang, Dr. Vincent Gauthier, and Prof. Monique Becker (directeur de thèse; see his academic genealogy). He earned his master's degree in information technology from the Hong Kong University of Science and Technology (2010) and his bachelor's degree in electrical engineering from Huazhong University of Science and Technology (2009). He also serves as a graduate faculty scholar affiliated with the electrical and computer engineering Ph.D. program at the University of Central Florida, Orlando, FL. He was named among the World's Top 2% Most Influential Scientists by Stanford University (2023).
"The idea of God: it is the idea of an infinite and perfect being." (L'idée de Dieu, c'est l'idée d'un être infini et parfait.), René Descartes, 1596--1650
Research Interests
Ubiquitous Computing, Data-centric AI, and Explainable AI
Recent Awards [Full List]
Publications [Full List] [Google Scholar] [DBLP] [Thèses]
Some technical works I co-authored with my colleagues.
[NMI'24] Multi-purpose RNA Language Modeling with Motif-aware Pre-training and Type-guided Fine-tuning. Nature Machine Intelligence, 2024. [PDF]
[ICML'24] GiLot: Interpreting Natural Language Generation via Optimal Transport. [PDF]
[IJCAI'24] Geometry-Guided Conditional Adaption for Surrogate Models of Large-Scale 3D PDEs on Arbitrary Geometries. [PDF]
[NeurIPS'23] M4: A Unified XAI Benchmark for Faithfulness Evaluation of Feature Attribution Methods across Metrics, Modalities and Models. [PDF]
[KDD'23] S2phere: Semi-Supervised Pre-training for Web Search over Heterogeneous Learning to Rank Data. [PDF]
[KDD'23] PGLBox: Multi-GPU Graph Learning Framework for Web-Scale Recommendation. [PDF]
[AIJ'23] G-LIME: Statistical Learning for Local Interpretations of Deep Neural Networks using Global Priors. [PDF]
[JMLR'22] InterpretDL: Explaining Deep Models in PaddlePaddle. [PDF]
[AAAI'22] AutoGCL: Automated Graph Contrastive Learning via Learnable View Generators. [PDF]
[MICCAI'22] MUSCLE: Multi-task Self-supervised Continual Learning to Pre-train Deep Models for X-ray Images of Multiple Body Parts. [PDF]
[KDD'21] JIZHI: A Fast and Cost-Effective Model-As-A-Service System for Web-Scale Online Inference at Baidu. [PDF]
[KDD'21] Structure-aware Interactive Graph Neural Networks for the Prediction of Protein-Ligand Binding Affinity. [PDF]
[ICCV'21] Semi-Supervised Active Learning with Temporal Output Discrepancy. [PDF]
[ICML'20] RIFLE: Backpropagation in Depth for Deep Transfer Learning through Re-Initializing the Fully-connected LayEr. [PDF]
[ICML'20] On the Noisy Gradient Descent that Generalizes as SGD. [PDF]
...
Recently, I have also written some perspectives, tutorials, and reviews with my collaborators.
[Preprint] Converging Paradigms: The Synergy of Symbolic and Connectionist AI in LLM-Empowered Autonomous Agents. arXiv:2407.08516 (2024). [PDF]
[Preprint] When Search Engine Services meet Large Language Models: Visions and Challenges. arXiv:2407.00128 (2024). [PDF]
[Preprint] Towards Explainable Artificial Intelligence (XAI): A Data Mining Perspective. arXiv:2401.04374 (2024). [PDF]
[Preprint] Natural language based context modeling and reasoning with LLMs: A tutorial. arXiv:2309.15074 (2023). [PDF]
[HDS'22] Mobile Sensing in the COVID-19 Era: A Review, Science Partner Journal - Health Data Science [PDF]
Mentees
Dr. Jiang Bian
Emeritus Members
Mr. Zhiyuan Wang, Research Intern (2021, Baidu)
now PhD student @ University of Virginia, VA, USA
Dr. Kafeng Wang, Research Intern (2018.08~2019.08, 2020.09~2021.11, Baidu)
now Postdoc @ THU
Dr. Jie Zhang, Postdoctoral Researcher at Peking University, funded by the PKU-Baidu joint research grant
now Lecturer @ Queen's University Belfast, UK
Mr. Yunchao Zhang, Undergraduate Student (2016~2018, MST), Research Intern (2018, Baidu)
now SDE @ Google, WA, United States
Dr. Baoxin Zhao, Research Intern (2019, Baidu)
now R&D Engineer @ Alibaba Cloud, Hangzhou, China
Mr. Jie An, Research Intern (2018~2019, Baidu)
now PhD Student @ University of Rochester, NY, United States
Mr. Jingfeng Wu, Research Intern (2018~2019, Baidu)
now Postdoc @ UC Berkeley, CA, United States
Mr. Ruosi Wan, Research Intern (2018~2019, Baidu)
now MPhil Student @ HKUST, Hong Kong, China
Dr. Yongliang Yang, Visiting PhD Student (2016-2017, MST, primarily supervised by Prof. Don Wunsch)
now Associate Professor @ USTB, Beijing, China
Dr. Xiao Zhong, Postdoc (2016~2017, MST co-advised with Dr. Zeyi Sun)
now Visiting Assistant Professor of Statistics @ Louisiana Tech University, LA, United States
Dr. Yu Huang, Graduate Student (2015-2016, UVA, primarily supervised by Dr. Laura Barnes)
now Assistant Professor, Vanderbilt University, Nashville, TN
Dr. Jinghe Zhang, PhD Student (2015-2016, UVA, primarily supervised by Dr. Laura Barnes)
now Senior Data Scientist @ Apple, TX, United States.
Some personal interests...