I am a senior research technologist in Memory Technology at Western Digital R&D Engineering, where I have worked since 2021. I was a research scientist in the Computing Technology Lab at Alibaba Research from 2019 to 2021, and a principal researcher at HGST Research from 2014 to 2019. I received my Ph.D. in Electrical Engineering in Sept. 2014 from the Department of Electrical and Computer Engineering, UCSD, where I was also associated with the Center for Memory and Recording Research (CMRR) from 2010 to 2014. My Ph.D. advisor was Prof. Siegel. I entered the Department of Electronic Engineering, Tsinghua University, Beijing, China, in Sept. 2005 and graduated in June 2009.
My research focuses on algorithm and architecture co-design. In AI, I have developed deep learning acceleration methods that jointly optimize deep learning algorithms and hardware architectures. For LLMs specifically, I designed a memory architecture that enables faster, noise-tolerant inference. I am also interested in the robustness of deep neural networks against errors from media and computation, and in training neural networks in the presence of noise. In information theory, I am interested in coding, signal processing, and system architecture design for non-volatile memories; encoding and decoding of BCH, LDPC, and polar codes; and coding for distributed storage. My personal interests also include math puzzles and games.
News
An efficient and accurate dynamic sparse training framework based on parameter freezing, accepted at the AAAI Conference on Artificial Intelligence (AAAI) 2025.