Home
I am a senior technologist in Next Generation Platform Technologies at Western Digital Research, where I have worked from 2014 to 2019 and from 2021 to the present. From 2019 to 2021, I was a research scientist in the Computing Technology Lab at Alibaba Research. I received my Ph.D. in Electrical Engineering in Sept. 2014 from the Department of Electrical and Computer Engineering, UCSD, where I was also associated with the Center for Memory and Recording Research (CMRR) from 2010 to 2014. My Ph.D. advisor was Prof. Siegel. I entered the Department of Electronic Engineering, Tsinghua University, Beijing, China in Sept. 2005 and graduated in June 2009.
My research focuses on algorithm and architecture co-design. In AI, I have developed deep learning acceleration techniques that jointly optimize deep learning algorithms and hardware architectures. I am also interested in the robustness of deep neural networks against errors from media and computing, and in training neural networks in the presence of noise. In information theory, I am interested in coding, signal processing, and system architecture design for non-volatile memories; encoding and decoding of BCH, LDPC, and polar codes; and coding for distributed storage. My personal interests also include math puzzles and games.
News
Manuscript "Advancing dynamic sparse training by exploring optimization opportunities" accepted by ICML 2024
Manuscript "NeurRev: Train better sparse neural network practically via neuron revitalization" accepted by ICLR 2024
Manuscript "DISCO: Distributed inference with sparse communications" accepted by WACV 2024