Research

Current Research Projects

With the emergence of ever-larger neural model architectures and new learning tasks, it becomes harder to deploy these powerful models in real-world applications and to understand their behavior and potential shortcomings. My research explores the interplay between model architecture, training data, and the loss and regularization landscapes of neural network models to achieve better efficiency and robustness, while also exploring new learning schemes that co-design the learning task (data) and the model architecture for more controllable and interpretable deep learning. Some ongoing projects in my research include:

Compressing emerging neural network architectures and models for novel learning tasks

Fine-grained understanding and improvement of (large) model performance

Learning diverse ensemble models for better generalizability, robustness, and interpretability

Past Research Projects

I worked on improving the efficiency and robustness of deep neural networks during my PhD studies at Duke University. You can find a brief summary of my previous research projects here.

Beyond my main research topics, I have also participated in research on privacy preservation, federated learning, SW/HW co-design for emerging device architectures, and neural network accelerator architecture design. You can check out my full research spectrum in my publication list.

Research Funding

My research is partially funded by the following companies and grants, on whose projects I serve as a key researcher.