Zheng XU 许正

xuzhustc AT gmail · [Curriculum Vitae] [Google Scholar] [GitHub]


Welcome to Zheng's site! I am a research scientist at Google working on federated learning. I received my PhD from the Department of Computer Science at the University of Maryland, College Park, under the supervision of Dr. Tom Goldstein.

My recent interests include: adversarial attacks and defenses, adaptive step sizes in optimization, primal-dual methods in optimization, neural network acceleration, generative adversarial networks for vision and language, and distributed optimization.

I am also interested in several topics I have previously worked on: low-rank and sparse models, domain adaptation in computer vision, topic models and tensor methods, named entity recognition (NER) and information extraction, image retrieval, and mining information from large-scale multimedia data.

Before joining Maryland, I was a Project Officer in the Visual Computing Research Group, School of Computer Engineering, Nanyang Technological University, Singapore. I received my M.Eng. and B.Eng. from the Department of Electronic Engineering and Information Science at the University of Science and Technology of China (USTC), advised by Dr. Chang Wen Chen, Dr. Bin Liu, and Dr. Houqiang Li. During my master's studies, I was fortunate to have a long-term internship at Microsoft Research Asia (MSRA) with Dr. Xin-Jing Wang.

I have interned at or collaborated with several companies: Apple, Adobe, Honda, Amazon, IBM, Rolls-Royce, and Microsoft.


News

  • Jan 26, released the ADMM code for part of my thesis, "Alternating Optimization: Constrained Problems, Adversarial Networks, and Robust Models". Code for adaptive relaxed ADMM (ARADMM, CVPR'17), adaptive consensus ADMM (ACADMM, ICML'17), and low-rank least squares for visual subcategories (BMVC'15) was previously released on this webpage. Code for adaptive ADMM (AADMM, AISTATS'17), AADMM for nonconvex problems (NeurIPS workshop'16), and adaptive multi-block ADMM (thesis, Chapter 5.1) is included in this package. We also provide implementations of the baseline methods: vanilla ADMM, fast (Nesterov) ADMM, residual balancing, and normalized residual balancing (thesis, Chapter 5.1). A minimal sketch of the vanilla ADMM baseline with residual balancing appears after this list.
  • Dec 15, check out our draft on Advances and Open Problems in Federated Learning.
  • Dec 9 - 14, traveling to NeurIPS in Vancouver. Will help with the "Adversarial Training for Free" poster on Tuesday morning, at the Google booth during the Wednesday afternoon coffee break, and at the federated learning workshop on Friday.
  • Nov 19, "Universal Adversarial Training" has been accepted to AAAI 2020 (acceptance rate 20.6%); congrats to Ali and Mahyar, and thanks to all coauthors.
  • Sep 4, "Adversarial Training for Free" has been accepted to NeurIPS 2019 (acceptance rate 21.2%); congrats to Ali and all coauthors.
  • Jun 28, defended my dissertation. Many thanks to my committee members, collaborators, and audience today. Will join Google as a Research Scientist working on federated learning in September.
  • Jun 17 - 18, attended the Google Federated Learning workshop in Seattle.
  • Mar 30, my internship work at Adobe on style transfer has been accepted to Expressive 2019. The code is here.
  • Jan 31, released MATLAB code for an old project on tensor decomposition.
  • Older news
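For reference, here is a minimal sketch of the vanilla ADMM baseline with residual balancing, illustrated on the lasso problem: minimize (1/2)||Ax - b||^2 + lam*||z||_1 subject to x - z = 0. This is a hypothetical Python example under standard assumptions (scaled-form ADMM and the usual residual-balancing rule with factors mu and tau); the function name and parameters are illustrative only and do not reflect the interface of the released MATLAB package.

    import numpy as np

    def lasso_admm(A, b, lam, rho=1.0, mu=10.0, tau=2.0, iters=200, tol=1e-6):
        """Scaled-form ADMM for the lasso, with residual balancing on rho.
        Hypothetical sketch, not the released code."""
        n = A.shape[1]
        x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
        AtA, Atb = A.T @ A, A.T @ b
        for _ in range(iters):
            # x-update: ridge-regularized least squares
            x = np.linalg.solve(AtA + rho * np.eye(n), Atb + rho * (z - u))
            # z-update: soft thresholding, the prox operator of the l1 norm
            v = x + u
            z_old = z
            z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
            # dual update on the scaled multiplier u = y / rho
            u = u + x - z
            # primal residual r and dual residual s
            r = np.linalg.norm(x - z)
            s = rho * np.linalg.norm(z - z_old)
            if max(r, s) < tol:
                break
            # residual balancing: keep ||r|| and ||s|| within a factor mu
            # of each other by rescaling rho; u must be rescaled with rho
            if r > mu * s:
                rho *= tau
                u /= tau
            elif s > mu * r:
                rho /= tau
                u *= tau
        return x

The adaptive variants above (AADMM, ARADMM, ACADMM) replace this heuristic rho update with automated penalty selection; see the thesis and the linked package for the actual implementations.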