NPACS: Optimizer for Deep Learning

Paper accepted at ICMLA (Dec 2019)

Designed a continuation method for the non-convex optimization of deep neural networks. Result: our method (NPACS: Natural Parameter Adaption Continuation with Secant) converged faster and achieved better generalization error than Adam.
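
A minimal NumPy sketch of the general idea behind parameter continuation with secant prediction: solve an easy convex problem first, then trace the solution as a homotopy parameter deforms it into the non-convex target, warm-starting each step with a secant extrapolation. The toy objective, surrogate, and schedule below are illustrative assumptions, not the ones from the paper.

```python
import numpy as np

def f(w):        # non-convex target objective (hypothetical stand-in)
    return np.sin(3 * w).sum() + 0.1 * (w ** 2).sum()

def grad_f(w):
    return 3 * np.cos(3 * w) + 0.2 * w

def grad_g(w):   # gradient of a convex surrogate g(w) = 0.5 * ||w||^2
    return w

def homotopy_grad(w, lam):
    # Gradient of H(w, lam) = (1 - lam) * g(w) + lam * f(w)
    return (1 - lam) * grad_g(w) + lam * grad_f(w)

def minimize(w, lam, lr=0.05, steps=200):
    for _ in range(steps):
        w = w - lr * homotopy_grad(w, lam)
    return w

w_prev = np.zeros(4)                  # start at the convex problem (lam = 0)
w_curr = minimize(w_prev, lam=0.0)
for lam in np.linspace(0.1, 1.0, 10):
    # Secant prediction: extrapolate from the two previous solutions
    w_init = w_curr + (w_curr - w_prev)
    w_prev, w_curr = w_curr, minimize(w_init, lam)

print("final parameters:", w_curr, "target loss:", f(w_curr))
```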

Jan 2019

Continuation Activations

Homotopy activation functions: start simple. We construct a homotopy from a linear to a non-linear activation, so that the non-convex behavior of deep-learning objective functions can be tracked systematically.
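
A hedged PyTorch sketch of one way such a homotopy activation could look: a convex combination of the identity and a non-linearity, with the blend weight annealed over training. The tanh target and the linear schedule suggested below are assumptions for illustration.

```python
import torch
import torch.nn as nn

class HomotopyActivation(nn.Module):
    """Blend between the identity and a non-linear activation.

    lam = 0 gives a purely linear network; lam = 1 recovers the
    usual non-linearity.
    """
    def __init__(self):
        super().__init__()
        self.lam = 0.0  # annealed from 0 to 1 over training

    def forward(self, x):
        return (1.0 - self.lam) * x + self.lam * torch.tanh(x)
```

During training, `lam` can be annealed, e.g. `act.lam = min(1.0, epoch / warmup_epochs)`, so the network starts linear and gradually becomes non-linear.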

Published as a WPI thesis: Thesis Link


Oct 2018

Continuation (C)-SMOTE

Using machine learning to generate additional training points within the support of the data distribution, in order to accelerate the training of neural networks. Result: a 5x speed-up in GAN training and better generative quality than Batch Normalization.
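
For intuition, a minimal SMOTE-style interpolation sketch in NumPy/scikit-learn: new points are drawn on segments between a sample and one of its nearest neighbors, so they stay inside the support of the data distribution. The continuation schedule that distinguishes C-SMOTE is not reproduced here.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smote_like_oversample(X, n_new, k=5, seed=0):
    """Generate n_new synthetic points by interpolating between
    each chosen sample and one of its k nearest neighbors."""
    rng = np.random.default_rng(seed)
    nn_model = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn_model.kneighbors(X)          # idx[:, 0] is the point itself
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        j = idx[i, rng.integers(1, k + 1)]   # one of i's k nearest neighbors
        u = rng.random()                     # position along the segment
        synthetic.append(X[i] + u * (X[j] - X[i]))
    return np.vstack(synthetic)

X = np.random.randn(200, 2)
X_aug = np.vstack([X, smote_like_oversample(X, n_new=100)])
```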

Published as a WPI thesis: Thesis Link

Aug 2018

Step-Up GAN

A data curriculum for GANs. We designed a curriculum that lets GANs converge faster than WGAN and with less bias than Batch Normalization.
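
A hedged sketch of what a resolution-based step-up curriculum can look like in PyTorch: early epochs see heavily downsampled images, and the resolution is raised in stages. The stage sizes and schedule here are assumptions, not the project's exact curriculum.

```python
import torch.nn.functional as F

def curriculum_batch(images, epoch, step_epochs=10, sizes=(8, 16, 32, 64)):
    """Return a batch whose effective resolution grows with the epoch.

    Feeds the GAN low-resolution images first and raises the
    resolution every `step_epochs` epochs.
    """
    stage = min(epoch // step_epochs, len(sizes) - 1)
    size = sizes[stage]
    # Downsample, then upsample back so tensor shapes stay fixed
    low = F.interpolate(images, size=size, mode="bilinear", align_corners=False)
    return F.interpolate(low, size=images.shape[-1],
                         mode="bilinear", align_corners=False)
```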

Project Link1

Project Link2

May 2018

Industry Projects

Image Moderation at Scale

Replaced a slow and costly human moderation system with deep-learning-based solutions.

Medium Blog

June 2019

Attention GAN for Super-Resolution

Using attentional generative adversarial networks to upscale images to 4K.
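
For reference, a SAGAN-style self-attention block in PyTorch, a common way to add attention to a GAN generator; the exact attention variant used in the project may differ.

```python
import torch
import torch.nn as nn

class SelfAttention2d(nn.Module):
    """Self-attention over spatial positions of a feature map
    (assumes channels >= 8 for the query/key bottleneck)."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, hw, c//8)
        k = self.key(x).flatten(2)                     # (b, c//8, hw)
        attn = torch.softmax(q @ k, dim=-1)            # (b, hw, hw)
        v = self.value(x).flatten(2)                   # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return x + self.gamma * out                    # residual connection
```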

Medium Blog

IEEE BigData Paper (Dec 2018)

Jan 2019

Multilingual NLU Systems Development and Deployment

Submitted to the KDD 2020 Applied track

Multilingual understanding of user intent from text content. We perform classification and named-entity recognition (NER), and employ learning-to-rank frameworks to meet our business needs.
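
As a toy illustration of the intent-classification stage only (the real system is multilingual at a much larger scale, and the NER and learning-to-rank components are not shown), a scikit-learn sketch using character n-grams, which transfer across languages without language-specific tokenization. The data and model choice are assumptions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical multilingual utterances and their intent labels
texts = ["reset my password", "réinitialiser mon mot de passe",
         "track my order", "suivre ma commande"]
intents = ["account", "account", "shipping", "shipping"]

# Character n-grams work across languages without tokenization
clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
clf.fit(texts, intents)
print(clf.predict(["où est ma commande"]))  # expected: shipping intent
```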

Jan 2020