Paper accepted at ICMLA (Dec 2019)
Designed a continuation method for the non-convex optimization of deep neural networks. Result: our method (NPACS: Natural Parameter Adaption Continuation with Secant) converged faster and achieved better generalization error than Adam.
Jan 2019
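The NPACS idea above can be illustrated on a toy problem. The sketch below is my own minimal reading of "natural parameter continuation with a secant predictor": a homotopy parameter blends a convex surrogate into a non-convex objective, each step warm-starts from a secant extrapolation of the last two solutions, and a gradient-descent corrector re-converges. The toy losses, learning rate, and step counts are all illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def grad_H(x, lam):
    # Gradient of H(x, lam) = (1 - lam) * x**2 + lam * (x**4 - 3x**2 + x):
    # a convex surrogate blended into a non-convex toy loss (assumed stand-ins).
    return (1 - lam) * 2 * x + lam * (4 * x**3 - 6 * x + 1)

def corrector(x, lam, lr=0.01, steps=200):
    # Corrector: plain gradient descent on H(., lam) from a warm start.
    for _ in range(steps):
        x = x - lr * grad_H(x, lam)
    return x

def npacs_sketch(x0=1.0, n_steps=10):
    # Natural parameter continuation: march lam from 0 (convex) to 1 (non-convex).
    lams = np.linspace(0.0, 1.0, n_steps + 1)
    xs = [corrector(x0, lams[0])]
    for k in range(1, len(lams)):
        if k >= 2:
            # Secant predictor: extrapolate along the last two corrected
            # solutions (uniform lam steps, so the secant step is their difference).
            x_pred = xs[-1] + (xs[-1] - xs[-2])
        else:
            x_pred = xs[-1]
        xs.append(corrector(x_pred, lams[k]))
    return xs[-1]
```

Tracking the solution branch from the convex end lets the corrector start inside a good basin at every step, which is the intuition behind the faster convergence claimed above.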
Homotopy activation functions. Start simple! We apply a homotopy from a linear to a non-linear activation, so that the non-convex behavior of deep learning objective functions can be tracked systematically.
Published as a WPI thesis (Thesis Link)
Oct 2018
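The linear-to-nonlinear homotopy above can be written as a one-line blend. The sketch below assumes tanh as the target non-linearity and a linear anneal schedule for the homotopy parameter; the thesis may use a different activation and schedule.

```python
import numpy as np

def homotopy_activation(x, t):
    # Blend from the identity (t = 0, network is linear) to tanh (t = 1,
    # fully non-linear). tanh and the blend form are assumed for illustration.
    return (1.0 - t) * x + t * np.tanh(x)

def anneal_schedule(epoch, n_epochs):
    # Assumed linear ramp of the homotopy parameter over training.
    return min(1.0, epoch / max(1, n_epochs - 1))
```

At t = 0 a network built from this activation is linear (so its least-squares loss is convex), and increasing t continuously deforms the objective into the usual non-convex one, which is what makes the behavior trackable.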
Used machine learning to generate additional training data points from the support of the data distribution, accelerating neural network training. Result: 5x speed-up in GAN training and better generative quality than Batch Normalization.
Published as a WPI thesis (Thesis Link)
Aug 2018
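"Generating points from the support of the data distribution" can be sketched with a simple KDE-style sampler: draw a real point and perturb it with small Gaussian noise, so new samples stay near the empirical support. This is an illustrative stand-in of my own; the thesis's learned generator and bandwidth choice are not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment_from_support(data, n_new, bandwidth=0.1):
    # Pick real points at random and jitter them with Gaussian noise
    # (a kernel-density-style sampler; bandwidth is an assumed hyperparameter).
    idx = rng.integers(0, len(data), size=n_new)
    return data[idx] + rng.normal(scale=bandwidth, size=(n_new, data.shape[1]))
```

The augmented batch can then be mixed into training alongside the real data.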
Data curriculum for GANs. We designed a curriculum that lets GANs converge faster than WGAN and with less bias than Batch Normalization.
May 2018
Replaced a slow and costly human moderation system with deep-learning-based solutions.
June 2019
Used attentional generative adversarial networks to upscale images to 4K.
Jan 2019
Submitted to KDD Applied track 2020
Multilingual user intent understanding from text content. We classify intents, perform NER, and employ learning-to-rank frameworks aligned with our business needs.
Jan 2020