Currently, I am a Machine Learning Scientist-3 at Expedia, where I research and develop end-to-end machine learning systems with significant business impact. I have been working on a variety of ranking, NLP, and vision projects to meet business requirements, and I currently lead the project to recommend activities on Expedia.com, communicating deliverables with product stakeholders. I am also a Ph.D. candidate at WPI — yes, it's part-time and self-funded. My research is on continuation methods for deep learning optimization (a.k.a. curriculum learning), which keeps me highly motivated as I come closer to a theoretical understanding of deep learning. I have worked on a diverse set of computer vision applications, for example GANs, image classification, compression, and object detection. In NLP, I have completed industry projects of a year or more in length, including text classification, named entity recognition, text similarity, and learning-to-rank frameworks. Previously, I won a few national-level competitions in fields such as mobile app development, IoT & data analytics, and web development.
I am open to collaboration. If you find my work influential to your progress and would like to contribute or fund my Ph.D., let me know :) (harshnpathak@gmail.com)
Research Projects
[In Progress] Extending Continuation Methods on the Complex Plane for Neural Networks
Image credit: Bertini. PhD Thesis: studying dynamical systems on the complex plane to eliminate singularities.
Present
Bifurcation Analysis of Neural Networks
PhD Thesis: studying the dynamical properties of neural network training/optimization using continuation methods and bifurcation analysis. On the left is the classic example of bifurcating solutions of a logistic model.
Present
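The logistic model mentioned above is the standard textbook example: iterating x ← r·x·(1 − x) gives a single stable fixed point for small growth rate r, which period-doubles as r increases. A minimal sketch of that bifurcation (illustrative only; not code from the thesis):

```python
def logistic(r, x):
    """One step of the logistic map x -> r * x * (1 - x)."""
    return r * x * (1 - x)

def attractor(r, x0=0.5, burn_in=1000, samples=8):
    """Iterate past the transient, then collect the distinct values visited."""
    x = x0
    for _ in range(burn_in):
        x = logistic(r, x)
    pts = set()
    for _ in range(samples):
        x = logistic(r, x)
        pts.add(round(x, 6))
    return sorted(pts)

# Before the first bifurcation (r < 3): one stable fixed point x* = 1 - 1/r.
print(attractor(2.5))  # a single value near 0.6
# After the first period-doubling bifurcation (r > 3): a stable 2-cycle.
print(attractor(3.2))  # two values
```

Sweeping r and plotting the attractor for each value reproduces the familiar bifurcation diagram.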
Principled Curriculum Learning using Parameter Continuation Methods
ICML 2021 Workshop on Beyond First-Order Methods in ML
Creator of Continuation-JAX, an adaptive curriculum library
May 2021
Model Curriculum in Teacher Student Curriculum Learning
Dec 2020
NPACS: Optimizer for Deep Learning
Paper accepted at ICMLA (2019)
Extended Chapter on Springer (2020)
Designed a continuation method for the non-convex optimization of deep neural networks. Result: our method (NPACS: Natural Parameter Adaption Continuation with Secant) converged faster and achieved better generalization error than Adam for autoencoders.
Sep 2020
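For readers unfamiliar with continuation methods: the general predictor-corrector pattern traces a solution of an easy problem to a solution of a hard one by sweeping a homotopy parameter t from 0 to 1, predicting each new solution with a secant step and correcting it with a local solver. The sketch below is a generic illustration on a toy root-finding problem, not NPACS itself (the helper names and the toy problem are my own choices):

```python
def newton(f, df, x, tol=1e-10, max_iter=50):
    """Corrector: plain Newton's method on a scalar equation f(x) = 0."""
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def secant_continuation(g0, dg0, g1, dg1, x_start, steps=20):
    """Trace the root of H(x, t) = (1 - t) g0(x) + t g1(x) as t goes 0 -> 1.
    Predictor: secant through the two previous solutions; corrector: Newton."""
    ts = [i / steps for i in range(steps + 1)]
    xs = [x_start]
    for k, t in enumerate(ts[1:], start=1):
        H = lambda x: (1 - t) * g0(x) + t * g1(x)
        dH = lambda x: (1 - t) * dg0(x) + t * dg1(x)
        # Secant predictor (fall back to the last point on the first step).
        x_pred = xs[-1] if k < 2 else 2 * xs[-1] - xs[-2]
        xs.append(newton(H, dH, x_pred))
    return xs[-1]

# Easy problem: x - 1 = 0 (root known); hard problem: x^3 - x - 2 = 0.
root = secant_continuation(
    lambda x: x - 1, lambda x: 1.0,
    lambda x: x**3 - x - 2, lambda x: 3 * x**2 - 1,
    x_start=1.0,
)
```

In the deep learning setting the "root" becomes a minimizer of the training objective and the corrector a few gradient steps, but the predictor-corrector structure is the same.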
Continuation Activations
Homotopy activation functions. Start simple! We perform a homotopy from linear to non-linear activations, so that the non-convex behavior of deep learning objective functions can be tracked systematically.
Published at WPI Thesis Link
Oct 2018
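The linear-to-non-linear homotopy above can be sketched in one line: blend the identity with a saturating non-linearity and sweep the blend weight during training. A minimal illustration (tanh is an assumed target non-linearity; the thesis may use others):

```python
import math

def homotopy_activation(x, t):
    """Blend from the identity (t = 0) to tanh (t = 1).
    At t = 0 the network is purely linear, so the objective is easy;
    increasing t introduces non-convexity gradually."""
    return (1 - t) * x + t * math.tanh(x)

# The same input passes through progressively more non-linear activations.
for t in (0.0, 0.5, 1.0):
    print(round(homotopy_activation(2.0, t), 4))
```

Scheduling t from 0 to 1 over the course of training gives a curriculum over model classes rather than over data.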
Continuation (C)-SMOTE
Using machine learning to generate additional training data points, given the support of the data distribution, to accelerate neural network training. Result: a 5x speed-up in GANs and better generative quality than Batch Normalization.
Published at WPI Thesis Link
Aug 2018
Step-Up GAN
Data curriculum for GANs. We designed a curriculum that allows GANs to converge faster than WGAN and with less bias than Batch Normalization.
May 2018
Industry Projects
Image Moderation at Scale
Replaced a slow and costly human moderation system with deep-learning-based solutions.
June 2019
Attention GAN for Super-Resolution
Paper: IEEE BigData (Dec 2018)
Blog: Medium Blog
Using attentional generative adversarial networks to upscale images to 4K.
Jan 2019
Multilingual NLU Systems Development and Deployment
Accepted at IEEE BigData 2020 [Video]
Multilingual user-intent understanding from text content. We classify text, perform named entity recognition (NER), and employ learning-to-rank frameworks to meet business needs.
Jan 2020