Bio:
I received my doctoral degree in Mathematical Optimization from Kyoto University, where I had the privilege of studying under Prof. Nobuo Yamashita and worked on exploiting second-order information for large-scale optimization problems. During my PhD, I also served as a research assistant under Prof. Makoto Yamada at Kyoto University (now at OIST), where I worked on randomized Hessian approximation for high-dimensional feature spaces; one of my core collaborators was Prof. Dinesh Singh. Prior to my doctoral studies, I completed my master's degree under the guidance of Prof. Narasimha Kumar at the Indian Institute of Technology, Hyderabad (IIT-H).
Most recently, I worked for a short term as a research associate at the Indian Institute of Technology, Mandi, where I studied optimization methods for deep learning in federated settings under Prof. Dinesh Singh.
Email: hardiktankaria1406@gmail.com
GitHub portfolio: https://hardy-opt.github.io/
PhD graduate in Machine Learning and Optimization with 6+ years of research experience at Kyoto University.
Specialized in scalable optimization algorithms, predictive analytics, and large-scale, high-dimensional data.
Strong record of publications and hands-on expertise in improving computational efficiency. Proven leadership
through international presentations and cross-cultural team management. Seeking a data scientist or ML
engineer position focused on credit scoring and consumer risk assessment, where I can apply advanced
data science and ML techniques to solve real-world problems.
Research Interests: Convex & Nonconvex Optimization, Credit Modeling, Financial Data Analysis, Large-Scale Optimization, Optimization for Machine Learning and Science, and Randomized Algorithms
Credit Card Default Prediction Model
Credit Score Prediction for Loans
Stochastic Variance Reduced Gradient with Barzilai-Borwein Method
Scalable Optimizer for Empirical Risk Minimization
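The SVRG-BB project above combines variance-reduced stochastic gradients with a Barzilai-Borwein (BB) step size. The sketch below illustrates that general technique only; the function names and the least-squares test problem are my own illustrative choices, not taken from the listed work:

```python
import numpy as np

def svrg_bb(grad_full, grad_i, w0, n, m=50, eta0=0.01, outer=20):
    """SVRG with a Barzilai-Borwein step size: the full gradient at each
    snapshot anchors a variance-reduced inner loop, and the BB ratio sets
    the step size automatically after the first outer iteration."""
    rng = np.random.default_rng(0)
    w = w0.copy()
    w_prev = g_prev = None
    eta = eta0
    for _ in range(outer):
        g = grad_full(w)                       # full gradient at snapshot
        if w_prev is not None:
            dw, dg = w - w_prev, g - g_prev
            eta = (dw @ dw) / (m * abs(dw @ dg) + 1e-12)  # BB step size
        w_prev, g_prev = w.copy(), g.copy()
        wt = w.copy()
        for _ in range(m):                     # variance-reduced SGD steps
            i = rng.integers(n)
            wt -= eta * (grad_i(wt, i) - grad_i(w, i) + g)
        w = wt
    return w

# Illustrative noise-free least squares: minimize (1/2n) * ||A w - b||^2
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
grad_full = lambda w: A.T @ (A @ w - b) / len(b)
grad_i = lambda w, i: A[i] * (A[i] @ w - b[i])
w = svrg_bb(grad_full, grad_i, np.zeros(5), n=len(b))
```

The appeal of the BB step is that no step-size schedule needs to be tuned: the ratio of displacement to gradient change estimates the local curvature scale on its own.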
H. Tankaria, S. Sugimoto, N. Yamashita,
Computational Optimization and Applications 82, 61-88 (2022). DOI: 10.1007/s10589-022-00351-5
H. Tankaria and N. Yamashita,
Journal of Industrial and Management Optimization 2024, 20(2): 525-547. DOI: 10.3934/jimo.2023089
Hardik Tankaria, Dinesh Singh and Makoto Yamada,
Nys-Curve: Nyström-Approximated Curvature for Stochastic Optimization.
arXiv preprint: https://arxiv.org/abs/2110.08577
Hardik Tankaria,
Regularized Nyström Method for Large-Scale Unconstrained Non-Convex Optimization.
Preprint: https://www.researchgate.net/publication/380000293
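The Nys-* methods listed above all build on the Nyström approximation, which reconstructs a positive semidefinite matrix from a sampled subset of its columns. A small self-contained sketch of that building block (the example matrix and sizes are illustrative only, not from the papers):

```python
import numpy as np

# Nyström approximation of a PSD matrix: A ≈ C @ pinv(W) @ C.T,
# where C holds k sampled columns of A and W is the k x k block of A
# at the intersection of those rows and columns.
rng = np.random.default_rng(0)
n, k = 100, 20
X = rng.standard_normal((n, 15))
A = X @ X.T                          # a rank-15 PSD Gram matrix
idx = rng.choice(n, size=k, replace=False)
C = A[:, idx]                        # n x k sampled columns
W = A[np.ix_(idx, idx)]              # k x k intersection block
A_nys = C @ np.linalg.pinv(W) @ C.T  # rank-at-most-k reconstruction
err = np.linalg.norm(A - A_nys) / np.linalg.norm(A)
```

Because A here has rank 15 and we sample k = 20 columns, the reconstruction is essentially exact; for full-rank matrices the same recipe gives a cheap low-rank surrogate, which is what makes Nyström-approximated curvature attractive for large-scale second-order optimization.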
Kioxia Corporation: one-year research project on regularized L-BFGS for large-scale unconstrained optimization with a dimensionality reduction technique.
Secured funding of 1 million Japanese yen.
Completed a one-year project on a stochastic regularized L-BFGS method with a dimensionality reduction technique. I developed an algorithm that reduced computational complexity by a factor of d while maintaining accuracy, improving performance on large-scale datasets.
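The L-BFGS variants above (both this project and the journal papers) have the standard two-loop recursion at their core: it applies the limited-memory inverse-Hessian approximation to a vector in O(md) time without ever forming a d x d matrix. A generic sketch of that recursion, not the regularized or dimension-reduced variant developed in the project:

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns H_k @ g, where H_k is
    the inverse-Hessian approximation implied by the stored curvature
    pairs (s_j, y_j) with s_j = x_{j+1} - x_j, y_j = grad_{j+1} - grad_j."""
    q = g.copy()
    alphas = []                          # stored newest-pair-first
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_list:                           # initial scaling gamma_k = s.y / y.y
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for s, y, a in zip(s_list, y_list, reversed(alphas)):
        rho = 1.0 / (y @ s)
        b = rho * (y @ q)
        q += (a - b) * s
    return q
```

A quick sanity check: with a single stored pair, the implied H satisfies the secant condition H @ y = s, so `lbfgs_direction(y, [s], [y])` returns `s` exactly.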
Hardik Tankaria and Nobuo Yamashita,
Reducing Variance of Stochastic Gradient using Barzilai-Borwein method as second-order information
20th Joint Research Meeting of the Japan Society for Industrial and Applied Mathematics (JSIAM), Nagaoka Institute of Technology, Niigata, Japan, February 2024
Hardik Tankaria and Nobuo Yamashita,
A Stochastic Variance Reduced Gradient using Second Order Information
10th International Congress on Industrial and Applied Mathematics (ICIAM), Waseda University, Tokyo, Japan, August 2023.
Hardik Tankaria, Dinesh Singh, and Makoto Yamada,
Nys-Newton: Nyström-Approximated Curvature for Convex Optimization
Information-Based Induction Sciences and Machine Learning (IBISML), Kyoto, Japan, December 2022.
Hardik Tankaria, Nobuo Yamashita, Dinesh Singh, and Makoto Yamada,
Nys-LMN: Nyström Levenberg-Marquardt-type Newton Method
International Workshop on Continuous Optimization (online), Japan, December 2022.
Hardik Tankaria, Dinesh Singh, and Makoto Yamada,
Nys-Transfer: Nyström-Approximated Newton-Sketch for Fine-Tuning Deep Nets for Brain MRI
International Symposium on Artificial Intelligence and Brain Science 2022, Okinawa Institute of Science and Technology (OIST), Okinawa, Japan, July 2022.
Hardik Tankaria and Nobuo Yamashita,
Accelerated Stochastic Variance Reduced Method with Second-Order Information.
14th ICT Innovation at Kyoto University, Kyoto, Japan, February 2020.
Hardik Tankaria and Nobuo Yamashita,
Non-monotone Regularized Limited-Memory BFGS Method for Large-Scale Unconstrained Optimization.
6th International Conference on Continuous Optimization (ICCOPT), Berlin, Germany, August 2019.
MCM Scholarship for one year during my master's at the Indian Institute of Technology, Hyderabad (2015).
JICA Scholarship through the IITH-JICA Friendship program for my PhD.
Travel grant for a summer school and an international conference in Germany (2019) from the Department of Applied Mathematics and Physics, Kyoto University.
Cleared the GSET examination for Assistant Professor positions at government universities in Gujarat, India (2017). [Criteria: top 6%]
Cleared the Joint Admission Test for M.Sc. (JAM) for admission to the Indian Institutes of Technology, with All India Rank 343 (2014). [Criteria: top 7%]
Cleared the All India Engineering Entrance Examination (2011).
The Man Who Knew Infinity - Matthew Brown
The Imitation Game - Morten Tyldum
A Beautiful Mind - Ron Howard
Hidden Figures - Theodore Melfi