I am currently a Stefan E. Warschawski Visiting Assistant Professor in the Department of Mathematics at UC San Diego. My research interests lie at the intersection of nonconvex optimization, statistical signal processing, dynamical systems and control, and decentralized optimization. I have broadly worked on the following research themes:
Online learning via proximal point algorithms for centralized and decentralized time-varying optimization
Decentralized optimization under time-varying adversarial network attacks
Analysis of gradient descent trajectories around saddle points of Morse functions, with sharp exit-time estimates for trajectories leaving such regions
Improved curvature-based perturbation of gradient descent for nonconvex Morse functions, yielding faster saddle escape and second-order convergence
Dynamics of a class of Nesterov-type accelerated gradient methods on smooth nonconvex functions over different state spaces; almost-sure saddle escape under momentum; trade-offs between saddle escape and convergence to local minima for accelerated methods
Volume estimates of the data-forging sets arising in machine unlearning algorithms
Rutgers University
Ph.D., Electrical and Computer Engineering, Oct. 2024
Thesis: Gradient Based Methods for Nonconvex Optimization: A Dynamical Systems Viewpoint
Advisors: Prof. Waheed Bajwa, Prof. Mert Gurbuzbalaban
Rutgers University
M.S., Mathematics, Jan. 2023
Advisor: Prof. Yanyan Li
Indian Institute of Technology (IIT), Kanpur
B.Tech., Electrical Engineering, June 2015
University of California, San Diego
S.E.W. Visiting Assistant Professor, July 2024-June 2027
Department of Mathematics
Mentor: Prof. Rayan Saab
Department of Electrical Engineering, IIT Kanpur
Senior Project Associate, Feb 2017-Aug 2018
Supervisor: Prof. Ketan Rajawat
EXL Analytics, Gurugram, India
Consultant II - Product Development (Data Science), Aug 2015-Jan 2017
MATH 20D : Intro to Differential Equations, Fall 2024
MATH 170B : Intro to Numerical Analysis - nonlinear equations and approximations, Winter & Spring 2025, 2026
MATH 170C : Intro to Numerical Analysis - ordinary differential equations, Spring 2025
MATH 180A : Intro to Probability, Fall 2025
You can also visit my Google Scholar page for a complete list.
Accelerated gradient methods for nonconvex optimization: Escape trajectories from strict saddle points and convergence to local minima, R. Dixit, M. Gurbuzbalaban, W.U. Bajwa, accepted in Foundations of Computational Mathematics (FoCM), 2025, 122 pages.
Boundary conditions for linear exit time gradient trajectories around saddle points: Analysis and algorithm, R. Dixit, M. Gurbuzbalaban, W.U. Bajwa, IEEE Transactions on Information Theory 2023, pp. 2556-2602.
Exit time analysis for approximations of gradient descent trajectories around saddle points, R. Dixit, M. Gurbuzbalaban, W.U. Bajwa, Information and Inference: A Journal of the IMA, 2023, pp. 714-786.
Online learning over dynamic graphs via distributed proximal gradient algorithm, R. Dixit, A.S. Bedi, K. Rajawat, IEEE Transactions on Automatic Control 2020, pp. 5065-5079.
Online learning with inexact proximal online gradient descent algorithms, R. Dixit, A.S. Bedi, R. Tripathi, K. Rajawat, IEEE Transactions on Signal Processing 2019, pp. 1338-1352.
The Measure of Deception: An Analysis of Data Forging in Machine Unlearning, R. Dixit, Y. Hui, R. Saab, preprint 2025, 50 pages.
RESIST: Resilient Decentralized Learning Using Consensus Gradient Descent, C. Fang*, R. Dixit*, W.U. Bajwa, M. Gurbuzbalaban, preprint 2025 ("*" denotes equal contribution), 100 pages.