B. Bensaid. Analysis and development of new optimizers in Machine Learning. Available on HAL.
B. Bensaid, G. Poëtte, R. Turpault. Numerical splitting schemes as the cornerstone for mini-batch optimization, Part II. To be submitted.
B. Bensaid, G. Poëtte, R. Turpault. Numerical splitting schemes as the cornerstone for mini-batch optimization, Part I. Submitted to Optimization Letters (March 2025).
B. Bensaid. Complexities of Armijo-like algorithms in Deep Learning context. Submitted to Journal of Optimization Theory and Applications (under review since November 2024, on arXiv).
B. Bensaid, G. Poëtte, R. Turpault. Convergence of the Iterates for Momentum and RMSProp for Local Smooth Functions: Adaptation is the Key. Submitted to Optimization Methods and Software (under review since September 2024, on arXiv).
B. Bensaid, G. Poëtte, R. Turpault. An Abstract Lyapunov Control Optimizer: Local Stabilization and Global Convergence. Submitted to Optimization Methods and Software (under review since December 2024, on arXiv).
B. Bensaid, G. Poëtte, R. Turpault. Lyapunov Deterministic Optimization for Neural Networks Training. Submitted to Journal of Scientific Computing.
B. Bensaid, G. Poëtte, R. Turpault. Deterministic neural networks optimisation: from a continuous and energy point of view. Published in Journal of Scientific Computing (2023).
Fully connected neural networks implemented from scratch in C++, with deep learning optimizers and control of numerical errors via the Shaman library developed by Nestor Demeure: https://github.com/bbensaid30/COptimizers.git
Custom optimizers in TensorFlow: https://github.com/bbensaid30/TOptimizers.git
Real datasets with preprocessing: https://github.com/bbensaid30/ML_data.git