The Program
Program Outline
Sunday, September 10
18:00-20:00 Welcome reception at the Alexander Hotel's restaurant on the 7th floor
Monday, September 11
09:30-10:00 Aharon Ben-Tal
An algorithm for maximizing a convex function based on its minimum
10:00-10:30 Yair Carmon
DoG is SGD’s best friend: toward tuning-free stochastic optimization
10:30-11:10 Coffee break
11:10-11:40 Lieven Vandenberghe
Bregman proximal methods for conic optimization
11:40-12:10 Russell Luke
Quadratically convergent, balanced superficial sampling for minimizing max functions
12:10-13:40 Lunch
13:40-14:10 Yair Censor
Superiorization: The asymmetric roles of feasibility-seeking and objective function reduction
14:10-14:40 Xiaoming Yuan
The balanced augmented Lagrangian method for convex programming
14:40-15:10 Coffee break
15:10-15:40 Mustafa Celebi Pinar
On sparsity regularized and sparsity constrained problems
15:40-16:10 Xin Liu
Constraint dissolving approaches for a class of Riemannian optimization problems
18:00-21:00 Conference dinner celebrating Marc's birthday at Beit Kandinof; transportation will depart directly from the conference site (expected departure at 16:45)
Tuesday, September 12
09:30-10:00 Yonina Eldar
Model based deep learning: applications to imaging and communications
10:00-10:30 Michael P. Friedlander
Polar deconvolution of mixed signals
10:30-11:00 Coffee break
11:00-11:30 Luis Nunes Vicente
Sample sizing for function estimation in stochastic derivative-free optimization
11:30-12:00 Haim Avron
Solving trust region subproblems using Riemannian optimization
12:00-12:30 Kfir Yehuda Levy
Making stochastic optimization feel noiseless: stable optimization via a double momentum mechanism
12:30-14:00 Lunch
15:00 Conference tour in Jaffa (Yafo), including a meal at a Middle Eastern restaurant
Wednesday, September 13
09:30-10:00 Alfredo Noel Iusem
A finitely convergent circumcenter method for the convex feasibility problem
10:00-10:30 Defeng Sun
An efficient HPR algorithm for the Wasserstein barycenter problem with $O(\mathrm{Dim}(P)/\varepsilon)$ computational complexity
10:30-11:00 Coffee break
11:00-11:30 Jalal Fadili
A stochastic Bregman primal-dual splitting algorithm for composite optimization
11:30-12:00 Radu Ioan Bot
Fast continuous and discrete time approaches for monotone equations
12:00-12:30 Tomer Koren
Exploring the generalization ability of convex optimization algorithms
12:30-13:30 Lunch
13:30-14:00 Simeon Reich
Comparing the methods of alternating and simultaneous projections for two subspaces
14:00-14:30 Edouard Pauwels
On sequential convergence of Bregman type methods
14:30-15:00 Dan Garber
Efficiency of gradient methods for low-rank recovery under strict complementarity
15:00-15:30 Coffee break
15:30-16:00 Shimrit Shtern
First-order methods for structured bi-level optimization
16:00-16:30 Aaron Melman
Polynomial eigenvalue inclusion regions from scalar polynomial properties