Plenary Speakers
ARKADI NEMIROVSKI
John Hunter Chair and Professor
H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology
Email: arkadi.nemirovski@isye.gatech.edu
Biography
Arkadi Nemirovski received his M.Sc. (1970) and Ph.D. (1974) degrees in Mathematics from Moscow State University. Since 2005, he has been a Professor at the H. Milton Stewart School of Industrial and Systems Engineering of the Georgia Institute of Technology, Atlanta, Georgia, USA; from 1973 to 1993 he held positions at research institutes in Moscow, Russia, and from 1993 to 2005 he was a Professor at the Faculty of Industrial Engineering and Management, Technion, Israel.
Professor Nemirovski's research interests include convex optimization and its applications, primarily in Statistics and Engineering. He has been working on various aspects of continuous optimization: complexity, numerical methods, stochastic optimization, and non-parametric statistics. He is an author of five monographs and over 150 journal papers.
For his contributions to the field of optimization and related areas, Dr. Nemirovski was awarded the 1982 Fulkerson Prize (with L. Khachiyan and D. Yudin) of the Mathematical Programming Society and the AMS, the 1991 Dantzig Prize (with M. Grötschel) of the MPS and SIAM, the 2003 John von Neumann Theory Prize (with M. Todd) of INFORMS, and the 2019 Norbert Wiener Prize in Applied Mathematics (with M. Berger) of the AMS and SIAM. He is a member of the National Academy of Engineering (since 2017), the American Academy of Arts and Sciences (since 2018), and the National Academy of Sciences (since 2020).
Talk
Title: Tight Semidefinite Relaxation and Applications
Abstract: The talk focuses on tight semidefinite relaxation bounds on the maxima of quadratic forms over centrally symmetric convex solids, a line of research going back to A. Nemirovski, C. Roos, T. Terlaky, "On maximization of quadratic form over intersection of ellipsoids with common center", Mathematical Programming 86 (1999). We outline essential recent progress on the subject and applications of the resulting machinery to (a) near-optimal recovery of signals from indirect observations affected by Gaussian noise, (b) tight approximations of NP-hard affinely adjustable robust counterparts of uncertain linear programs and robust counterparts of problems with uncertain Least Squares constraints, and (c) synthesis of linear controllers for discrete-time linear dynamical systems.
The talk is based on joint research with Prof. Anatoli Juditsky, Université Grenoble Alpes, France.
YURII NESTEROV
Professor
Center for Operations Research and Econometrics (CORE) and
Engineering Department (INMA)
Université Catholique de Louvain, Belgium
Email: Yurii.Nesterov@uclouvain.be
Biography
Professor Nesterov was born in 1956 in Moscow. He received his master's degree from Moscow State University in 1977 and his doctoral degree in 1984. He is a Professor at the Center for Operations Research and Econometrics, UCLouvain, Belgium, and the author of five monographs and more than 120 refereed papers in leading optimization journals. His international recognition includes the Dantzig Prize (2000), the John von Neumann Theory Prize (2009), the Charles Broyden Prize (2010), the Francqui Chair (Liège University, 2011-2012), the SIAM Outstanding Paper Award (2014), and the EURO Gold Medal (2016).
His main research direction is the development of efficient numerical methods for convex and nonconvex optimization problems, supported by global complexity analysis: general interior-point methods (theory of self-concordant functions), fast gradient methods (smoothing technique), global complexity analysis of second-order and tensor schemes (cubic regularization of Newton's method), and accelerated proximal-point methods.
Talk
Title: Acceleration abilities of path-following schemes
Abstract: In this talk we show that standard short-step interior-point path-following methods do have acceleration abilities, which make them provably faster than the damped Newton scheme.
This is joint work with Pavel Dvurechensky.
KEES ROOS
Professor Emeritus
Delft University of Technology, The Netherlands
Biography
Kees Roos (1941) held a chair on Optimization Technology at Delft University of Technology until his retirement in 2006. From 1998 to 2002 he was also a part-time professor at Leiden University. Over the past decades his research has concentrated on interior-point methods for linear and convex optimization. He is a (co-)author of several books and more than a hundred papers in refereed journals. He is (or was) a member of the editorial boards of several journals, among them the SIAM Journal on Optimization, and was secretary/treasurer of the SIAM Activity Group on Optimization. He supervised a large number of research projects, among them the Dutch nationwide NWO project High Performance Methods for Mathematical Optimization and the project Discrete Mathematics and Optimization of the (Dutch) Stieltjes Institute. In 2011 he shared the Khachiyan Prize of the INFORMS Optimization Society with Jean-Philippe Vial (University of Geneva). He was involved in the project "Optimal safety of the dikes in the Netherlands", which received the 2013 Franz Edelman Award of INFORMS.
Talk
Title: A new road to polynomial methods for Linear Optimization
Abstract: We introduce a new variant of Chubanov's method for solving homogeneous linear systems with positive variables. In the Basic Procedure we use a recently introduced cut in combination with Nemirovski's Mirror-Prox method. We show that generating the new cut requires at most O(n^3) time, just as Chubanov's cut; despite this, the new cut is always at least as sharp as Chubanov's, as we also show. Our Modified Main Algorithm is in essence the same as Chubanov's Main Algorithm, except that it uses the new Basic Procedure as a subroutine. The new method has O(n^{4.5} L) time complexity. We show that a simplified version of the new Basic Procedure competes well with the Smooth Perceptron Scheme of Peña and Soheili and, when combined with rescaling, also with two commercial codes for linear optimization, namely Gurobi and Mosek.
TAMÁS TERLAKY
George N. and Soteria Kledaras '87 Endowed Chair and Professor
Department of Industrial and Systems Engineering
Lehigh University
Email: terlaky@lehigh.edu
Web site: http://www.lehigh.edu/~tat208
Biography
Dr. Terlaky is the George N. and Soteria Kledaras '87 Endowed Chair Professor in the Department of Industrial and Systems Engineering, Lehigh University, and Director of the Quantum Computing Optimization Laboratory. He was Chair of the Department of Industrial and Systems Engineering from 2008 to 2017, and Founding Director of the School of Computational Engineering and Science (2004-2008) at McMaster University, Canada.
Dr. Terlaky has published four books, edited over ten books and journal special issues and published over 200 research papers. Topics include theoretical and algorithmic foundations of operations research (e.g., invention of the criss-cross method), design and analysis of large classes of interior point methods, computational optimization, worst case examples of the central path, nuclear reactor core reloading optimization, oil refinery and VLSI design optimization, robust radiation therapy treatment optimization, and inmate assignment optimization.
His research interests include high-performance optimization methods, optimization modeling, optimization problems in engineering sciences and service systems, and quantum computing optimization.
Dr. Terlaky is Founding Editor-in-Chief of the journal Optimization and Engineering. He has served as associate editor of ten journals and has served as conference chair, conference organizer, and distinguished invited speaker at conferences all over the world. He was General Chair of the INFORMS 2015 Annual Meeting, a former Chair of the INFORMS Optimization Society, Chair of the ICCOPT Steering Committee of the Mathematical Optimization Society, and Chair of the SIAM Activity Group on Optimization. He received the MITACS Mentorship Award, the Award of Merit of CORS, the Egerváry Award of HORS, and the OISS Award of IISE. He is a Fellow of INFORMS, SIAM, and The Fields Institute, and an elected Fellow of the Canadian Academy of Engineering.
Talk
Title: Quantum Interior Point Methods for Linear Optimization
Abstract: Exploring the opportunities offered by quantum computing to speed up the solution of hard optimization problems is a hot research area. With the goal of speeding up Interior Point Methods (IPMs) for Linear Optimization (LO), Quantum Linear System Algorithms (QLSAs) are applied to solve the Newton systems arising in IPMs. Since QLSAs inherently produce inexact solutions, and inexactness leads to infeasibility, we need to use inexact infeasible variants of quantum IPMs (II-QIPMs). First we analyze the complexity of II-QIPMs. Then we introduce a novel "quantum-inspired" Inexact-Feasible QIPM (IF-QIPM), based on a novel form of the Newton system, that produces inexact but feasible steps. We show that this method, like the best exact classical IPMs, enjoys an O(sqrt{n} L) iteration complexity. After analyzing the total time complexity of the proposed method, we also discuss how QLSAs can be used to solve the novel system efficiently in an Iterative Refinement scheme. The IF-QIPM is implemented with both classical and quantum solvers to investigate its efficiency empirically.
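The iterative refinement idea mentioned at the end of the abstract is classical and can be illustrated outside the quantum setting. The sketch below is a generic toy model, not the IF-QIPM of the talk: `inexact_solve` stands in for any low-precision linear solver (such as a QLSA followed by measurement), modeled here simply as the exact solution scaled by (1 - eta), i.e., a solver with relative error eta. Refinement repeatedly solves for a correction against the current residual, shrinking the error geometrically even though each individual solve is crude.

```python
import numpy as np

def inexact_solve(A, b, eta=0.1):
    """Toy low-precision solver: exact solution contaminated with
    relative error eta (a stand-in for an inexact QLSA output)."""
    return (1.0 - eta) * np.linalg.solve(A, b)

def iterative_refinement(A, b, solve, iters=10):
    """Classical iterative refinement: correct the current iterate
    using an inexact solve of the residual system A d = b - A x."""
    x = np.zeros_like(b)
    for _ in range(iters):
        r = b - A @ x       # residual of the current iterate
        x = x + solve(A, r) # inexact correction step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)) + 5.0 * np.eye(5)  # well-conditioned system
b = rng.standard_normal(5)

x_raw = inexact_solve(A, b)                 # one crude solve
x_ref = iterative_refinement(A, b, inexact_solve)
print(np.linalg.norm(A @ x_raw - b))        # residual limited by solver precision
print(np.linalg.norm(A @ x_ref - b))        # refined residual, orders of magnitude smaller
```

With a solver of relative error eta, each refinement pass multiplies the error norm by roughly eta, which is why a modest number of cheap low-precision solves can deliver a high-precision solution.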
STEPHEN WRIGHT
George B. Dantzig Professor, Sheldon Lubar Chair, and the Amar and Balinder Sohi Professor of Computer Science
Computer Science Department
University of Wisconsin-Madison
Email: swright@cs.wisc.edu
Web site: http://pages.cs.wisc.edu/~swright/
Biography
Stephen J. Wright holds the George B. Dantzig Professorship, the Sheldon Lubar Chair, and the Amar and Balinder Sohi Professorship of Computer Sciences at the University of Wisconsin-Madison. His research is in computational optimization and its applications to data science and many other areas of science and engineering. Prior to joining UW-Madison in 2001, Wright held positions at North Carolina State University (1986-90) and Argonne National Laboratory (1990-2001). He has served as Chair of the Mathematical Optimization Society (2007-2010) and as a Trustee of SIAM for the maximum three terms (2005-2014). He is a Fellow of SIAM. In 2014, he won the W.R.G. Baker Award from IEEE for best paper in an IEEE archival publication during the three years 2009-2011. He was awarded the Khachiyan Prize by the INFORMS Optimization Society in 2020 for lifetime achievements in optimization, and received the NeurIPS Test of Time Award in 2020 for a paper published in that conference in 2011.
Prof. Wright is the author or coauthor of widely used textbooks and reference books in optimization, including "Primal-Dual Interior-Point Methods" and "Numerical Optimization". He has published widely on optimization theory, algorithms, software, and applications.
Prof. Wright served from 2014 to 2019 as Editor-in-Chief of the SIAM Journal on Optimization and previously served as Editor-in-Chief of Mathematical Programming Series B. He has also served as Associate Editor of Mathematical Programming Series A, SIAM Review, SIAM Journal on Scientific Computing, and several other journals and book series.
Talk
Title: The Role of Complexity Bounds in Optimization
Abstract: Complexity analysis in optimization seeks upper bounds on the amount of work required by a given algorithm to find approximate solutions of problems in a given class, as well as lower bounds, usually in the form of worst-case examples from a problem class that bound from below the work required by a particular class of algorithms. The relationship between theoretical complexity bounds and the practical performance of algorithms on "typical" problems varies widely across problem and algorithm classes, and relative interest among researchers in these two aspects of algorithm design and analysis has waxed and waned over the years. This talk surveys complexity analysis and its relationship to practice in optimization, with an emphasis on linear programming and convex and nonconvex nonlinear optimization, providing historical (and cultural) perspectives on research in these areas.