Program

May 18, 2020. All times are displayed in the EDT time zone.

The full program, including talk abstracts and poster titles, is available as a clickable PDF file.

12:00pm-12:15pm: Opening Remarks

12:15pm-12:45pm: Keynote talk by Ivana Ljubic

Title: Casting Light on the Hidden Bilevel Combinatorial Structure of the Capacitated Vertex Separator Problem

Abstract: Given an undirected graph, we study the capacitated vertex separator problem, which asks for a subset of vertices of minimum cardinality whose removal induces a graph with a bounded number of pairwise disconnected shores (subsets of vertices), each of limited cardinality. The problem is of great importance in the analysis and protection of communication/social networks against possible viral attacks, and in matrix decomposition algorithms.
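For orientation, the problem admits a compact integer programming sketch (an illustrative formulation only, not necessarily the one used in the talk; here $k$ denotes the assumed bound on the number of shores and $b$ the assumed bound on each shore's cardinality). Assignment variables $x_{vi} \in \{0,1\}$ place vertex $v$ in shore $i$; vertices assigned to no shore form the separator:

$$
\begin{aligned}
\min \quad & |V| - \sum_{v \in V} \sum_{i=1}^{k} x_{vi} \\
\text{s.t.} \quad & \sum_{i=1}^{k} x_{vi} \le 1 && \forall v \in V && \text{(each vertex joins at most one shore)} \\
& \sum_{v \in V} x_{vi} \le b && \forall i = 1,\dots,k && \text{(shore capacity)} \\
& x_{ui} + x_{vj} \le 1 && \forall \{u,v\} \in E,\ i \neq j && \text{(no edge between distinct shores)} \\
& x_{vi} \in \{0,1\} && \forall v \in V,\ i = 1,\dots,k.
\end{aligned}
$$

The objective counts the unassigned vertices, i.e., the size of the separator.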

We provide a new bilevel interpretation of the problem, and model it as a two-player Stackelberg game, in which the leader interdicts the vertices (i.e., decides on the subset of vertices to remove), and the follower solves a combinatorial optimization problem on the resulting graph. This approach allows us to develop a computational framework based on an integer programming formulation in the natural space of the variables. Thanks to this bilevel interpretation, we derive three different families of strengthening inequalities and show that they can be separated in polynomial time. We also show how to extend these results to a min-max version of the problem.

Our extensive computational study, conducted on available benchmark instances from the literature, reveals that our new exact method is competitive with state-of-the-art algorithms and is able to improve the best-known results for several difficult classes of instances.

This is joint work with F. Furini, E. Malaguti, and P. Paronuzzi.

1:00pm-2:45pm: Poster Presentations

There will be 5 Zoom meeting rooms for the parallel talk streams:

  • (S1) APPLICATIONS: Zoom ID 910 0415 2796
  • (S2) MACHINE LEARNING: Zoom ID 992 8999 6850
  • (S3) COMBINATORIAL OPTIMIZATION: Zoom ID 920 9689 8644
  • (S4) STOCHASTIC OPTIMIZATION: Zoom ID 967 4902 1631
  • (S5) THEORY OF MIP: Zoom ID 943 6589 6559

Schedule:

  • 1:00pm-1:15pm: (S1) Carolina Riascos; (S2) Yunhao Tang; (S3) Kartik Kulkarni; (S4) Rui Chen; (S5) Ryan Cory-Wright;
  • 1:15pm-1:30pm: (S1) Jamie Fravel; (S2) Yongchun Li; (S3) Christopher Muir; (S4) Joshua Gunter; (S5) Hongyi Jiang;
  • 1:30pm-1:45pm: (S1) Hamidreza Validi; (S2) Michael Li; (S3) Gabriele Dragotto; (S4) Shixuan Zhang; (S5) Silvia Di Gregorio;
  • 1:45pm-2:00pm: (S1) Mario Arrieta-Prieto; (S2) Sina Aghaei; (S3) Xuan Zhang; (S4) Margarita Castro; (S5) Yatharth Dubey;
  • 2:00pm-2:15pm: (S1) Joshua Margolis; (S2) Jongeun Kim; (S3) Qimeng Yu; (S4) Suresh Bolusani; (S5) Shengding Sun;
  • 2:15pm-2:30pm: (S1) Akhilesh Soni; (S2) Aaron Ferber; (S3) Hassan Mortagy; (S4) Petros Papadopoulos; (S5) Endric Daues;
  • 2:30pm-2:45pm: (S1) Yijiang Li; (S2) Haoyue Wang.

3:00pm-3:30pm: Keynote talk by Sebastian Pokutta

Title: Beyond Worst-case Rates: Data-dependent Rates in Learning and Optimization

Abstract: Worst-case complexity bounds are increasingly insufficient to explain the (often superior) real-world performance of optimization and learning algorithms. We will consider data-dependent rates, approximation guarantees, and complexity bounds that provide guarantees much more in line with actual performance. We illustrate this effect through three examples in which data dependency yields finer complexity bounds.

3:30pm-3:45pm: Poster Prize and Closing Remarks

3:45pm-∞: Virtual social gathering on the Slack workspace