Workshop on Seeking Feasibility in Combinatorial Problems

This workshop is about General Principles in Seeking Feasible Solutions for Combinatorial Problems.

Seeking feasible solutions quickly is a common theme in combinatorial problems of many types: mixed-integer linear and nonlinear programming, constraint programming, constraint satisfaction, etc. While there is a wealth of information on specific algorithms and techniques for specific problems, there has been less emphasis on general principles that cross the boundaries between problem types. For example, it has recently been observed that (i) in mixed-integer linear programming, constraint feasibility is always maintained and branching is required to achieve integrality; (ii) constraint programming takes the opposite approach: integrality is always maintained and branching is required to achieve constraint feasibility; and (iii) feasibility pumping techniques alternate between satisfying the constraints and satisfying the integrality conditions. The goal of the workshop is to explore these and other general principles of feasibility seeking in combinatorial problems, with the hope that this will lead to insights resulting in even more powerful methods.

This workshop is part of the CPAIOR 2013 Conference, taking place in Yorktown Heights, NY, USA May 18-22, 2013.


Organizers:

John Chinneck, Carleton University
Louis-Martin Rousseau, Polytechnique Montreal
Gilles Pesant, Polytechnique Montreal


Schedule:

8:30 - 8:35 Welcome and Opening Remarks (John Chinneck and Gilles Pesant)
8:35 - 9:15 Gilles Pesant: Branching to Force Constraint Satisfaction in CP
9:15 - 10:00 John Chinneck: Branching to Force Variable Value Propagation in MILP

10:00 - 10:30 Coffee Break

10:30 - 11:15 Domenico Salvagnin: Principles of Feasibility Pumping
11:15 - 12:00 Ted Ralphs: The Complexity of Search: What We Can Learn From Games

12:00 - 1:30 Lunch

1:30 - 2:15 John Hooker: A Primal/Dual Framework for Combinatorial Problem Solving
2:15 - 3:00 Rick Wallace: Branching Rules in Three Guises: Information, Actions, and Performance Quality

3:00 - 3:30 Coffee Break

3:30 - 4:15 Ashish Sabharwal: Branching Strategies and Restarts in SAT Solvers
4:15 - 5:00 Panel and Open Discussion

Speakers and Abstracts:

John Chinneck: Branching to Force Variable Value Propagation in MILP

When trying to find an integer-feasible solution quickly in a mixed-integer linear program, intuition says that branching to maximize the probability of reaching a feasible solution is best. This is wrong: the most effective strategies branch in the direction that has the least probability of reaching feasibility! This principle of branching to force change is effective because it forces variable value propagation in the greatest number of integer variables, which quickly forces the LP solution into the integer "corners" of the polytope. Relevant papers: Pryor and Chinneck; Patel and Chinneck; workshop presentation.
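The contrast between the two intuitions can be sketched as a toy branching rule. This is an illustrative stand-in, not the actual heuristics from the cited papers: the variable-selection step and all names are invented for the example, and "away from the nearest integer" is used here as a simple proxy for the less-likely branching direction.

```python
import math

def branch_to_force_change(lp_values, int_vars):
    """Toy branching-direction rule illustrating 'branch to force change'.

    Conventional intuition rounds a fractional variable toward its nearest
    integer (the seemingly 'most likely feasible' direction). The
    force-change principle does the opposite: branch first in the less
    likely direction, away from the nearest integer, to trigger change in
    as many integer variables as possible.
    """
    # Candidate variables whose LP value is fractional.
    fractional = [(v, lp_values[v]) for v in int_vars
                  if abs(lp_values[v] - round(lp_values[v])) > 1e-6]
    if not fractional:
        return None  # LP solution is already integer-feasible
    # Placeholder selection rule: the variable closest to integrality.
    var, val = min(fractional, key=lambda p: abs(p[1] - round(p[1])))
    # Branch away from the nearest integer first.
    if val - math.floor(val) < 0.5:
        return (var, "up", math.ceil(val))    # nearest is floor, so go up
    else:
        return (var, "down", math.floor(val)) # nearest is ceil, so go down
```

For example, with LP values x = 2.1 and y = 3.5, the rule selects x (closest to an integer) and branches up to x >= 3 rather than down to the "natural" x <= 2.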

John Hooker: A Primal/Dual Framework for Combinatorial Problem Solving

Most successful combinatorial solution methods use a primal/dual strategy. Although the primal problem is to find feasible solutions and the dual problem is to prove infeasibility (or optimality), strategies for one tend to be more effective when combined with strategies for the other. This provides a unifying framework for understanding such solution strategies as branching methods, DPLL with clause learning, dynamic backtracking, strong branching, and Benders decomposition. By distinguishing the primal and dual elements of these methods, it suggests techniques for improving both. For example, it sees branching as primarily a dual method, which means that branching must be conducted in a very different way when used to solve the primal problem. The framework can be extended to heuristic methods, which allows exact and heuristic methods to exchange ideas. For example, branching can be viewed as a local search method for solving the dual, and relaxation techniques can accelerate primal heuristics. Workshop presentation.

Gilles Pesant: Branching to Force Constraint Satisfaction in CP

The typical solving process in constraint programming manipulates indeterminate solutions instead of points in the solution space: variables are not currently fixed to a particular value but are restricted to a finite number of integer values (their domain). An indeterminate solution thus represents an integer-feasible region of the solution space that we believe contains points satisfying the other constraints as well. We base such a belief on the assurance that each constraint has some points from that region which satisfy it (and even that each value in a domain belongs to at least one such point). But this doesn't guarantee that there exists a common point, i.e. a feasible solution to the whole problem within that region.     

So we need to search these regions to find a feasible solution. This usually takes the form of a search tree for which we design branching heuristics that break the regions into smaller ones. A successful heuristic should display two seemingly opposite strengths: quickly drive search toward a solution in a region that contains some, and quickly dismiss a region that contains none. We look at some popular branching heuristics under this light. In particular we examine a family of heuristics which evaluate the value distribution of variables among solutions to individual constraints and then use that information to increase the likelihood of branching into a solution-rich region. Relevant materials: workshop presentation ("CPBranchingPrinciples.pdf" file listed at the bottom of this page).
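As a rough illustration of such counting-based heuristics, the sketch below brute-forces the solution density of each (variable, value) pair per constraint and branches on the densest pair. Plain enumeration is a stand-in for the dedicated per-constraint counting algorithms used in practice and is viable only for tiny domains; all function names are hypothetical.

```python
from itertools import product

def solution_densities(scope, domains, predicate):
    """Brute-force the solution density of one constraint: the fraction of
    the constraint's own solutions in which variable x takes value v."""
    sols = [s for s in product(*(domains[x] for x in scope))
            if predicate(dict(zip(scope, s)))]
    dens = {}
    for i, x in enumerate(scope):
        for v in domains[x]:
            dens[(x, v)] = (sum(1 for s in sols if s[i] == v) / len(sols)
                            if sols else 0.0)
    return dens

def max_sd_branch(constraints, domains):
    """maxSD-style choice: over all constraints, pick the (variable, value)
    pair with the highest solution density and branch on it first."""
    best = None
    for scope, pred in constraints:
        for pair, d in solution_densities(scope, domains, pred).items():
            if best is None or d > best[1]:
                best = (pair, d)
    return best
```

For instance, with domains x in {1, 2}, y in {2} and the single constraint x != y, the constraint's only solution is (1, 2), so (x, 1) has density 1.0 and is chosen as the branching assignment.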

Ted Ralphs: The Complexity of Search: What We Can Learn from Games

It is well-known that for certain classes of NP-hard optimization problems, even the problem of finding feasible solutions can be extremely difficult. The theory of computational complexity pioneered by Turing and developed by the many others who followed is the tool typically used to judge the theoretical difficulty of such problems, but this framework falls short as a tool to assess the performance of the algorithms we use to solve these problems in practice. What makes such algorithms difficult to analyze is that they try to circumvent the enumeration that must occur in the worst case by using heuristic methods to optimize the performance of the algorithm itself. In this talk, we describe the more general complexity framework used to analyze certain classes of extensive-form games and other multilevel problems and then show how it might be useful in thinking about the performance of algorithms for single level problems. Workshop presentation.

Ashish Sabharwal: Branching Strategies and Restarts in SAT Solvers

Since the 1990s, SAT solvers have been designed to employ general, domain-independent search strategies that work remarkably well out-of-the-box on combinatorial problems originating in a wide variety of application domains. Unlike typical CP and MIP solvers, the heuristics used by DPLL-based systematic SAT solvers for variable and value selection are often strongly guided by conflict analysis and the resulting learned clauses. Further, the success of SAT solvers relies heavily on a careful mix of rapid restarts to avoid heavy-tailed runtime behavior, techniques such as phase saving to localize search, and measures such as Literal Block Distance to predict the usefulness of learned information. This talk will discuss some of the key techniques behind successful systematic SAT solvers, focusing in particular on branching and restart strategies that make their search very different from a traditional tree-like exploration. Workshop presentation.
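The rapid restarts mentioned above are often scheduled by the Luby sequence. A minimal sketch of its standard recursive definition (the multiplier comment reflects common solver practice, not any particular solver's code):

```python
def luby(i):
    """i-th term (1-indexed) of the Luby restart sequence:
    1, 1, 2, 1, 1, 2, 4, 1, 1, 2, 1, 1, 2, 4, 8, ...
    Solvers typically multiply each term by a base conflict interval
    to obtain the next restart limit."""
    k = i.bit_length()
    if i == (1 << k) - 1:                 # i = 2^k - 1: end of a block
        return 1 << (k - 1)
    return luby(i - (1 << (k - 1)) + 1)   # recurse into the sub-block
```

The first fifteen terms are 1, 1, 2, 1, 1, 2, 4, 1, 1, 2, 1, 1, 2, 4, 8: each block repeats the previous one and then doubles its last entry, which balances many short runs against occasional long ones.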

Domenico Salvagnin: Principles of Feasibility Pumping

The Feasibility Pump is a general purpose primal heuristic which iteratively generates two sequences of points that satisfy feasibility in a complementary way. Each point of one sequence is obtained as the "closest" one to the previous point of the other sequence, according to some appropriate distance function. Although it was originally developed only for mixed-integer binary problems, almost 10 years of active research have significantly improved its effectiveness and scope of applicability. In this talk we will review the key ingredients of feasibility pumping and give an overview of the many extensions developed so far. Workshop presentation: "Principles of FP.pdf" file listed at the bottom of this page.
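A minimal sketch of the pump's alternation between a constraint-feasible point and an integer point. The LP projection step of the real method is replaced here by brute-force search over a grid of candidate points, a crude stand-in that only works for tiny toy problems; the function name and example problem are invented.

```python
import itertools
import random

def toy_feasibility_pump(feasible, bounds, step=0.25, max_iter=50, seed=0):
    """Toy feasibility pump on a box-bounded integer problem.

    'feasible' tests the continuous constraints (including the bounds).
    Projection onto the continuous feasible region is done by searching
    a grid for the point of minimum L1 distance, in place of an LP solve.
    """
    rng = random.Random(seed)
    grid = [p for p in itertools.product(*(
                [lo + step * k for k in range(int((hi - lo) / step) + 1)]
                for lo, hi in bounds))
            if feasible(p)]
    if not grid:
        return None  # no constraint-feasible point at this resolution
    dist = lambda a, b: sum(abs(u - v) for u, v in zip(a, b))

    x_int = tuple(0 for _ in bounds)
    for _ in range(max_iter):
        # "Projection": constraint-feasible point closest (L1) to x_int.
        x_cont = min(grid, key=lambda p: dist(p, x_int))
        # Rounding: integer point closest to the continuous point.
        new_int = tuple(int(round(v)) for v in x_cont)
        if feasible(new_int):
            return new_int           # feasible in both senses: done
        if new_int == x_int:         # cycling: apply a random perturbation
            new_int = tuple(v + rng.choice([-1, 1]) for v in new_int)
        x_int = new_int
    return None
```

On a toy instance such as 2x1 + 3x2 >= 12.5 with 0 <= xi <= 5, the loop bounces between rounding and projection (perturbing on a cycle) until it lands on an integer point that also satisfies the constraint.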

Rick Wallace: Branching Rules in Three Guises: Information, Actions, and Performance Quality

In this talk we argue that a systematic account of branching rules (aka variable ordering heuristics) has three distinguishable components: (i) the information used to make choices, (ii) the manner in which a heuristic affects performance (its "heuristic action"), (iii) the quality of performance associated with a heuristic, which can be defined in terms of ideal strategies. To date most attention has been paid to the first component, and this effort has been unsystematic and even scatter-shot in character, although tremendous advances have been made in discovering effective rules. There are at least three reasons for trying to develop a more systematic account of branching rules: (i) it would be useful to be able to design good rules from first principles, (ii) it would be useful to know what limits there are in the improvements we can effect in this fashion, (iii) it would be nice to understand what the hell it is we're doing. In this talk we will take some tentative steps toward these goals. Workshop presentation: "WallaceCPAIOR2013.pdf" file listed at the bottom of this page.

Workshop attendees are invited to participate in a final panel discussion to identify further general principles in seeking feasible solutions for combinatorial problems.
