I’ve decided to read and summarize one great paper a week (or month…) from this list. This week’s paper is the 1981 paper The ellipsoid method and its consequences in combinatorial optimization by Martin Grötschel, László Lovász, and Alexander Schrijver. Also see the corrigendum. Jeff Linderoth gives a nice overview in his lecture slides.
Of all the papers on this list, this may be my favorite. Its profundity is staggering: it illustrates that “separation = optimization.” It showed that linear programs with exponentially many constraints can be solved in polynomial time; all that is needed is the ability to solve the separation problem in polytime. The separation problem asks: Given a point x, does x satisfy all of the constraints? If not, return a valid inequality that x violates. What if the linear program has exponentially many variables, but polynomially many constraints? Take the dual!
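To make the separation viewpoint concrete, here is a tiny illustration of my own (not from the paper): a polytope with exponentially many constraints whose separation problem is nevertheless trivial to solve.

```python
# Toy illustration (my own construction, not from the paper): the polytope
# P = { x in R^n : sum over i in S of x_i <= 1, for every nonempty subset S }
# has 2^n - 1 constraints, but separation is easy, because the most violated
# constraint simply takes S = {i : x_i > 0}.

def separate(x):
    """Return a violated subset S (as a list of indices), or None if x is in P."""
    S = [i for i, xi in enumerate(x) if xi > 0]
    if S and sum(x[i] for i in S) > 1:
        return S  # the inequality sum_{i in S} x_i <= 1 is violated by x
    return None

print(separate([0.6, 0.7, -0.2]))  # [0, 1]: x_0 + x_1 = 1.3 > 1
print(separate([0.3, 0.3, 0.3]))   # None: every subset sums to at most 0.9
```

Plugged into a cutting-plane loop, an oracle like this lets an LP solver handle the constraints lazily, without ever writing them all down.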
This approach gives a polynomial time algorithm to optimize over the subtour elimination polytope for TSP, even though there are exponentially many constraints. Jeff Linderoth, in his lecture slides, claims that this is the only known polytime algorithm for this task. Michael Trick confirms this here. (Edit: this is actually not true.)
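The separation step for the subtour elimination constraints reduces to a global minimum cut: given a fractional solution x, some constraint x(delta(S)) >= 2 is violated exactly when the minimum cut of the x-weighted graph has weight below 2. Here is a rough sketch of that reduction; the Stoer–Wagner min-cut algorithm (a later development, standing in for any polytime min-cut routine), the dense-matrix representation, and the function names are all my own choices.

```python
# Sketch of subtour-elimination separation (my own toy implementation, not the
# paper's): a subtour constraint x(delta(S)) >= 2 is violated iff the global
# min cut of the x-weighted graph is < 2. Stoer-Wagner finds that cut.

def stoer_wagner(w):
    """Global min cut of an undirected graph given by a dense weight matrix.
    Returns (cut_weight, one side of the cut as a list of original nodes)."""
    n = len(w)
    w = [row[:] for row in w]          # work on a copy; nodes get merged
    groups = [[i] for i in range(n)]   # original nodes merged into each node
    active = list(range(n))
    best_value, best_side = float('inf'), None
    while len(active) > 1:
        # build a maximum-adjacency ordering of the active nodes
        order = [active[0]]
        key = {v: w[active[0]][v] for v in active[1:]}
        while key:
            u = max(key, key=key.get)
            order.append(u)
            del key[u]
            for v in key:
                key[v] += w[u][v]
        s, t = order[-2], order[-1]
        cut_of_phase = sum(w[t][v] for v in active if v != t)
        if cut_of_phase < best_value:
            best_value, best_side = cut_of_phase, groups[t][:]
        for v in active:               # merge t into s
            w[s][v] += w[t][v]
            w[v][s] += w[v][t]
        groups[s].extend(groups[t])
        active.remove(t)
    return best_value, best_side

def separate_subtour(w, tol=1e-9):
    """Return a set S with x(delta(S)) < 2, or None if no subtour
    constraint is violated."""
    value, side = stoer_wagner(w)
    return side if value < 2 - tol else None

# Two disjoint triangles with x_e = 1 on every edge: all degrees are 2, yet
# the cut between the triangles has weight 0, so a subtour constraint is
# violated and one triangle is returned as the set S.
w = [[0] * 6 for _ in range(6)]
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5)]:
    w[i][j] = w[j][i] = 1
print(separate_subtour(w))  # one of the two triangles
```

On an actual tour (a single cycle with x_e = 1), the minimum cut is 2 and the routine correctly reports that no subtour constraint is violated.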
Unfortunately, the ellipsoid method has practical limitations. But, as the authors state:
The main point is that the ellipsoid method proves the polynomial solvability of a large number of different combinatorial optimization problems at once, and hereby points out directions for the search for practically feasible polynomial algorithms.
We usually think of linear programming as the problem of optimizing a linear objective over a polyhedron P. This turns out to be equivalent to the feasibility problem (just find a point within the polyhedron), which is how the ellipsoid method is usually described. We assume that the polyhedron is bounded, and encapsulate it with an ellipsoid E_0. Next, test whether the center z_0 of the ellipsoid is feasible, i.e., whether z_0 lies in P. If it does, then we are done. Otherwise, find a valid inequality that z_0 violates, and push the inequality until it passes through z_0. Now encapsulate the resulting half-ellipsoid with a new, smaller ellipsoid E_1. Repeat this process, generating a sequence of smaller and smaller ellipsoids. Either we find a feasible point along the way, or the ellipsoids get so small that we declare P empty (within tolerance).
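The shrinking-ellipsoid loop can be sketched in a few lines. This is a standard textbook central-cut variant, not the paper's exact algorithm; the starting-ball radius, tolerance, and function name are my own choices, and the separation "oracle" here just scans the constraint rows, though any polytime separation routine could be plugged in.

```python
# A minimal central-cut ellipsoid sketch (a common textbook form, not the
# paper's exact variant): find a point in P = {x : Gx <= h}, starting from
# a ball E_0 assumed to contain P.

import numpy as np

def ellipsoid_feasibility(G, h, radius=10.0, max_iters=1000, tol=1e-7):
    n = G.shape[1]
    z = np.zeros(n)                 # center of the current ellipsoid
    A = (radius ** 2) * np.eye(n)   # E = {x : (x - z)^T A^{-1} (x - z) <= 1}
    for _ in range(max_iters):
        violated = [i for i in range(G.shape[0]) if G[i] @ z > h[i] + tol]
        if not violated:
            return z                # the center is (approximately) feasible
        c = G[violated[0]]          # a valid inequality the center violates
        b = A @ c / np.sqrt(c @ A @ c)
        z = z - b / (n + 1)         # shift the center into the half-ellipsoid
        A = (n**2 / (n**2 - 1)) * (A - (2 / (n + 1)) * np.outer(b, b))
    return None                     # declare P empty (within tolerance)

# Find a point in the triangle {x + y <= 1, x >= 0.2, y >= 0.2},
# rewritten in the form Gx <= h:
G = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
h = np.array([1.0, -0.2, -0.2])
x = ellipsoid_feasibility(G, h)
print(x)
```

Each update replaces the half-ellipsoid cut off by the violated inequality with the smallest ellipsoid containing it, so the volume shrinks by a fixed factor per iteration; that geometric shrinkage is the source of the polynomial bound.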
The following theorem is proven for general convex bodies, and then fine-tuned for bounded rational polyhedra.
Let K be a class of convex bodies. There is a polynomial algorithm to solve the separation problem for the members of K if and only if there is a polynomial algorithm to solve the optimization problem for the members of K.
In subsequent sections, they note consequences for matroid intersection, perfect matchings, b-matchings, branchings, and for minimizing submodular functions. They then apply their machinery to non-polytopal convex sets, developing polynomial algorithms for problems on perfect graphs. The ellipsoid method, together with the Lovász theta function, provides polytime algorithms for max-weight clique, max-weight independent set, min-weight coloring, and min-weight clique cover in perfect graphs.
Finally, they use the equivalence of separation and optimization to prove some negative results—that some problems are NP-hard. When I first read this paper several months ago, I was extremely confused when I read that
…we note that the vertex-packing problem of a graph is in a sense equivalent to the fractional chromatic number problem, and comment on the phenomenon that this latter problem is an example of a problem in NP which is NP-hard but (as for now) not known to be NP-complete.
Next up: C. Papadimitriou and M. Yannakakis, The Complexity of Facets (and Some Facets of Complexity), Journal of Computer and System Sciences 28 (1984), 244.