The ellipsoid method as a lion hunt in the Sahara

From Grötschel, Lovász, and Schrijver’s Geometric Algorithms and Combinatorial Optimization:

Recall the well-known method of catching a lion in the Sahara. It works as follows. Fence in the Sahara, and split it into two parts; check which part does not contain the lion, fence the other part in, and continue. After a finite number of steps we will have caught the lion — if there was any — because the fenced-in zone will be so small that the lion cannot move anymore. Or we realize that the fenced-in zone is so small that it cannot contain any lion, i.e., there was no lion at all. In order to illustrate the ellipsoid method by this old hunter’s story we have to describe what our Sahara is, how we split it into two parts, how we fence these in, and when we can declare the lion caught or nonexistent.

For a pictorial explanation, see here.

Grötschel, Lovász, and Schrijver weigh in on the “Is TSP NP-complete?” debate

It’s just semantics.

Over the last month there’s been a tussle regarding whether or not the Traveling Salesman Problem (TSP) is NP-complete. Puget argues that it is not, because he’s thinking of the optimization version of TSP. Fortnow clarifies that there is a standard decision problem associated with the TSP that is NP-complete, and he isn’t bothered by the technicalities. Since this is my blog, I’ll also link to my response.

Grötschel, Lovász, and Schrijver are sympathetic to Fortnow’s viewpoint in Geometric Algorithms and Combinatorial Optimization:

Since we do not want to elaborate on the subtle differences between the various problem classes related to the class NP we shall use the following quite customary convention. In addition to all decision problems that are NP-complete, we call every optimization problem NP-complete for which the associated decision problem is NP-complete.

I realize my opinion means nothing compared to that of greats like Fortnow, Grötschel, Lovász, and Schrijver. Still, I sit on the “let’s try to be precise” side of this debate.

John von Neumann’s dog

From page 311 of Norman Macrae’s biography of John von Neumann:

Johnny then appointed Julian Bigelow as chief engineer after an interview that caught some of the mood of the project. It is recounted in Ed Regis’s book, and Bigelow confirmed it to me. Bigelow came down from Massachusetts in an ancient jalopy that arrived at 26 Westcott Road two hours late. There was a Great Dane dog prancing on the lawn, and it squeezed in past Johnny and Bigelow when Johnny opened the door. During the forty-minute interview the dog licked both men and wandered all over the house. Bigelow thought that Johnny should restrain his dog better but did not like to say this. When Johnny eventually saw his visitor to the door, he inquired politely whether Bigelow always traveled with the dog. “But it wasn’t my dog,” said Bigelow later, “and it now turned out it wasn’t his either.” Johnny, being a diplomatic type, refrained from making any remarks about this odd interviewee’s behavior until the end.

The whole thing reminds me of the folkloric pinched cookies story.

2013: The year of (nonexistent) extended formulations

2013 has been the year of compact extended formulations (CEFs), or rather of CEFs that were shown not to exist. In their survey on the topic, Conforti, Cornuéjols, and Zambelli define them in terms of perfect formulations:

By “perfect formulation”, we mean a system of linear inequalities that describes the convex hull of feasible solutions, viewed as vectors. Natural perfect formulations often have a number of inequalities that is exponential in the size of the data needed to describe the problem. Here we are particularly interested in situations where the addition of a polynomial number of extra variables allows a formulation with a polynomial number of inequalities. Such formulations are called “compact extended formulations”.

A classical example is the spanning tree polytope (STP), which has exponentially many facets when described in the original edge variables. This may be surprising to some, given that the minimum spanning tree problem is solvable in polynomial time. However, the STP does have a CEF.
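
For concreteness, here is the standard facet description in the original edge variables (Edmonds’ formulation; x(F) denotes \sum_{e \in F} x_e and E[S] denotes the set of edges with both endpoints in S):

\text{STP}(G) = \left\{ x \in \mathbb{R}^E_{\ge 0} \; : \; x(E) = |V| - 1, \;\; x(E[S]) \le |S| - 1 \ \text{ for all } \emptyset \ne S \subseteq V \right\}.

There is one subtour-type inequality per vertex subset S, hence exponentially many in total. By contrast, Martin’s flow-based extended formulation needs only on the order of |V|^3 auxiliary variables and inequalities (the exact exponent is from memory), which is what makes the CEF compact.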

One may then wonder: just how powerful are CEFs? Do they exist for every problem? The obvious answer would seem to be no, because a CEF for an NP-hard problem such as the TSP would give a polynomial-size LP for it, implying P=NP. But can the limits of CEFs be shown without the assumption that P\neNP?

In a landmark 1991 paper, Yannakakis showed that the traveling salesman polytope has no CEF, under the relatively minor technical assumption that the extended formulation is symmetric. Just last year, it was finally shown that Yannakakis’ technical assumption is not needed: there is no CEF for the traveling salesman polytope. These papers are fascinating in their own right, but they are also useful in practice. Referees can (and should) immediately reject a paper that claims to prove P=NP with a poly-sized traveling salesman LP.

In the last year, (optimal) limits of CEFs have also been shown for the clique problem. First, some context is helpful. In 1996, Håstad showed that, for any fixed \epsilon > 0, the maximum clique problem cannot be approximated to within a factor of n^{1-\epsilon}, unless NP=ZPP. Zuckerman later showed the same result under the weaker assumption that P\neNP. What about extended formulations for clique? Just this year, Braverman and Moitra showed that the same hardness holds for CEFs, with no assumption like P\neNP needed.

Every problem mentioned so far fits into a nice dichotomy: the polynomial-time solvable problems have CEFs and the NP-hard problems do not. Is this always the case? Can one show that every polynomial-time solvable problem admits a CEF? If so, this would prove that P\neNP, since the TSP has no CEF and hence could not be polynomial-time solvable.

Alas, no. This year, Rothvoß showed that the perfect matching polytope has no CEF, despite the fact that one can optimize any linear objective over the perfect matching polytope in polynomial time. In fact, he showed that any extended formulation of the perfect matching polytope or of the traveling salesman polytope requires 2^{\Omega(n)} inequalities. The previous best lower bounds for the matching and traveling salesman polytopes were \Omega (n^2) and 2^{\Omega(\sqrt n)}, respectively.
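
To make precise what these bounds measure (paraphrasing the standard definition from this literature): the extension complexity of a polytope P, written \mathrm{xc}(P), is

\mathrm{xc}(P) = \min\left\{ \text{number of facets of } Q \; : \; Q \text{ a polyhedron with } \pi(Q) = P \text{ for some linear projection } \pi \right\}.

In this notation, Rothvoß’s theorem says that \mathrm{xc} of the perfect matching polytope of K_n is 2^{\Omega(n)}, and a CEF is exactly an extension Q of size polynomial in n.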

Whether or not the matching polytope has a CEF was a long-standing open question, and Lance Fortnow has justly described Rothvoß’s discovery as the complexity result of the year. But can his result be improved? The best-known upper bound on the extension complexity of the matching polytope is O^* (2^{n/2}). Can Rothvoß’s technique be refined to show a matching lower bound?

John von Neumann, a late sleeper?

Maybe I have something in common with Johnny von Neumann after all:

Well, John von Neumann always slept very late, so I was kind and I did not wake him until well after ten in the morning. When I called his hotel in London, he answered the phone in bed, and I said “Johnny, you’re quite right.” And he said to me, “You wake me up early in the morning to tell me that I’m right? Please wait until I’m wrong.”

-Jacob Bronowski

The quote appears on page 211 of Norman Macrae’s biography of John von Neumann.

Macrae goes on to say “Actually, Bronowski did not really understand how Johnny lived his life. First, Johnny would have mildly resented the interruption because he would by ten have been working for some hours on something else less simple.”

Darn.

Manfred Padberg on expensive love and free computing

From Manfred Padberg’s personal reflections recounted in Mixed-integer programming—1968 and thereafter:

As one of my colleagues, Jim Savage, once said: The 1960/1970’s were the times of “free love” and “expensive computing”; the 1980/1990’s were the times of “expensive love” and “free computing”.

He goes on to describe his desire to perform computational experiments for TSP, so as to “test the theory that we had developed,” but he lacked adequate resources.

But during a year at the IBM Research Center in Yorktown Heights (1976–1977) I had the best of computing power of the time at my disposal and I finished the project in July 1977.

However, the resulting paper did not appear until 1980.

The delay was simply due to the fact that my personal life was in turmoil—divorce, etc.—and Jim Savage was proven right.

Good characterizations and perfect graphs

A good characterization of a problem provides both a short proof and a short refutation (i.e., the problem belongs to NP \cap coNP). Jack Edmonds is a big believer in good characterizations, and he and fellow proselytizer Václav Chvátal did a lot of preaching. For the record, I support the hype. Good characterizations give an unreasonable amount of insight into a problem.

Good characterizations often foreshadow polynomial-time solvability. Linear programming (LP) is a famous example. Long before the ellipsoid method was shown to solve LPs in polynomial time, it was known that the question — is the optimal objective of this (maximization) LP at least z? — has both a short proof and short refutation*. Given a possible solution, one can quickly check that it satisfies the constraints and that its objective is indeed at least z. For the short refutation, we have duality; simply provide a feasible solution to the dual problem that has objective less than z.
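
In symbols, for the standard primal-dual pair (any other LP form works the same way):

\max\{\, c^\top x : Ax \le b,\ x \ge 0 \,\} \qquad \text{and} \qquad \min\{\, b^\top y : A^\top y \ge c,\ y \ge 0 \,\}.

Weak duality gives c^\top x \le y^\top A x \le b^\top y for every primal-feasible x and dual-feasible y. So a primal-feasible x with c^\top x \ge z is the short proof that the optimum is at least z, and a dual-feasible y with b^\top y < z is the short refutation.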

The maximum clique and the vertex coloring problems have a similar sort of weak 'duality'. That is, the clique number \omega (G) and chromatic number \chi (G) of a graph G satisfy \omega (G) \le \chi (G). For LPs there is strong duality; the optimal objectives of the primal and dual LPs are equal whenever both are feasible. For clique and coloring, the 'duality' can be very weak. Indeed, there is a class of graphs for which the clique number is two, but the chromatic number can be arbitrarily large. Thus, we may not be able to refute the question — is the clique number at least k? — with a proper coloring. It seems that there will never be such a good characterization for either clique or coloring (unless coNP=NP).
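
A classical family witnessing this gap (my example; the post itself does not name one) comes from the Mycielski construction, which turns a triangle-free graph with chromatic number k into a triangle-free graph with chromatic number k+1:

\omega(M_k) = 2 \quad \text{and} \quad \chi(M_k) = k \qquad \text{for } k = 2, 3, 4, \ldots,

where M_2 = K_2, M_3 = C_5, M_4 is the Grötzsch graph, and M_{k+1} is the Mycielskian of M_k. For these graphs, no proper coloring comes anywhere near certifying the true clique number.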

But what happens if we restrict ourselves to the class of graphs for which \omega (G) = \chi (G)? In this case, we DO have a sort of strong duality. We can now quickly verify that \omega (G) \ge k when given a clique of size k, and we can refute it when given a (k-1)-coloring, since then \omega (G) = \chi (G) \le k-1. So, we have our good characterization, and you know what? For this restricted class of graphs, we CAN solve maximum clique in polynomial time! If we further restrict ourselves to the class of perfect graphs, where \omega (G')=\chi(G') for every vertex-induced subgraph G' of G, then all sorts of mathematical beauty emerge.
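
One route to that polynomial-time claim (a sketch; the post itself does not spell it out) is the Lovász theta function \vartheta together with the sandwich theorem from Grötschel, Lovász, and Schrijver:

\omega(G) \le \vartheta(\bar{G}) \le \chi(G),

where \vartheta(\bar{G}) can be computed to any fixed precision in polynomial time. When \omega(G) = \chi(G), the sandwich collapses to a single integer, so rounding the computed value of \vartheta(\bar{G}) recovers the clique number exactly.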

This leads me to the research topic generator: First look for a good characterization. If you can’t find one, artificially impose it and see what happens.

________________________________________________

*The proof that LP belongs to NP (and, by duality, to coNP) is actually more complicated than I make it seem. One needs to show that there is an optimal solution whose size (when represented in binary) is polynomial in the input size.
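
For the curious, the standard argument goes through Cramer’s rule (a sketch, glossing over degenerate cases): one can always choose an optimal solution x^* that solves a nonsingular square system A_B x = b_B assembled from the constraint data, so

x^*_j = \frac{\det(A_B^{(j)})}{\det(A_B)},

where A_B^{(j)} is A_B with its j-th column replaced by b_B. Each such determinant has binary encoding length polynomial in that of A and b (e.g., via the Hadamard bound), so x^* has polynomial size; the same bound applies to the dual certificate.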