I was NOT the guy behind that regrettable banner

If you google my name and my employer (also my undergrad alma mater), you’ll see something like this:


Those first two links are about me. The last two are NOT.

The news articles tell the story of an undergraduate student attending Oklahoma State University (OSU) who created a banner with the insensitive hashtag #Trail_of_Tears. “Trail of Tears” refers to the forced relocations of various Native American nations in which thousands died. This student brought his regrettable banner to ESPN’s College GameDay show for the Aug 30, 2014 football game against Florida State University (FSU). FSU’s athletic teams are known as the Florida State Seminoles, and the Seminole Tribe was one of the tribes that were forcibly relocated. (Surprisingly, the Seminole Tribe has actually endorsed this nickname.)

Since OSU is my undergraduate alma mater, this has led several people to mistake this other Austin Buchanan for me (and even confront me about it!). I attended OSU from 2007 to 2011, and this incident occurred in 2014, so it should be clear that the guy behind the banner is not me. But not everyone reads the news stories in detail or knows when I attended OSU. Hence the motivation for this post.

A fallacious argument in integer programming

A commonly used argument in integer programming goes as follows. Suppose we want to show that an inequality ax \le b does not induce a facet of a polyhedron. If we can exhibit valid inequalities (1) and (2) that add up to ax \le b, then we have shown that ax \le b cannot induce a facet. Right?

Not quite.

One problem occurs when inequalities (1) and (2) are multiples of ax\le b. For example, suppose that each inequality is (ax)/2 \le b/2.
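For a concrete instance, take P = [0,1]^2. The inequality x \le 1 induces a facet of P, and yet

x/2 \le 1/2 \qquad \text{plus} \qquad x/2 \le 1/2 \qquad \Longrightarrow \qquad x \le 1,

so merely exhibiting two valid inequalities that add up to x \le 1 proves nothing.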

What if the inequalities are not multiples of ax \le b?

Still no cigar.

Another problem occurs when the polyhedron is not full-dimensional. Consider the set of all (x,y)\in [0,1]^2 such that x\le 0 (meaning that we have an implicit equality constraint x=0). We want to argue that the inequality x+y\le 1 does NOT induce a facet. This seems to follow from the valid inequalities x\le 0 and y \le 1 which sum to x+y\le 1. However, the inequality x+y\le 1 DOES induce a facet; the dimension of our feasible region is one, the inequality x+y\le 1 is valid, and the face where x+y=1 has dimension zero.
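Spelling out the dimension count (a quick check from the definitions):

P = \{(x,y) \in [0,1]^2 : x \le 0\} = \{0\} \times [0,1], \qquad \dim P = 1,
F = \{(x,y) \in P : x + y = 1\} = \{(0,1)\}, \qquad \dim F = 0 = \dim P - 1,

so F is indeed a facet. The decomposition into x \le 0 and y \le 1 proves nothing here because, on the affine hull of P (where x = 0), the inequality x + y \le 1 is equivalent to y \le 1 itself.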

The ellipsoid method as a lion hunt in the Sahara

From Grötschel, Lovász, and Schrijver’s Geometric Algorithms and Combinatorial Optimization:

Recall the well-known method of catching a lion in the Sahara. It works as follows. Fence in the Sahara, and split it into two parts; check which part does not contain the lion, fence the other part in, and continue. After a finite number of steps we will have caught the lion — if there was any — because the fenced-in zone will be so small that the lion cannot move anymore. Or we realize that the fenced-in zone is so small that it cannot contain any lion, i.e., there was no lion at all. In order to illustrate the ellipsoid method by this old hunter’s story we have to describe what our Sahara is, how we split it into two parts, how we fence these in, and when we can declare the lion caught or nonexistent.

For a pictorial explanation, see here.
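For readers who prefer code to lions, here is a minimal Python sketch of the same shrinking-fence idea: the central-cut ellipsoid update, used to look for a point satisfying A x \le b. The function name, iteration cap, and example system are my own choices for illustration; the real method (as in GLS) adds a volume-based stopping rule to certify that “there was no lion at all.”

import numpy as np

def ellipsoid_feasibility(A, b, center, radius, max_iters=10_000, tol=1e-9):
    """Toy central-cut ellipsoid method: look for x with A @ x <= b inside a
    ball of the given center and radius. Returns a feasible point or None."""
    n = len(center)                              # requires n >= 2
    x = np.asarray(center, dtype=float)
    B = (radius ** 2) * np.eye(n)                # ellipsoid {y : (y-x)' B^{-1} (y-x) <= 1}
    for _ in range(max_iters):
        violations = A @ x - b
        i = int(np.argmax(violations))
        if violations[i] <= tol:
            return x                             # the center satisfies every inequality
        a = A[i]                                 # "check which part does not contain the lion"
        Ba = B @ a
        x = x - Ba / ((n + 1) * np.sqrt(a @ Ba))                                  # recenter in the good half
        B = (n * n / (n * n - 1.0)) * (B - (2.0 / (n + 1)) * np.outer(Ba, Ba) / (a @ Ba))
    return None                                  # gave up (the real method certifies emptiness by volume)

# Example: find a point with x + y <= 1, x >= 0, y >= 0, and x + 2y >= 1.
A = np.array([[1.0, 1.0], [-1.0, 0.0], [0.0, -1.0], [-1.0, -2.0]])
b = np.array([1.0, 0.0, 0.0, -1.0])
print(ellipsoid_feasibility(A, b, center=[5.0, 5.0], radius=20.0))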

Grötschel, Lovász, and Schrijver weigh in on the “Is TSP NP-complete?” debate

It’s just semantics.

Over the last month there has been a tussle over whether the Traveling Salesman Problem (TSP) is NP-complete. Puget argues that it is not, because he is thinking of the optimization version of TSP. Fortnow clarifies that there is a standard decision problem associated with the TSP that is NP-complete and isn’t bothered by the technicalities. Since this is my blog, I’ll also link to my response.
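One reason the distinction is “just semantics”: given a polynomial-time oracle for the decision version (“is there a tour of cost at most k?”), binary search recovers the optimal tour cost with polynomially many oracle calls. A minimal sketch, where tsp_decision is a hypothetical yes/no oracle supplied by the caller (not a real library function) and edge weights are nonnegative integers:

def optimal_tour_cost(tsp_decision, weights):
    """Binary search on k using a decision oracle tsp_decision(weights, k)
    that answers: does a tour of total cost at most k exist?"""
    lo, hi = 0, sum(sum(row) for row in weights)   # crude lower/upper bounds on the optimum
    if not tsp_decision(weights, hi):
        return None                                # no tour exists at all
    while lo < hi:
        mid = (lo + hi) // 2
        if tsp_decision(weights, mid):
            hi = mid                               # some tour costs at most mid
        else:
            lo = mid + 1                           # every tour costs more than mid
    return lo

The number of oracle calls is logarithmic in the total weight, hence polynomial in the input size when weights are written in binary; with a bit more work, the tour itself can also be recovered by fixing edges one at a time.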

Grötschel, Lovász, and Schrijver are sympathetic to Fortnow’s viewpoint in Geometric Algorithms and Combinatorial Optimization:

Since we do not want to elaborate on the subtle differences between the various problem classes related to the class NP we shall use the following quite customary convention. In addition to all decision problems that are NP-complete, we call every optimization problem NP-complete for which the associated decision problem is NP-complete.

I realize my opinion means nothing compared to that of greats like Fortnow, Grötschel, Lovász, and Schrijver. Still, I sit on the “let’s try to be precise” side of this debate.

John von Neumann’s dog

From page 311 of Norman Macrae’s biography of John von Neumann:

Johnny then appointed Julian Bigelow as chief engineer after an interview that caught some of the mood of the project. It is recounted in Ed Regis’s book, and Bigelow confirmed it to me. Bigelow came down from Massachusetts in an ancient jalopy that arrived at 26 Westcott Road two hours late. There was a Great Dane dog prancing on the lawn, and it squeezed in past Johnny and Bigelow when Johnny opened the door. During the forty-minute interview the dog licked both men and wandered all over the house. Bigelow thought that Johnny should restrain his dog better but did not like to say this. When Johnny eventually saw his visitor to the door, he inquired politely whether Bigelow always traveled with the dog. “But it wasn’t my dog,” said Bigelow later, “and it now turned out it wasn’t his either.” Johnny, being a diplomatic type, refrained from making any remarks about this odd interviewee’s behavior until the end.

The whole thing reminds me of the folkloric pinched cookies story.

2013: The year of (nonexistent) extended formulations

2013 has been the year of compact extended formulations (CEFs), or rather of CEFs that were shown not to exist. In a survey on extended formulations, Conforti, Cornuéjols, and Zambelli define them in terms of perfect formulations:

By “perfect formulation”, we mean a system of linear inequalities that describes the convex hull of feasible solutions, viewed as vectors. Natural perfect formulations often have a number of inequalities that is exponential in the size of the data needed to describe the problem. Here we are particularly interested in situations where the addition of a polynomial number of extra variables allows a formulation with a polynomial number of inequalities. Such formulations are called “compact extended formulations”.

A classical example is the spanning tree polytope (STP). It has exponentially many facets when using the original variables. This may be surprising to some, given that the minimum spanning tree problem is solvable in polynomial time. However, the STP does have a CEF.
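To make the exponential blow-up concrete, the classical description of the STP of a connected graph G=(V,E) in the edge variables x is:

\sum_{e \in E} x_e = |V| - 1,
\sum_{e \in E(S)} x_e \le |S| - 1 \quad \text{for all nonempty } S \subsetneq V,
x_e \ge 0 \quad \text{for all } e \in E,

where E(S) denotes the edges with both endpoints in S. There is one subtour-type inequality for every vertex subset S, which is where the exponential count comes from. A CEF avoids this by adding extra variables; one well-known construction (due to Martin, if I recall correctly) uses flow-type variables to enforce connectivity with only polynomially many constraints.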

One may then wonder: just how powerful are CEFs? Do they exist for every problem? The obvious answer would be no, because a CEF for an NP-hard problem would let us solve it in polynomial time with a polynomial-size LP, implying P=NP. However, can the limits of CEFs be shown without the assumption that P \ne NP?

In a landmark 1991 paper, Yannakakis showed that the traveling salesman polytope has no CEF under a relatively minor technical assumption (that the extended formulation be symmetric). Just last year, it was finally shown that Yannakakis’ technical assumption is not needed; there is no CEF for the traveling salesman polytope. These papers are fascinating on their own, but they are also genuinely useful: referees can (and should) immediately reject a paper that claims to prove P=NP with a poly-sized traveling salesman LP.

In the last year, (optimal) limits of CEFs have also been shown for the clique problem. First, some context is helpful. In 1996, Håstad showed that, for any fixed \epsilon > 0, the maximum clique problem cannot be approximated to within a factor of n^{1-\epsilon} unless NP=ZPP. Zuckerman later showed the same result under the weaker assumption that P \ne NP. What about extended formulations for clique? Just this year, Braverman and Moitra showed that the same n^{1-\epsilon} inapproximability holds for CEFs with no complexity assumption at all.

So far, every problem mentioned fits into a nice dichotomy: the polynomial-time solvable problems have CEFs and the NP-hard problems do not. Is this always the case? Can one show that every polynomial-time solvable problem admits a CEF? If so, this would prove that P \ne NP, since the traveling salesman problem has no CEF and therefore could not be polynomial-time solvable.

Alas, no. This year, Rothvoß showed that the perfect matching polytope has no CEF, despite the fact that one can optimize any linear objective over the perfect matching polytope in polynomial time. In fact, he shows that any extended formulation for the perfect matching polytope or the traveling salesman polytope requires 2^{\Omega(n)} inequalities. The previous best lower bounds for the matching and traveling salesman polytopes were \Omega(n^2) and 2^{\Omega(\sqrt n)}, respectively.
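For context, Edmonds’ classical description of the perfect matching polytope of a graph G=(V,E) in the original edge variables is:

\sum_{e \in \delta(v)} x_e = 1 \quad \text{for all } v \in V,
\sum_{e \in \delta(U)} x_e \ge 1 \quad \text{for all } U \subseteq V \text{ with } |U| \text{ odd},
x_e \ge 0 \quad \text{for all } e \in E,

where \delta(U) is the set of edges with exactly one endpoint in U. The odd-set inequalities are exponentially many, and Rothvoß’s theorem says this blow-up cannot be avoided by introducing extra variables.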

Whether or not the matching polytope has a CEF was a long-standing open question, and Lance Fortnow has justly described Rothvoß’s discovery as the complexity result of the year. But can his result be improved? The best-known upper bound on the extension complexity of the matching polytope is O^* (2^{n/2}). Can Rothvoß’s technique be refined to show a matching lower bound?

John von Neumann, a late sleeper?

Maybe I have something in common with Johnny von Neumann after all:

Well, John von Neumann always slept very late, so I was kind and I did not wake him until well after ten in the morning. When I called his hotel in London, he answered the phone in bed, and I said “Johnny, you’re quite right.” And he said to me, “You wake me up early in the morning to tell me that I’m right? Please wait until I’m wrong.”

Jacob Bronowski

The quote appears on page 211 of Norman Macrae’s biography of John von Neumann.

Macrae goes on to say “Actually, Bronowski did not really understand how Johnny lived his life. First, Johnny would have mildly resented the interruption because he would by ten have been working for some hours on something else less simple.”