Isolation lemma explained

In theoretical computer science, the term isolation lemma (or isolating lemma) refers to randomized algorithms that reduce the number of solutions to a problem to one, should a solution exist. This is achieved by constructing random constraints such that, with non-negligible probability, exactly one solution satisfies these additional constraints if the solution space is not empty. Isolation lemmas have important applications in computer science, such as the Valiant–Vazirani theorem and Toda's theorem in computational complexity theory.

The first isolation lemma was introduced by Valiant and Vazirani (1986), albeit not under that name. Their isolation lemma chooses a random number of random hyperplanes, and has the property that, with non-negligible probability, the intersection of any fixed non-empty solution space with the chosen hyperplanes contains exactly one element. This suffices to show the Valiant–Vazirani theorem: there exists a randomized polynomial-time reduction from the satisfiability problem for Boolean formulas to the problem of detecting whether a Boolean formula has a unique solution. Mulmuley, Vazirani, and Vazirani (1987) introduced an isolation lemma of a slightly different kind: here every coordinate of the solution space gets assigned a random weight in a certain range of integers, and the property is that, with non-negligible probability, there is exactly one element in the solution space that has minimum weight. This can be used to obtain a randomized parallel algorithm for the maximum matching problem.
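To make the hyperplane mechanism concrete, here is a minimal Python sketch (the function names and the explicit list of solutions are illustrative assumptions; the actual reduction works with a Boolean formula and guesses the number k of hyperplanes at random rather than receiving it). Each random affine hyperplane over GF(2) is a parity constraint, and when k is close to the logarithm of the number of solutions, the intersection is a single point with constant probability.

import random

def random_affine_constraint(n):
    # A random affine hyperplane over GF(2): coefficient vector a and a
    # bit b, encoding the parity constraint a . x = b (mod 2).
    return [random.randint(0, 1) for _ in range(n)], random.randint(0, 1)

def satisfies(x, constraint):
    a, b = constraint
    return sum(ai * xi for ai, xi in zip(a, x)) % 2 == b

def restrict(solutions, n, k):
    # Intersect an explicit solution set (bit vectors of length n) with
    # k random hyperplanes.
    constraints = [random_affine_constraint(n) for _ in range(k)]
    return [x for x in solutions if all(satisfies(x, c) for c in constraints)]

# Example: all 8 vectors of {0,1}^3 as the solution space; with k = 3 the
# intersection is a single vector with constant probability.
solutions = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
print(restrict(solutions, n=3, k=3))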

Stronger isolation lemmas have been introduced in the literature to fit different needs in various settings. For example, the isolation lemma of Chari, Rohatgi, and Srinivasan (1993) has guarantees similar to those of Mulmuley et al., but uses fewer random bits. In the context of the exponential time hypothesis, Calabro et al. (2008) prove an isolation lemma for k-CNF formulas. Noam Ta-Shma[1] gives an isolation lemma with slightly stronger parameters, and it gives non-trivial results even when the size of the weight domain is smaller than the number of variables.

The isolation lemma of Mulmuley, Vazirani, and Vazirani

Lemma. Let n and N be positive integers, and let \mathcal{F} be an arbitrary nonempty family of subsets of the universe \{1,...,n\}. Suppose each element x \in \{1,...,n\} in the universe receives an integer weight w(x), each of which is chosen independently and uniformly at random from \{1,...,N\}. The weight of a set S in \mathcal{F} is defined as

w(S) = \sum_{x \in S} w(x).

Then, with probability at least 1 - n/N, there is a unique set in \mathcal{F} that has the minimum weight among all sets of \mathcal{F}.

It is remarkable that the lemma assumes nothing about the nature of the family \mathcal{F}: for instance, \mathcal{F} may include all 2^n - 1 nonempty subsets. Since the weight of each set in \mathcal{F} is between 1 and nN, on average there will be (2^n - 1)/(nN) sets of each possible weight. Still, with high probability, there is a unique set that has minimum weight.
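To make the statement concrete, here is a minimal Python simulation (a sketch; the function names and the chosen parameters are my own): it draws random weights and estimates how often the minimum-weight set is unique, for the extreme family of all 2^n - 1 nonempty subsets. Since the lemma gives only a lower bound, the empirical frequency should come out at or above 1 - n/N.

import random
from itertools import combinations

def unique_min_weight(family, weights):
    # True iff exactly one set in the family attains the minimum weight.
    totals = [sum(weights[x] for x in s) for s in family]
    return totals.count(min(totals)) == 1

def estimate_isolation_probability(n, N, family, trials=10000):
    # Empirical probability that a random w : {1,...,n} -> {1,...,N}
    # isolates a unique minimum-weight set in the family.
    hits = 0
    for _ in range(trials):
        weights = {x: random.randint(1, N) for x in range(1, n + 1)}
        hits += unique_min_weight(family, weights)
    return hits / trials

n, N = 6, 12
universe = range(1, n + 1)
family = [set(c) for r in range(1, n + 1) for c in combinations(universe, r)]
print(estimate_isolation_probability(n, N, family), "bound:", 1 - n / N)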

Mulmuley, Vazirani, and Vazirani's proof

Suppose we have fixed the weights of all elements except an element x. Then x has a threshold weight \alpha, such that if the weight w(x) of x is greater than \alpha, then it is not contained in any minimum-weight subset, and if w(x) \le \alpha, then it is contained in some sets of minimum weight. Further, observe that if w(x) < \alpha, then every minimum-weight subset must contain x (since, when we decrease w(x) from \alpha, sets that do not contain x do not decrease in weight, while those that contain x do). Thus, ambiguity about whether a minimum-weight subset contains x or not can happen only when the weight of x is exactly equal to its threshold; in this case we will call x "singular". Now, as the threshold of x was defined only in terms of the weights of the other elements, it is independent of w(x), and therefore, as w(x) is chosen uniformly from \{1,...,N\},

\Pr[x \text{ is singular}] = \Pr[w(x) = \alpha] \le 1/N,

and the probability that some x is singular is at most n/N. As there is a unique minimum-weight subset iff no element is singular, the lemma follows.

Remark: The bound \Pr[x \text{ is singular}] \le 1/N holds with \le (rather than =) since it is possible that some x has no threshold value (i.e., x will not be in any minimum-weight subset even if w(x) is set to the minimum possible value, 1).

Joel Spencer's proof

This is a restatement of the above proof, due to Joel Spencer (1995).

For any element x in the universe, define

\alpha(x) = \min_{S \in \mathcal{F},\, x \notin S} w(S) - \min_{S \in \mathcal{F},\, x \in S} w(S \setminus \{x\}).

Observe that \alpha(x) depends only on the weights of elements other than x, and not on w(x) itself. So whatever the value of \alpha(x), as w(x) is chosen uniformly from \{1,...,N\}, the probability that it is equal to \alpha(x) is at most 1/N. Thus the probability that w(x) = \alpha(x) for some x is at most n/N.

Now if there are two sets A and B in \mathcal{F} with minimum weight, then, taking any x \in A \setminus B, we have

\begin{align}
\alpha(x) &= \min_{S \in \mathcal{F},\, x \notin S} w(S) - \min_{S \in \mathcal{F},\, x \in S} w(S \setminus \{x\}) \\
&= w(B) - (w(A) - w(x)) \\
&= w(x),
\end{align}

since B attains the first minimum (no set has weight below w(B)) and A attains the second (any S containing x has w(S) \ge w(A), hence w(S \setminus \{x\}) \ge w(A) - w(x)). As we have seen, this event happens with probability at most n/N.
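For an explicit family, \alpha(x) can be computed by brute force, which turns the proof into a checkable test. Below is a small Python sketch (names are illustrative) implementing the definition and the resulting uniqueness criterion.

def alpha(x, family, weights):
    # Spencer's threshold: the minimum weight over sets avoiding x, minus
    # the minimum of w(S) - w(x) over sets S containing x.  It depends only
    # on the weights of elements other than x; None if either collection
    # of sets is empty.
    avoid = [sum(weights[y] for y in S) for S in family if x not in S]
    contain = [sum(weights[y] for y in S if y != x) for S in family if x in S]
    if not avoid or not contain:
        return None
    return min(avoid) - min(contain)

def minimum_is_unique(universe, family, weights):
    # Per the proof: two minimum-weight sets would force w(x) == alpha(x)
    # for some x, so the minimum is unique whenever no element hits its
    # threshold.
    return all(alpha(x, family, weights) != weights[x] for x in universe)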

Examples/applications

The original application was to minimum-weight (or maximum-weight) perfect matchings in a graph. Each edge is assigned a random weight in \{1,...,2m\}, where m is the number of edges, and \mathcal{F} is the set of perfect matchings, so that with probability at least 1/2, there exists a unique minimum-weight perfect matching. When each indeterminate x_{ij} in the Tutte matrix of the graph is replaced with 2^{w_{ij}}, where w_{ij} is the random weight of the edge, we can show that the determinant of the matrix is nonzero, and further use this to find the matching.
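As an illustration of this substitution (a sketch under an assumed edge-list graph representation, not the paper's exact procedure; exact integer arithmetic is used because the entries 2^{w_{ij}} are far too large for floating point), the following Python code builds the substituted Tutte matrix and tests whether its determinant vanishes; a nonzero determinant certifies that a perfect matching exists.

import random

def integer_determinant(M):
    # Bareiss fraction-free elimination: exact determinant of an integer
    # matrix, with all intermediate divisions exact.
    M = [row[:] for row in M]
    n = len(M)
    sign, prev = 1, 1
    for k in range(n - 1):
        if M[k][k] == 0:  # find a nonzero pivot, swapping rows if needed
            for r in range(k + 1, n):
                if M[r][k] != 0:
                    M[k], M[r] = M[r], M[k]
                    sign = -sign
                    break
            else:
                return 0
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                M[i][j] = (M[i][j] * M[k][k] - M[i][k] * M[k][j]) // prev
        prev = M[k][k]
    return sign * M[-1][-1]

def substituted_tutte_determinant(n, edges, N):
    # Random edge weights w_ij in {1,...,N}; 2**w_ij replaces the
    # indeterminate x_ij, and -2**w_ij its transposed position.
    w = {e: random.randint(1, N) for e in edges}
    T = [[0] * n for _ in range(n)]
    for (i, j) in edges:
        T[i][j] = 2 ** w[(i, j)]
        T[j][i] = -T[i][j]
    return integer_determinant(T)

# Complete graph K4: three perfect matchings.  When the random weights
# isolate a unique minimum-weight matching, the lowest power of 2 in the
# Pfaffian appears exactly once, so the determinant cannot vanish.
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(substituted_tutte_determinant(4, edges, N=2 * len(edges)) != 0)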

More generally, the paper also observed that any search problem of the form "Given a set system (S, \mathcal{F}), find a set in \mathcal{F}" could be reduced to a decision problem of the form "Is there a set in \mathcal{F} with total weight at most k?". For instance, it showed how to solve the following problem posed by Papadimitriou and Yannakakis, for which (as of the time the paper was written) no deterministic polynomial-time algorithm is known: given a graph and a subset of the edges marked as "red", find a perfect matching with exactly k red edges.
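One way such a reduction can work is sketched below in Python, assuming the decision oracle may be queried on instances with weights of our choosing (the oracle interface, function names, and the weight-perturbation step are illustrative assumptions, not the paper's exact construction). Random weights isolate a unique minimum-weight set with probability at least 1 - n/N; binary search with the oracle then finds the minimum weight W, and membership of each element is read off by perturbing its weight.

import random

def recover_set(universe, oracle, N):
    # oracle(weights, k) answers: "is there a set in the implicit family
    # of total weight at most k?".  The code is correct whenever the
    # random weights isolate a unique minimum-weight set.
    weights = {x: random.randint(1, N) for x in universe}
    n = len(universe)
    if not oracle(weights, n * N):
        return None  # the family is empty
    lo, hi = 1, n * N  # binary search for the minimum weight W
    while lo < hi:
        mid = (lo + hi) // 2
        if oracle(weights, mid):
            hi = mid
        else:
            lo = mid + 1
    W = lo
    # x lies in the unique minimum-weight set iff raising w(x) by one
    # destroys every set of weight at most W.
    return {x for x in universe
            if not oracle({**weights, x: weights[x] + 1}, W)}

# Toy oracle over an explicit family, just to exercise the sketch.
family = [{1, 2}, {2, 3}, {1, 3, 4}]
oracle = lambda w, k: any(sum(w[x] for x in s) <= k for s in family)
print(recover_set({1, 2, 3, 4}, oracle, N=16))

If isolation fails (two minimum-weight sets exist), the output may be wrong; the answer can be checked against the original search problem and the procedure repeated with fresh weights.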


Notes and References

  1. Noam Ta-Shma (2015). A simple proof of the Isolation Lemma. ECCC (Electronic Colloquium on Computational Complexity).