Polynomial creativity

In computational complexity theory, polynomial creativity is a theory analogous to the theory of creative sets in recursion theory and mathematical logic. The k-creative sets are a family of formal languages in the complexity class NP whose complements certifiably do not have nondeterministic recognition algorithms obeying a fixed polynomial bound on their running time. It is generally believed that NP is unequal to co-NP (the class of complements of languages in NP), which would imply more strongly that the complements of all NP-complete languages do not have polynomial-time nondeterministic recognition algorithms. However, for the k-creative sets, the lack of a (more restricted) recognition algorithm can be proven, whereas a proof that NP ≠ co-NP remains elusive.

The k-creative sets are conjectured to form counterexamples to the Berman–Hartmanis conjecture on isomorphism of NP-complete sets. It is NP-complete to test whether an input string belongs to any one of these languages, but no polynomial-time isomorphisms between all such languages and other NP-complete languages are known. Polynomial creativity and the k-creative sets were introduced in 1985 by Deborah Joseph and Paul Young, following earlier attempts to define polynomial analogues for creative sets by Ko and Moore.

Definition

Intuitively, a set is creative when there is a polynomial-time algorithm that creates a counterexample for any candidate fast nondeterministic recognition algorithm for its complement.

The classes of fast nondeterministic recognition algorithms are formalized by Joseph and Young as the sets NP(k) of nondeterministic Turing machine programs p that, for the inputs x that they accept, have an accepting path with a number of steps that is at most |p|(|x|^k + 1). This notation should be distinguished from that for the complexity class NP. The complexity class NP is a set of formal languages, while NP(k) is instead a set of programs that accept some of these languages. Every language in NP is recognized by a program in one of the sets NP(k), with a parameter k that is (up to the factor |p| in the bound on the number of steps) the exponent in the polynomial running time of the program.
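In symbols, and only as a restatement of the definition just given, the program classes can be written in set-builder form; the notation time_p(x), for the number of steps on a shortest accepting path of p on input x, is introduced solely for this restatement:

\[
\mathrm{NP}(k) \;=\; \bigl\{\, p \;:\; \text{for every input } x \text{ accepted by } p,\ \ \mathrm{time}_p(x) \le |p|\bigl(|x|^k + 1\bigr) \,\bigr\}.
\]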

According to Joseph and Young's theory, a language L in NP is k-creative if it is possible to find a witness showing that the complement of L is not recognized by any program in NP(k). More formally, there should exist a polynomially computable function f that maps programs in this class to inputs on which they fail. When given a nondeterministic program p in NP(k), the function f should produce an input string x = f(p) that either belongs to L and causes the program to accept x, or does not belong to L and causes the program to reject x. The function f is called a productive function for L. If this productive function exists, the given program does not produce the behavior on input x that would be expected of a program for recognizing the complement of L.
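Equivalently, and again just restating the paragraph above, f is a productive function for L exactly when, for every program p in NP(k),

\[
f(p) \in L \;\Longleftrightarrow\; p \text{ accepts } f(p),
\]

so that p gives the wrong answer on the specific input f(p) if it is interpreted as a recognizer for the complement of L.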

Existence

Joseph and Young construct creative languages by reversing the definitions of these languages: rather than starting with a language and trying to find a productive function for it, they start with a function and construct a language for which it is the productive function. They define a polynomial-time function f to be polynomially honest if its running time is at most a polynomial function of its output length. This disallows, for instance, functions that take polynomial time but produce outputs of less than polynomial length. As they show, every one-to-one polynomially honest function f is the productive function for a k-creative language.
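Polynomial honesty can be stated as a single inequality; the notation time_f(x), for the running time of f on input x, and the polynomial q appear only in this restatement:

\[
\exists\, q\ \ \forall\, x: \quad \mathrm{time}_f(x) \;\le\; q\bigl(|f(x)|\bigr).
\]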

Joseph and Young define K_f^k to be the set of values f(p) for nondeterministic programs p that have an accepting path for f(p) using at most |p|(|f(p)|^k + 1) steps. This number of steps (on that input) would be consistent with p belonging to NP(k). Then K_f^k belongs to NP: given an input f(p), one can nondeterministically guess both p and its accepting path, and then verify that the input equals f(p) and that the path is valid.
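In set-builder form, this definition reads:

\[
K_f^k \;=\; \bigl\{\, f(p) \;:\; p \text{ has an accepting path for } f(p) \text{ of at most } |p|\bigl(|f(p)|^k + 1\bigr) \text{ steps} \,\bigr\}.
\]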

The language K_f^k is k-creative with f as its productive function, because every program p in NP(k) is mapped by f to a value f(p) that is either accepted by p (and therefore also belongs to K_f^k) or rejected by p (and therefore also does not belong to K_f^k).
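In either case the defining property of a productive function holds; the assumption that f is one-to-one matters here, since it guarantees that f(p) can only be placed into K_f^k by the program p itself:

\[
f(p) \in K_f^k \;\Longleftrightarrow\; p \text{ accepts } f(p) \qquad \text{for every } p \in \mathrm{NP}(k).
\]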

Completeness

Every k-creative set with a polynomially honest productive function is NP-complete. For any other language X in NP, by the definition of NP, one can translate any input x for X into a nondeterministic program p_x that ignores its own input and instead searches for a witness that x belongs to X, accepting its input if it finds one and rejecting otherwise. The length of p_x is polynomial in the size of x, and a padding argument can be used to make p_x long enough (but still polynomial) for its running time to qualify for membership in NP(k).

Let f be the productive function used to define a given k-creative language L, and let g be the translation from x to p_x. Then the composition of g with f maps inputs of X into counterexamples for the algorithms that test those inputs. This composition maps inputs that belong to X into strings that belong to L, and inputs that do not belong to X into strings that do not belong to L. Thus, it is a polynomial-time many-one reduction from X to L. Since L is (by definition) in NP, and every other language in NP has a reduction to it, it must be NP-complete.
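Written out, the reduction is the map x ↦ f(g(x)) = f(p_x), and its correctness can be summarized by a chain of equivalences: the first holds because p_x ignores its own input and accepts either every string or none, and the second because f is a productive function for L and the padded p_x belongs to NP(k):

\[
x \in X \;\Longleftrightarrow\; p_x \text{ accepts } f(p_x) \;\Longleftrightarrow\; f(p_x) \in L.
\]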

It is also possible to prove more strongly that there exists an invertible parsimonious reduction to the k-creative language.

Application to the Berman–Hartmanis conjecture

The Berman–Hartmanis conjecture states that there exists a polynomial-time isomorphism between any two NP-complete sets: a function that maps yes-instances of one such set one-to-one into yes-instances of the other, takes polynomial time, and whose inverse function can also be computed in polynomial time. It was formulated by Leonard C. Berman and Juris Hartmanis in 1977, based on the observation that all NP-complete sets known at that time were isomorphic.

An equivalent formulation of the conjecture is that every NP-complete set is paddable. This means that there exists a polynomial-time and polynomial-time-invertible one-to-one transformation h(x,y) from yes-instances x to larger yes-instances that encode the "irrelevant" additional information y.
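One common way to spell this out (a restatement of the paddability requirement, with x and y ranging over all strings) is to ask for a one-to-one function h, computable and invertible in polynomial time, with

\[
h(x,y) \in L \;\Longleftrightarrow\; x \in L \qquad \text{and} \qquad |h(x,y)| > |x|,
\]

so that y is encoded recoverably into a longer instance without changing membership in L.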

However, it is unknown how to find such a padding transformation for a language whose productive function is not polynomial-time-invertible. Therefore, if one-way permutations exist, the k-creative languages having these permutations as their productive functions provide candidate counterexamples to the Berman–Hartmanis conjecture.

The (unproven) Joseph–Young conjecture formalizes this reasoning. The conjecture states that there exists a one-way length-increasing function f such that K_f^k is not paddable. Alan Selman observed that this would imply a simpler conjecture, the encrypted complete set conjecture: there exists a one-way function f such that SAT (the set of yes-instances for the satisfiability problem) and f(SAT) are not polynomial-time isomorphic. There exists an oracle relative to which one-way functions exist, both of these conjectures are false, and the Berman–Hartmanis conjecture is true.
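Schematically, and using ≅_p only as shorthand for the existence of a polynomial-time isomorphism, the two conjectures assert the existence of suitable one-way functions f with

\[
\text{Joseph–Young:}\ \ K_f^k \text{ is not paddable}, \qquad\qquad \text{encrypted complete set:}\ \ \mathrm{SAT} \not\cong_p f(\mathrm{SAT}),
\]

where the Joseph–Young conjecture additionally requires f to be length-increasing.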