# Sharp P and the issue of 'natural problems'

blog.computationalcomplexity.org

#P was defined by Valiant as a way to pin down that the PERMANENT of a matrix is hard to compute.

The definition I give is equivalent to the one Valiant gave.

A function g is in #P if there exist a polynomial p and a set B in P such that

g(x) = | { y : |y| = p(|x|) and (x,y) in B } |
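The definition can be made concrete with a (necessarily exponential-time) brute-force counter. Here is a minimal sketch: the predicate B, the polynomial p, and the toy instance are all hypothetical, chosen only to illustrate the shape of a #P function.

```python
from itertools import product

def count_witnesses(x, B, p):
    """The #P function defined by predicate B and polynomial p:
    count the strings y with |y| = p(|x|) and (x, y) in B."""
    n = p(len(x))
    return sum(1 for bits in product("01", repeat=n)
               if B(x, "".join(bits)))

# Toy predicate (hypothetical): y, read as a binary number, is a
# nonzero divisor of |x|; p is the identity polynomial.
B = lambda x, y: int(y, 2) != 0 and len(x) % int(y, 2) == 0
print(count_witnesses("aaaaaa", B, lambda n: n))  # divisors of 6: 1,2,3,6 -> 4
```

The counter runs in time exponential in p(|x|), which is exactly why membership in #P says nothing about the function being easy to compute.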

A function f is #P-complete if f is in #P and, for all g in #P, g is poly-Turing reducible to f.

#SAT is the function that, given a Boolean formula, returns its number of satisfying assignments. It is #P-complete by inspecting the proof of the Cook-Levin Theorem; moreover, the reduction of any g in #P to #SAT makes only one query to #SAT. A common way to show that #A is #P-complete is to show SAT le A with a reduction that preserves the number of solutions (a parsimonious reduction).
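For small formulas, #SAT can be computed by exhaustive enumeration. A minimal sketch, using DIMACS-style literals (positive/negative nonzero integers) as an assumed input encoding:

```python
from itertools import product

def count_sat(clauses, n):
    """Brute-force #SAT for a CNF formula over variables 1..n.
    clauses: list of clauses, each a list of literals (nonzero ints;
    -i means NOT x_i). Returns the number of satisfying assignments."""
    count = 0
    for assign in product([False, True], repeat=n):
        if all(any(assign[abs(l) - 1] == (l > 0) for l in c)
               for c in clauses):
            count += 1
    return count

# (x1 OR x2) AND (NOT x1 OR x2): satisfied by (F,T) and (T,T)
print(count_sat([[1, 2], [-1, 2]], 2))  # 2
```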

Valiant proved that PERM was #P-complete (his reduction only used 1 call to PERM).
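The permanent itself is easy to state even though computing it is #P-complete. A sketch straight from the definition (sum over permutations), exponential time by design:

```python
from itertools import permutations

def permanent(M):
    """Permanent of a square matrix, directly from the definition:
    sum over all permutations sigma of prod_i M[i][sigma(i)].
    (Like the determinant, but with no signs -- and #P-complete.)"""
    n = len(M)
    total = 0
    for sigma in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= M[i][sigma[i]]
        total += prod
    return total

# For a 0/1 matrix, PERM counts the perfect matchings of the
# bipartite graph with that biadjacency matrix.
print(permanent([[1, 1], [1, 1]]))  # 2: both matchings of K_{2,2}
```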

There are problems in P whose #-version is #P-complete: Matching and DNF-SAT are two of them.
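DNF-SAT illustrates the gap nicely: deciding satisfiability of a DNF formula is trivial (check whether any single term is consistent), yet counting its satisfying assignments is #P-complete. A brute-force counting sketch, using the same literal encoding as above:

```python
from itertools import product

def count_dnf(terms, n):
    """Brute-force #DNF-SAT over variables 1..n.
    terms: list of terms, each a list of literals (nonzero ints);
    an assignment satisfies the formula if it satisfies SOME term."""
    return sum(
        1 for a in product([False, True], repeat=n)
        if any(all(a[abs(l) - 1] == (l > 0) for l in t) for t in terms)
    )

# x1 OR (NOT x1 AND x2): satisfied by every assignment except (F,F)
print(count_dnf([[1], [-1, 2]], 2))  # 3
```

Deciding whether the count is nonzero takes linear time; computing the count exactly is as hard as #SAT (complement the CNF and use inclusion-exclusion, or note 2^n minus the count of the negated formula).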

Notice that I defined #SAT directly, not in terms of a poly p and a set B as above. Here is why: if you use poly p and set B one can do obnoxious things like:

SAT = { phi : there exists a string yz, 2n bits long with |y| = |z| = n, such that phi(y)=T and z (read as a number) is prime }

The # version of this definition is not really what I want (though I am sure it's #P-complete).
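A quick sanity check on why the padded definition is the wrong counting function: each satisfying y gets paired with every prime z, so the witness count is the true #SAT value inflated by the number of n-bit primes. A sketch (assuming z ranges over the n-bit strings read as numbers below 2^n):

```python
from itertools import product

def is_prime(k):
    """Trial-division primality test (fine for tiny n)."""
    if k < 2:
        return False
    return all(k % d for d in range(2, int(k ** 0.5) + 1))

def count_padded(clauses, n):
    """Count witnesses yz (each n bits) with y satisfying the CNF
    and z prime -- the # version of the 'obnoxious' SAT definition."""
    sat = sum(
        1 for a in product([False, True], repeat=n)
        if all(any(a[abs(l) - 1] == (l > 0) for l in c) for c in clauses)
    )
    primes = sum(1 for z in range(2 ** n) if is_prime(z))
    return sat * primes  # inflated by the number of n-bit primes

# 2 satisfying assignments, 2 primes below 4 (namely 2 and 3)
print(count_padded([[1, 2], [-1, 2]], 2))  # 4, not the #SAT value 2
```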

Valiant (see here and here) and Simon (see here) showed that for many known NPC-problems A, #A is #P-complete. They meant NATURAL problems. Is it true for all natural NP-complete problems?

Unfortunately the statement 'All NATURAL NPC problems give rise to #P-complete functions' is hard (impossible?) to state rigorously and hence hard (impossible?) to prove.

1) Is there a natural A in NP such that #A is NOT #P-complete (under assumptions)?

2) Are there any theorems showing that a large set of NPC problems have #P-complete counting versions? Or are we doomed, whenever we want to show some #A is #P-complete, to come up with a new proof?

3) Can one PROVE there are NPC problems A such that #A is NOT #P-complete (under assumptions)?