In theoretical computer science, multiparty communication complexity is the study of communication complexity in the setting where there are more than 2 players.
In the traditional two–party communication game, introduced in,[1] two players, P1 and P2, attempt to compute a Boolean function

f(x1, x2): {0,1}^n → {0,1},  where x1, x2 ∈ {0,1}^{n′} and 2n′ = n.
Player P1 knows the value of x2, P2 knows the value of x1, but Pi does not know the value of xi, for i = 1, 2.
In other words, the players know the other's variables, but not their own. The minimum number of bits that must be communicated by the players to compute f is the communication complexity of f, denoted by κ(f).
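To make the model concrete, the trivial protocol for this game can be simulated in a few lines (an illustrative sketch; the example function and all names are hypothetical, not from the text): P2, who knows x1, publishes it, after which P1 knows both halves of the input and announces the result, for n′ + 1 bits in total.

```python
# Sketch of the trivial protocol in the two-party game: P2, who knows x1,
# writes it on the shared blackboard; P1, who knows x2, can then evaluate f
# and announce the result. Total communication: n' + 1 bits.
# All names here are illustrative, not from the text.

def inner_product_mod2(x1, x2):
    """An example target function f: inner product mod 2."""
    return sum(a & b for a, b in zip(x1, x2)) % 2

def trivial_two_party_protocol(f, x1, x2):
    """Returns (value of f, number of bits written on the blackboard)."""
    blackboard = list(x1)            # P2 publishes x1 (n' bits)
    value = f(blackboard, x2)        # P1 now knows both halves of the input
    blackboard.append(value)         # P1 announces the result (1 more bit)
    return value, len(blackboard)

print(trivial_two_party_protocol(inner_product_mod2, [1, 0, 1], [1, 1, 1]))
# → (0, 4): the inner product is even, and n' + 1 = 4 bits were used
```

This trivial protocol gives only an upper bound, of course; the communication complexity κ(f) of a particular f may be much smaller.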
The multiparty communication game, defined in 1983,[2] is a powerful generalization of the 2–party case: Here the players know all the others' input, except their own. Because of this property, sometimes this model is called "numbers on the forehead" model, since if the players were seated around a round table, each wearing their own input on the forehead, then every player would see all the others' input, except their own.
The formal definition is as follows: k players, P1, P2, ..., Pk, intend to compute a Boolean function

f(x1, x2, ..., xn): {0,1}^n → {0,1}.

On the set of variables S = {x1, x2, ..., xn} there is a fixed partition A into k classes A1, A2, ..., Ak, and player Pi knows every variable except those in Ai, for i = 1, 2, ..., k. The players have unlimited computational power, and they communicate by writing bits on a blackboard seen by all players. The aim is to compute f(x1, x2, ..., xn) so that at the end of the computation every player knows this value. The cost of the computation is the number of bits written on the blackboard for the given input x = (x1, x2, ..., xn) and partition A = (A1, A2, ..., Ak); the cost of a protocol is the maximum number of bits communicated over all inputs x ∈ {0,1}^n for the given partition A. The k-party communication complexity of f with respect to partition A, denoted C^(k)_A(f), is the minimum cost over all k-party protocols that compute f. The k-party symmetric communication complexity of f is then defined as

C^(k)(f) = max_A C^(k)_A(f),

where the maximum is taken over all k-partitions of the set S = {x1, x2, ..., xn}.
For a general upper bound, for both two and more players, suppose that A1 is one of the smallest classes of the partition A1, A2, ..., Ak, so that |A1| ≤ ⌊n/k⌋. Then the players can compute any Boolean function of S with |A1| + 1 bits of communication: P2 writes the |A1| bits of A1 on the blackboard; P1 reads them and, now knowing the whole input, computes and announces the value f(x). So we can write:

C^(k)(f) ≤ ⌊n/k⌋ + 1.
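This trivial protocol can be sketched as follows (illustrative code; the parity target function and all names are assumptions made for the example):

```python
# Illustrative sketch of the |A1| + 1 protocol described above (all names
# are hypothetical). A1 is a smallest class of the partition, so it has at
# most floor(n/k) variables; P2 sees them and writes them on the blackboard,
# after which P1 knows the entire input and announces f(x).

def parity(x):
    """An example target function f: parity of all n bits."""
    return sum(x) % 2

def trivial_nof_protocol(f, x, partition):
    """x: list of n bits; partition: k disjoint lists of variable indices."""
    a1 = min(partition, key=len)          # a smallest class of the partition
    blackboard = [x[i] for i in a1]       # P2 publishes the bits of A1
    value = f(x)                          # P1 now knows the whole input
    return value, len(blackboard) + 1     # |A1| bits plus 1 announcement bit

x = [1, 0, 1, 1, 0, 1]                    # n = 6
partition = [[0, 1], [2, 3], [4, 5]]      # k = 3: cost <= floor(6/3) + 1 = 3
print(trivial_nof_protocol(parity, x, partition))  # → (0, 3)
```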
The Generalized Inner Product function (GIP)[3] is defined as follows: let y1, y2, ..., yk be n-bit vectors, and let Y be the n × k matrix whose k columns are the vectors y1, y2, ..., yk. Then GIP(y1, y2, ..., yk) is the number of all-1 rows of the matrix Y, taken modulo 2. In other words, if the vectors y1, y2, ..., yk correspond to the characteristic vectors of k subsets of an n-element base set, then GIP corresponds to the parity of the size of the intersection of these k subsets.
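The definition translates directly into code (an illustrative sketch; names are hypothetical):

```python
# Direct transcription of the GIP definition above (illustrative code):
# count the all-1 rows of the n x k matrix whose columns are y1, ..., yk,
# and reduce modulo 2.

def gip(*vectors):
    """Generalized Inner Product of k n-bit vectors."""
    rows = zip(*vectors)                   # row i is (y1[i], ..., yk[i])
    return sum(all(row) for row in rows) % 2

# k = 3 characteristic vectors of subsets of an n = 4 element base set:
y1 = [1, 1, 0, 1]
y2 = [1, 0, 1, 1]
y3 = [1, 1, 1, 1]
print(gip(y1, y2, y3))  # → 0: rows 0 and 3 are all-1, and 2 mod 2 = 0
```

For k = 2, gip(y1, y2) is exactly the inner product of y1 and y2 modulo 2.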
It was shown that
C^(k)(GIP) ≥ c·n/4^k,
with a constant c > 0.
An upper bound on the multiparty communication complexity of GIP shows[4] that
C^(k)(GIP) ≤ c·n/2^k,
with a constant c > 0.
For a general Boolean function f, one can bound the multiparty communication complexity of f by using its L1 norm[5] as follows:[6]
C^(k)(f) = O(k² · log(n·L1(f)) · ⌈n·L1(f)²/2^k⌉).
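Assuming L1(f) denotes the sum of the absolute values of the Fourier coefficients of f (this reading of the norm, and every name below, is an assumption, not stated in the text), a brute-force computation of the norm might look like:

```python
# Brute-force sketch of L1(f), under the assumption that L1(f) means the sum
# of absolute values of the Fourier coefficients of f over {0,1}^n (this
# reading, and all names below, are assumptions, not from the text).
from itertools import product

def fourier_l1(f, n):
    """Sum of |f_hat(S)| over all subsets S of the n variables."""
    points = list(product([0, 1], repeat=n))
    total = 0.0
    for s in points:                       # s is the indicator vector of S
        # f_hat(S) = 2^-n * sum_x f(x) * (-1)^(sum of x_i for i in S)
        coeff = sum(f(x) * (-1) ** sum(xi for xi, si in zip(x, s) if si)
                    for x in points) / 2 ** n
        total += abs(coeff)
    return total

# The 0/1-valued parity function has only two nonzero Fourier coefficients,
# each of absolute value 1/2:
print(fourier_l1(lambda x: sum(x) % 2, 3))  # → 1.0
```

This exhaustive computation takes time exponential in n, so it only serves to illustrate the quantity appearing in the bound.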
A construction of a pseudorandom number generator was based on the BNS lower bound for the GIP function.