The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states \rho and \sigma, represented as density operators that are subparts of a quantum system, the joint quantum entropy is a measure of the total uncertainty or entropy of the joint system. It is written S(\rho,\sigma) or H(\rho,\sigma), depending on the notation being used for the von Neumann entropy. Like other entropies, the joint quantum entropy is measured in bits, i.e. the logarithm is taken in base 2. In this article, we will use S(\rho,\sigma) for the joint quantum entropy.
In information theory, for any classical random variable X, the classical Shannon entropy H(X) is a measure of how uncertain we are about the outcome of X. For example, if X is a probability distribution concentrated at one point, the outcome of X is certain and therefore its entropy H(X) = 0. At the other extreme, if X is the uniform probability distribution with n possible values, intuitively one would expect X to be associated with the most uncertainty. Indeed, such uniform probability distributions have maximum possible entropy H(X) = \log_2(n).
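These two extremes can be checked numerically. The following is a minimal sketch assuming NumPy; the helper name shannon_entropy is illustrative, not a standard routine:

    import numpy as np

    def shannon_entropy(p):
        """Shannon entropy in bits of a probability vector p, with 0 log 0 taken as 0."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                                  # drop zero-probability outcomes
        return float(-np.sum(p * np.log2(p)))

    print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # point mass: 0.0
    print(shannon_entropy([0.25] * 4))                # uniform over n = 4 values: log2(4) = 2.0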
In quantum information theory, the notion of entropy is extended from probability distributions to quantum states, or density matrices. For a state \rho, the von Neumann entropy is defined by

-\operatorname{Tr}(\rho \log \rho).
Applying the spectral theorem, or Borel functional calculus for infinite dimensional systems, we see that it generalizes the classical entropy. The physical meaning remains the same. A maximally mixed state, the quantum analog of the uniform probability distribution, has maximum von Neumann entropy. On the other hand, a pure state, or a rank one projection, will have zero von Neumann entropy. We write the von Neumann entropy
S(\rho) (or sometimes H(\rho)).
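As a minimal numerical sketch of this definition (again assuming NumPy; von_neumann_entropy is an illustrative helper, not a library routine), the entropy can be computed from the eigenvalues of the density matrix, giving 0 for a rank-one projection and the maximum value for the maximally mixed state:

    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr(rho log rho), in bits, computed from the eigenvalues of rho."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]                  # 0 log 0 = 0 by convention
        return float(-np.sum(evals * np.log2(evals)))

    pure = np.array([[1.0, 0.0], [0.0, 0.0]])         # rank-one projection |0><0|
    mixed = np.eye(2) / 2                             # maximally mixed qubit state
    print(von_neumann_entropy(pure))                  # ~0.0
    print(von_neumann_entropy(mixed))                 # ~1.0 bit, the maximum for a qubit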
Given a quantum system with two subsystems A and B, the term joint quantum entropy simply refers to the von Neumann entropy of the combined system. This is to distinguish from the entropy of the subsystems. In symbols, if the combined system is in state \rho^{AB}, the joint quantum entropy is then

S(\rho^{A},\rho^{B}) = S(\rho^{AB}) = -\operatorname{Tr}(\rho^{AB} \log(\rho^{AB})).
Each subsystem has its own entropy. The states of the subsystems are given by the partial trace operation.
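As a sketch of how the subsystem states are obtained (assuming NumPy; partial_trace_B is an illustrative helper), the partial trace over B sums out B's indices of a two-qubit density matrix:

    import numpy as np

    def partial_trace_B(rho_AB, dA=2, dB=2):
        """Reduced state rho^A = Tr_B(rho^AB) of a (dA*dB) x (dA*dB) density matrix."""
        rho = rho_AB.reshape(dA, dB, dA, dB)          # indices (i, j, k, l): i,k label A and j,l label B
        return np.einsum('ijkj->ik', rho)             # sum over the B index

    # Product state |0><0| (tensor) I/2: subsystem A is pure, subsystem B is maximally mixed.
    rho_AB = np.kron(np.array([[1.0, 0.0], [0.0, 0.0]]), np.eye(2) / 2)
    print(partial_trace_B(rho_AB))                    # [[1, 0], [0, 0]], i.e. |0><0|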
The classical joint entropy is always at least equal to the entropy of each individual system. This is not the case for the joint quantum entropy. If the quantum state \rho^{AB} exhibits quantum entanglement, then the entropy of each subsystem may be larger than the joint entropy. This is equivalent to the fact that the conditional quantum entropy may be negative, while the classical conditional entropy may never be.
Consider a maximally entangled state such as a Bell state. If \rho^{AB} is a Bell state, say,

\left|\Psi\right\rangle = \frac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right),

then the total system is a pure state, with entropy 0, while each individual subsystem is a maximally mixed state, with maximum von Neumann entropy \log 2 = 1. Thus the joint entropy of the combined system is less than that of its subsystems, because for entangled states no definite state can be assigned to a subsystem.
Notice that the above phenomenon cannot occur if a state is a separable pure state. In that case, the reduced states of the subsystems are also pure. Therefore, all entropies are zero.
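The Bell-state example can be verified numerically. Here is a minimal self-contained sketch (assuming NumPy; the helper S and the einsum-based partial traces are illustrative):

    import numpy as np

    def S(rho):
        """von Neumann entropy in bits."""
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log2(p)))

    # Bell state |Psi> = (|00> + |11>)/sqrt(2) as a pure joint density matrix.
    psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    rho_AB = np.outer(psi, psi)

    rho = rho_AB.reshape(2, 2, 2, 2)
    rho_A = np.einsum('ijkj->ik', rho)                # trace out B
    rho_B = np.einsum('ijil->jl', rho)                # trace out A

    print(S(rho_AB))                                  # ~0.0: the joint state is pure
    print(S(rho_A), S(rho_B))                         # ~1.0 each: maximally mixed subsystems

Replacing psi with a separable pure state such as |00\rangle gives zero for all three entropies, matching the observation above.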
The joint quantum entropy S(\rho^{AB}) can be used to define the conditional quantum entropy:

S(\rho^{A}|\rho^{B}) \ \stackrel{\mathrm{def}}{=}\ S(\rho^{A},\rho^{B}) - S(\rho^{B})

and the quantum mutual information:

I(\rho^{A}:\rho^{B}) \ \stackrel{\mathrm{def}}{=}\ S(\rho^{A}) + S(\rho^{B}) - S(\rho^{A},\rho^{B}).
These definitions parallel the use of the classical joint entropy to define the conditional entropy and mutual information.
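For the Bell state above, these definitions can be evaluated with the same kind of sketch (assuming NumPy; helper names are illustrative, and the reduced states of a Bell state are used directly as I/2). The result is a negative conditional quantum entropy, which is impossible classically, and a mutual information of 2 bits:

    import numpy as np

    def S(rho):
        """von Neumann entropy in bits."""
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log2(p)))

    psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2) # Bell state (|00> + |11>)/sqrt(2)
    rho_AB = np.outer(psi, psi)                       # joint state (pure)
    rho_A = rho_B = np.eye(2) / 2                     # its reduced states (maximally mixed)

    cond = S(rho_AB) - S(rho_B)                       # S(rho^A | rho^B)
    mutual = S(rho_A) + S(rho_B) - S(rho_AB)          # I(rho^A : rho^B)
    print(cond, mutual)                               # ~ -1.0 and ~ 2.0 bits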