Min-entropy explained

The min-entropy, in information theory, is the smallest of the Rényi family of entropies, corresponding to the most conservative way of measuring the unpredictability of a set of outcomes, as the negative logarithm of the probability of the most likely outcome. The various Rényi entropies are all equal for a uniform distribution, but measure the unpredictability of a nonuniform distribution in different ways. The min-entropy is never greater than the ordinary or Shannon entropy (which measures the average unpredictability of the outcomes) and that in turn is never greater than the Hartley or max-entropy, defined as the logarithm of the number of outcomes with nonzero probability.

As with the classical Shannon entropy and its quantum generalization, the von Neumann entropy, one can define a conditional version of min-entropy. The conditional quantum min-entropy is a one-shot, or conservative, analog of conditional quantum entropy.

To interpret a conditional information measure, suppose Alice and Bob share a bipartite quantum state \rho_{AB}. Alice has access to system A and Bob to system B. The conditional entropy measures the average uncertainty Bob has about Alice's state upon sampling from his own system. The min-entropy can be interpreted as the distance of a state from a maximally entangled state.

This concept is useful in quantum cryptography, in the context of privacy amplification (see, for example, [1]).

Definition for classical distributions

If P = (p_1, \ldots, p_n) is a classical finite probability distribution, its min-entropy can be defined as[2]

H_{\min}(\boldsymbol{P}) = \log \frac{1}{P_{\max}}, \qquad P_{\max} \equiv \max_i p_i.

One way to justify the name of the quantity is to compare it with the more standard definition of entropy, which reads H(\boldsymbol{P}) = \sum_i p_i \log(1/p_i), and can thus be written concisely as the expectation value of \log(1/p_i) over the distribution. If instead of taking the expectation value of this quantity we take its minimum value, we get precisely the above definition of H_{\min}(\boldsymbol{P}).
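As a concrete illustration, the following sketch (plain NumPy; the distribution `p` is a made-up example, and base-2 logarithms are used) computes the min-entropy alongside the Shannon and Hartley entropies that bracket it:

```python
import numpy as np

def min_entropy(p):
    """Min-entropy H_min(P) = -log2(max_i p_i)."""
    p = np.asarray(p, dtype=float)
    return float(-np.log2(p.max()))

def shannon_entropy(p):
    """Shannon entropy H(P) = sum_i p_i log2(1/p_i)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log(1/0) = 0
    return float(-(p * np.log2(p)).sum())

def hartley_entropy(p):
    """Hartley (max-)entropy: log2 of the number of outcomes with nonzero probability."""
    p = np.asarray(p, dtype=float)
    return float(np.log2(np.count_nonzero(p)))

p = [0.5, 0.25, 0.125, 0.125]  # example distribution
# The ordering stated above: H_min <= H <= H_max.
assert min_entropy(p) <= shannon_entropy(p) <= hartley_entropy(p)
# For a uniform distribution all three coincide.
u = [0.25] * 4
assert min_entropy(u) == shannon_entropy(u) == hartley_entropy(u)
```

For the example distribution the three values are 1, 1.75, and 2 bits respectively, illustrating how the min-entropy is the most conservative of the three.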

Definition for quantum states

A natural way to define a "min-entropy" for quantum states is to leverage the simple observation that quantum states result in probability distributions when measured in some basis. There is however the added difficulty that a single quantum state can result in infinitely many possible probability distributions, depending on how it is measured. A natural path is then, given a quantum state \rho, to still define H_{\min}(\rho) as \log(1/P_{\max}), but this time defining P_{\max} as the maximum possible probability that can be obtained measuring \rho, maximizing over all possible projective measurements.

Formally, this would provide the definition

H_{\min}(\rho) = \log \frac{1}{\max_\Pi \max_i \operatorname{tr}(\Pi_i \rho)} = -\max_\Pi \log \max_i \operatorname{tr}(\Pi_i \rho),

where we are maximizing over the set of all projective measurements \Pi = (\Pi_i)_i, the \Pi_i represent the measurement outcomes in the POVM formalism, and \operatorname{tr}(\Pi_i \rho) is therefore the probability of observing the i-th outcome when the measurement is \Pi.

A more concise way to write the double maximization is to observe that any element of any POVM is a Hermitian operator such that 0 \le \Pi \le I, and thus we can equivalently directly maximize over these to get

H_{\min}(\rho) = -\max_{0 \le \Pi \le I} \log \operatorname{tr}(\Pi \rho).

In fact, this maximization can be performed explicitly and the maximum is obtained when \Pi is the projection onto (any of) the largest eigenvalue(s) of \rho. We thus get yet another expression for the min-entropy as

H_{\min}(\rho) = -\log \|\rho\|_{\rm op},

remembering that the operator norm of a Hermitian positive semidefinite operator equals its largest eigenvalue.
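This last expression makes the quantum min-entropy straightforward to compute: diagonalize \rho and take the negative logarithm of its largest eigenvalue. A minimal NumPy sketch (base-2 logarithms assumed; the example states are made up):

```python
import numpy as np

def quantum_min_entropy(rho):
    """H_min(rho) = -log2 ||rho||_op, i.e. -log2 of the largest eigenvalue."""
    eigvals = np.linalg.eigvalsh(rho)  # rho is Hermitian
    return float(-np.log2(eigvals.max()))

# Maximally mixed qubit: every measurement outcome is as unpredictable
# as a fair coin flip, so H_min = 1 bit.
rho_mixed = np.eye(2) / 2
assert np.isclose(quantum_min_entropy(rho_mixed), 1.0)

# A pure state is perfectly predictable when measured in its own
# eigenbasis, so H_min = 0.
rho_pure = np.diag([1.0, 0.0])
assert np.isclose(quantum_min_entropy(rho_pure), 0.0)
```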

Conditional entropies

Let \rho_{AB} be a bipartite density operator on the space \mathcal{H}_A \otimes \mathcal{H}_B. The min-entropy of A conditioned on B is defined to be

H_{\min}(A|B)_\rho \equiv -\inf_{\sigma_B} D_{\max}(\rho_{AB} \| I_A \otimes \sigma_B),

where the infimum ranges over all density operators \sigma_B on the space \mathcal{H}_B. The measure D_{\max} is the maximum relative entropy, defined as

D_{\max}(\rho \| \sigma) = \inf_\lambda \{\lambda : \rho \leq 2^\lambda \sigma\}.
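When \sigma has full rank, the condition \rho \leq 2^\lambda \sigma is equivalent to \sigma^{-1/2} \rho \sigma^{-1/2} \leq 2^\lambda I, so D_{\max}(\rho\|\sigma) equals the base-2 logarithm of the largest eigenvalue of \sigma^{-1/2} \rho \sigma^{-1/2}. A NumPy sketch under that full-rank assumption (the example states are made up):

```python
import numpy as np

def d_max(rho, sigma):
    """Max-relative entropy D_max(rho || sigma), computed as the log2 of the
    largest eigenvalue of sigma^{-1/2} rho sigma^{-1/2}.
    Assumes sigma has full rank."""
    vals, vecs = np.linalg.eigh(sigma)
    sigma_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.conj().T
    m = sigma_inv_sqrt @ rho @ sigma_inv_sqrt
    return float(np.log2(np.linalg.eigvalsh(m).max()))

rho = np.diag([0.75, 0.25])   # diagonal example states for clarity
sigma = np.eye(2) / 2

# Any state is dominated by itself with lambda = 0.
assert np.isclose(d_max(rho, rho), 0.0)
# For commuting states this reduces to log2 of the largest ratio p_i/q_i.
assert np.isclose(d_max(rho, sigma), np.log2(0.75 / 0.5))
```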

The smooth min-entropy is defined in terms of the min-entropy:

H_{\min}^\epsilon(A|B)_\rho = \sup_{\rho'} H_{\min}(A|B)_{\rho'},

where the supremum ranges over density operators \rho'_{AB} which are \epsilon-close to \rho_{AB}. This notion of \epsilon-closeness is defined in terms of the purified distance

P(\rho, \sigma) = \sqrt{1 - F(\rho, \sigma)^2},

where F(\rho, \sigma) is the fidelity measure.

These quantities can be seen as generalizations of the von Neumann entropy. Indeed, the von Neumann entropy can be expressed as

S(A|B)_\rho = \lim_{\epsilon \to 0} \lim_{n \to \infty} \frac{1}{n} H_{\min}^\epsilon(A^n|B^n)_{\rho^{\otimes n}}~.

This is called the fully quantum asymptotic equipartition theorem.[3] The smoothed entropies share many interesting properties with the von Neumann entropy. For example, the smooth min-entropy satisfies a data-processing inequality:[4]

H_{\min}^\epsilon(A|B)_\rho \geq H_{\min}^\epsilon(A|BC)_\rho~.

Operational interpretation of smoothed min-entropy

Henceforth, we shall drop the subscript \rho from the min-entropy when the state on which it is evaluated is clear from context.

Min-entropy as uncertainty about classical information

Suppose an agent had access to a quantum system B whose state \rho_B^x depends on some classical variable X. Furthermore, suppose that each of its elements x is distributed according to some distribution P_X(x). This can be described by the following state over the system XB:

\rho_{XB} = \sum_x P_X(x) |x\rangle\langle x| \otimes \rho_B^x,

where \{|x\rangle\} form an orthonormal basis. We would like to know what the agent can learn about the classical variable x. Let p_g(X|B) be the probability that the agent guesses X when using an optimal measurement strategy:

p_g(X|B) = \sum_x P_X(x) \operatorname{tr}(E_x \rho_B^x),

where E_x is the POVM that maximizes this expression. It can be shown that this optimum can be expressed in terms of the min-entropy as

p_g(X|B) = 2^{-H_{\min}(X|B)}~.

If the state \rho_{XB} is a product state, i.e. \rho_{XB} = \sigma_X \otimes \tau_B for some density operators \sigma_X and \tau_B, then there is no correlation between the systems X and B. In this case, it turns out that

2^{-H_{\min}(X|B)} = \max_x P_X(x)~.
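For a binary X, the optimal guessing probability has a closed form from standard state-discrimination theory (the Helstrom bound, not derived in this article): p_g(X|B) = (1 + \|P_X(0)\rho_B^0 - P_X(1)\rho_B^1\|_1)/2, from which H_{\min}(X|B) = -\log_2 p_g(X|B). A NumPy sketch (the example states are made up):

```python
import numpy as np

def guessing_probability(p0, rho0, p1, rho1):
    """Helstrom bound: optimal probability of guessing a binary X encoded
    in states rho0, rho1 with priors p0, p1."""
    delta = p0 * rho0 - p1 * rho1
    trace_norm = np.abs(np.linalg.eigvalsh(delta)).sum()
    return float((1 + trace_norm) / 2)

rho0 = np.diag([1.0, 0.0])                 # |0><0|
ket_plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_plus = np.outer(ket_plus, ket_plus)    # |+><+|

# Orthogonal states are perfectly distinguishable: p_g = 1, H_min = 0.
assert np.isclose(guessing_probability(0.5, rho0, 0.5, np.diag([0.0, 1.0])), 1.0)

# Identical states carry no information about X, so p_g = max_x P_X(x),
# matching the product-state case above.
assert np.isclose(guessing_probability(0.7, rho0, 0.3, rho0), 0.7)

# Nonorthogonal states: p_g lies strictly between the two extremes.
p_g = guessing_probability(0.5, rho0, 0.5, rho_plus)
h_min = -np.log2(p_g)
assert 0.5 < p_g < 1.0 and 0.0 < h_min < 1.0
```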

Min-entropy as overlap with the maximally entangled state

The maximally entangled state |\phi^+\rangle on a bipartite system \mathcal{H}_A \otimes \mathcal{H}_B is defined as

|\phi^+\rangle_{AB} = \frac{1}{\sqrt{d}} \sum_x |x_A\rangle |x_B\rangle,

where \{|x_A\rangle\} and \{|x_B\rangle\} form an orthonormal basis for the spaces A and B respectively. For a bipartite quantum state \rho_{AB}, we define the maximum overlap with the maximally entangled state as

q_c(A|B) = d_A \max_{\mathcal{E}} F\left((I_A \otimes \mathcal{E}) \rho_{AB}, |\phi^+\rangle\langle\phi^+|\right)^2,

where the maximum is over all CPTP operations \mathcal{E} and d_A is the dimension of subsystem A. This is a measure of how correlated the state \rho_{AB} is. It can be shown that

q_c(A|B) = 2^{-H_{\min}(A|B)}~.

If the information contained in A is classical, this reduces to the expression above for the guessing probability.

Proof of operational characterization of min-entropy

The proof is from a paper by König, Schaffner, Renner in 2008.[5] It involves the machinery of semidefinite programs.[6] Suppose we are given some bipartite density operator \rho_{AB}. From the definition of the min-entropy, we have

H_{\min}(A|B) = -\inf_{\sigma_B} \inf_\lambda \{\lambda \mid \rho_{AB} \leq 2^\lambda (I_A \otimes \sigma_B)\}~.

This can be re-written as

-\log \inf_{\sigma_B} \operatorname{Tr}(\sigma_B)

subject to the conditions

\sigma_B \geq 0,
I_A \otimes \sigma_B \geq \rho_{AB}~.

We notice that the infimum is taken over compact sets and hence can be replaced by a minimum. This can then be expressed succinctly as a semidefinite program. Consider the primal problem

min: \operatorname{Tr}(\sigma_B)
subject to: I_A \otimes \sigma_B \geq \rho_{AB},
\sigma_B \geq 0~.

This primal problem can also be fully specified by the matrices (\rho_{AB}, I_B, \operatorname{Tr}^*), where \operatorname{Tr}^* is the adjoint of the partial trace over A. The action of \operatorname{Tr}^* on operators on B can be written as

\operatorname{Tr}^*(X) = I_A \otimes X~.
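That \operatorname{Tr}^*(X) = I_A \otimes X really is the adjoint of the partial trace can be checked numerically: the Hilbert–Schmidt inner products \langle \operatorname{Tr}_A(Y), X \rangle and \langle Y, I_A \otimes X \rangle must agree for all Y and X. A small NumPy check (random real matrices; dimensions chosen arbitrarily):

```python
import numpy as np

def partial_trace_A(Y, dA, dB):
    """Partial trace over subsystem A of an operator Y on A tensor B
    (kron index ordering: row index = a*dB + b)."""
    return Y.reshape(dA, dB, dA, dB).trace(axis1=0, axis2=2)

def hs_inner(M, N):
    """Hilbert-Schmidt inner product <M, N> = Tr(M^dagger N)."""
    return np.trace(M.conj().T @ N)

rng = np.random.default_rng(0)
dA, dB = 2, 3
Y = rng.standard_normal((dA * dB, dA * dB))  # operator on A tensor B
X = rng.standard_normal((dB, dB))            # operator on B

# <Tr_A(Y), X> == <Y, I_A tensor X>, i.e. Tr* is the adjoint of Tr_A.
lhs = hs_inner(partial_trace_A(Y, dA, dB), X)
rhs = hs_inner(Y, np.kron(np.eye(dA), X))
assert np.isclose(lhs, rhs)
```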

We can express the dual problem as a maximization over operators E_{AB} on the space AB as

max: \operatorname{Tr}(\rho_{AB} E_{AB})
subject to: \operatorname{Tr}_A(E_{AB}) = I_B,
E_{AB} \geq 0~.

Using the Choi–Jamiołkowski isomorphism, we can define the channel \mathcal{E} such that

d_A (I_A \otimes \mathcal{E}^\dagger)(|\phi^+\rangle\langle\phi^+|) = E_{AB},

where the Bell state is defined over the space AA'. This means that we can express the objective function of the dual problem as

\langle \rho_{AB}, E_{AB} \rangle = d_A \langle \rho_{AB}, (I_A \otimes \mathcal{E}^\dagger)(|\phi^+\rangle\langle\phi^+|) \rangle = d_A \langle (I_A \otimes \mathcal{E})(\rho_{AB}), |\phi^+\rangle\langle\phi^+| \rangle,

as desired.

Notice that in the event that the system A is a partly classical state as above, then the quantity that we are after reduces to

\max_{\mathcal{E}} \sum_x P_X(x) \langle x | \mathcal{E}(\rho_B^x) | x \rangle~.

We can interpret \mathcal{E} as a guessing strategy, and this then reduces to the interpretation given above where an adversary wants to find the string x given access to quantum information via system B.


Notes and References

  1. Vazirani, Umesh; Vidick, Thomas (2014). "Fully Device-Independent Quantum Key Distribution". Physical Review Letters 113 (14): 140501. arXiv:1210.1810. doi:10.1103/PhysRevLett.113.140501.
  2. König, Robert; Renner, Renato; Schaffner, Christian (2009). "The Operational Meaning of Min- and Max-Entropy". IEEE Transactions on Information Theory 55 (9): 4337–4347. arXiv:0807.1338. doi:10.1109/TIT.2009.2025545.
  3. Tomamichel, Marco; Colbeck, Roger; Renner, Renato (2009). "A Fully Quantum Asymptotic Equipartition Property". IEEE Transactions on Information Theory 55 (12): 5840–5847. arXiv:0811.1221. doi:10.1109/TIT.2009.2032797.
  4. Renner, Renato. "Security of Quantum Key Distribution". Ph.D. thesis, Diss. ETH No. 16242.
  5. König, Robert; Renner, Renato; Schaffner, Christian (2009). "The Operational Meaning of Min- and Max-Entropy". IEEE Transactions on Information Theory 55 (9): 4337–4347. arXiv:0807.1338. doi:10.1109/TIT.2009.2025545.
  6. Watrous, John. Theory of Quantum Information, Fall 2011 course notes. https://cs.uwaterloo.ca/~watrous/CS766/LectureNotes/07.pdf