Degree of anonymity

In anonymity networks (e.g., Tor, Crowds, Mixmaster, I2P), it is important to be able to quantitatively measure the anonymity guarantee that the system provides. The degree of anonymity d is a metric that was proposed at the 2002 Privacy Enhancing Technologies (PET) workshop. Two papers put forth the idea of using entropy as the basis for formally measuring anonymity: "Towards an Information Theoretic Metric for Anonymity" and "Towards Measuring Anonymity". The ideas presented are very similar, with minor differences in the final definition of d.


Background

Many anonymity networks have been developed, along with methods of proving the anonymity guarantees they can offer. For the early, simple Chaum mixes and pool mixes, the size of the set of users was taken as the measure of security the system could provide to a user. This has a number of problems: intuitively, if the network is international, then a message written only in Urdu is unlikely to have come from the United States, and vice versa. Information like this, together with methods such as the predecessor attack and the intersection attack, helps an attacker increase the probability that a particular user sent the message.

Example With Pool Mixes

As an example, consider the network shown above. Here A, B, C and D are users (senders), Q, R, S and T are servers (receivers), the boxes are mixes, and {A,B} ∈ T, {A,B,C} ∈ S and {A,B,C,D} ∈ Q, R, where ∈ denotes the anonymity set. Now, since these are pool mixes, let the cap on the number of incoming messages to wait for before sending be 2; as such, if A, B or C is communicating with R and S receives a message, then S knows that it must have come from D (as the links between the mixes can only carry 1 message at a time). This is in no way reflected in S's anonymity set, but should be taken into account in the analysis of the network.

Degree of Anonymity

The degree of anonymity takes into account the probability associated with each user. It begins by defining the entropy of the system (this is where the papers differ slightly, but only in notation):

H(X) := \sum_{i=1}^{N} \left[ p_i \cdot \lg\left( \frac{1}{p_i} \right) \right]

where H(X) is the entropy of the network, N is the number of nodes in the network, and p_i is the probability associated with node i. The maximal entropy of a network occurs when there is a uniform probability associated with each node (1/N), and this yields

H_M := \lg(N).

The degree of anonymity (here the papers differ slightly in the definition: "Towards Measuring Anonymity" defines a bounded degree, compared against H_M, while "Towards an Information Theoretic Metric for Anonymity" gives an unbounded definition using the entropy directly; we will consider only the bounded case) is defined as

d := 1 - \frac{H_M - H(X)}{H_M} = \frac{H(X)}{H_M}.

Using this, anonymity systems can be compared and evaluated quantitatively.
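
To make the definition concrete, the following is a minimal Python sketch (not taken from the papers) that computes the bounded degree d = H(X)/H_M from an attacker's probability distribution over the possible senders; the function name and the example distributions are purely illustrative.

    import math

    def degree_of_anonymity(probabilities):
        # Bounded degree of anonymity d = H(X) / H_M for a probability
        # distribution over the N possible senders.
        n = len(probabilities)
        # Entropy of the attacker's distribution: H(X) = sum p_i * lg(1/p_i).
        h_x = sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)
        # Maximal entropy H_M = lg(N), reached by the uniform distribution.
        h_m = math.log2(n)
        return h_x / h_m

    # Uniform distribution over 4 senders: perfect anonymity, d = 1.
    print(degree_of_anonymity([0.25, 0.25, 0.25, 0.25]))  # 1.0
    # Skewed distribution: the attacker has partial information, d < 1.
    print(degree_of_anonymity([0.7, 0.1, 0.1, 0.1]))      # ~0.68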

Definition of Attacker

These papers also served to give concise definitions of an attacker:

  • Internal/External: an internal attacker controls nodes in the network, whereas an external attacker can only compromise the communication channels between nodes.
  • Passive/Active: an active attacker can add, remove, and modify any messages, whereas a passive attacker can only listen to the messages.
  • Local/Global: a local attacker has access to only part of the network, whereas a global attacker can access the entire network.

Example

In the papers there are a number of example calculations of d; we will walk through some of them here.

Crowds

In Crowds there is a global probability of forwarding (p_f), which is the probability that a node forwards the message internally instead of routing it to the final destination. Let there be C corrupt nodes and N total nodes. In Crowds the attacker is internal, passive, and local. Trivially H_M = \lg(N - C), and the overall entropy is

H(X) = \frac{N - p_f(N-C-1)}{N} \cdot \lg\left[ \frac{N}{N - p_f(N-C-1)} \right] + p_f \cdot \frac{N-C-1}{N} \cdot \lg\left[ \frac{N}{p_f} \right]

and d is this value divided by H_M.
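
As a hedged illustration (not from the papers), the Python sketch below evaluates the formula above for Crowds; the function name crowds_degree and the sample values N = 20, C = 5, p_f = 0.8 are arbitrary choices for demonstration.

    import math

    def crowds_degree(n, c, p_f):
        # Degree of anonymity for Crowds with n total nodes, c corrupt nodes
        # and forwarding probability p_f, following the formula above.
        # Probability that the node observed by the first corrupt node
        # is the true sender.
        p_sender = (n - p_f * (n - c - 1)) / n
        # Each of the remaining n - c - 1 honest nodes has probability p_f / n.
        h_x = (p_sender * math.log2(1.0 / p_sender)
               + (n - c - 1) * (p_f / n) * math.log2(n / p_f))
        h_m = math.log2(n - c)  # maximal entropy over the honest nodes
        return h_x / h_m

    print(crowds_degree(20, 5, 0.8))  # ~0.80 for these sample values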

Onion routing

In onion routing, let us assume the attacker can exclude a subset of the nodes from the network; then the entropy would simply be H(X) = \lg(S), where S is the size of the subset of non-excluded nodes. Under an attack model where a node can both globally listen to message passing and is a node on the path, this decreases to H(X) = \lg(L), where L is the length of the onion route (this could be larger or smaller than S), as there is no attempt in onion routing to remove the correlation between incoming and outgoing messages.
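
As a hedged numeric illustration (the values are not from the papers, and it assumes the maximal entropy is taken over all N nodes), the resulting degree \lg(L)/\lg(N) can be computed directly:

    import math

    # Hypothetical values: 100 nodes in total, onion routes of length 5.
    n, route_length = 100, 5
    d = math.log2(route_length) / math.log2(n)  # H(X) / H_M
    print(round(d, 2))                          # ~0.35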

Applications of this metric

In 2004, Diaz, Sassaman, and DeWitte presented an analysis of two anonymous remailers using the Serjantov and Danezis metric, showing one of them to provide zero anonymity under certain realistic conditions.

References

1. Claudia Diaz, Stefaan Seys, Joris Claessens, and Bart Preneel. "Towards Measuring Anonymity". Proceedings of Privacy Enhancing Technologies Workshop (PET 2002), Roger Dingledine and Paul Syverson (eds.), Springer-Verlag, LNCS 2482, April 2002. Archived at https://web.archive.org/web/20060710023539/http://www.esat.kuleuven.ac.be/~cdiaz/papers/tmAnon.ps.gz
2. Andrei Serjantov and George Danezis. "Towards an Information Theoretic Metric for Anonymity". Proceedings of Privacy Enhancing Technologies Workshop (PET 2002), Roger Dingledine and Paul Syverson (eds.), Springer-Verlag, LNCS 2482, April 2002. Archived at https://web.archive.org/web/20040719123728/http://www.cl.cam.ac.uk/~aas23/papers_aas/set.ps
3. Claudia Diaz, Len Sassaman, and Evelyn Dewitte. "Comparison Between Two Practical Mix Designs". Proceedings of European Symposium on Research in Computer Security (ESORICS 2004), Dieter Gollmann (ed.), Springer-Verlag, LNCS 3193, September 2004.