Conditional dependence

See also: Conditional independence.

In probability theory, conditional dependence is a relationship between two or more events that are dependent when a third event occurs.[1][2] For example, if A and B are two events that individually increase the probability of a third event C, and do not directly affect each other, then initially (when it has not been observed whether or not the event C occurs)[3][4]

\operatorname{P}(A \mid B) = \operatorname{P}(A) \quad \text{and} \quad \operatorname{P}(B \mid A) = \operatorname{P}(B)

(A and B are independent).

But suppose that now C is observed to occur. If event B occurs, then the probability of occurrence of the event A will decrease, because its positive relation to C is less necessary as an explanation for the occurrence of C (similarly, event A occurring will decrease the probability of occurrence of B). Hence, now the two events A and B are conditionally negatively dependent on each other, because the probability of occurrence of each is negatively dependent on whether the other occurs. We have[5]

\operatorname{P}(A \mid C \text{ and } B) < \operatorname{P}(A \mid C).

Conditional dependence of A and B given C is the logical negation of conditional independence ((A \perp\!\!\!\perp B) \mid C).[6] In conditional independence two events (which may be dependent or not) become independent given the occurrence of a third event.[7]
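One way to see why this "explaining away" happens is the following sketch (not taken from the cited sources; it uses only the product rule and the unconditional independence of A and B):

\operatorname{P}(A \mid C \text{ and } B) = \frac{\operatorname{P}(A \text{ and } B \text{ and } C)}{\operatorname{P}(B \text{ and } C)} = \frac{\operatorname{P}(C \mid A \text{ and } B)\,\operatorname{P}(A)\,\operatorname{P}(B)}{\operatorname{P}(C \mid B)\,\operatorname{P}(B)} = \frac{\operatorname{P}(C \mid A \text{ and } B)}{\operatorname{P}(C \mid B)}\,\operatorname{P}(A).

If B by itself already explains C nearly as well as A and B together do, the ratio multiplying \operatorname{P}(A) is close to 1, so \operatorname{P}(A \mid C \text{ and } B) stays close to \operatorname{P}(A), whereas \operatorname{P}(A \mid C) exceeds \operatorname{P}(A) because A raises the probability of C.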

Example

In essence, probability is influenced by a person's information about the possible occurrence of an event. For example, let the event A be 'I have a new phone'; event B be 'I have a new watch'; and event C be 'I am happy'; and suppose that having either a new phone or a new watch increases the probability of my being happy. Let us assume that the event C has occurred, meaning 'I am happy'. Now if another person sees my new watch, he/she will reason that my likelihood of being happy was increased by my new watch, so there is less need to attribute my happiness to a new phone.

To make the example more numerically specific, suppose that there are four possible states \Omega = \left\{ s_1, s_2, s_3, s_4 \right\}, given in the middle four columns of the following table, in which the occurrence of event A is signified by a 1 in row A and its non-occurrence is signified by a 0, and likewise for B and C. That is, A = \left\{ s_2, s_4 \right\}, B = \left\{ s_3, s_4 \right\}, and C = \left\{ s_2, s_3, s_4 \right\}. The probability of s_i is 1/4 for every i.

Event | \operatorname{P}(s_1) = \tfrac{1}{4} | \operatorname{P}(s_2) = \tfrac{1}{4} | \operatorname{P}(s_3) = \tfrac{1}{4} | \operatorname{P}(s_4) = \tfrac{1}{4} | Probability of event
A | 0 | 1 | 0 | 1 | \tfrac{1}{2}
B | 0 | 0 | 1 | 1 | \tfrac{1}{2}
C | 0 | 1 | 1 | 1 | \tfrac{3}{4}

and so

Event | s_1 | s_2 | s_3 | s_4 | Probability of event
A \cap B | 0 | 0 | 0 | 1 | \tfrac{1}{4}
A \cap C | 0 | 1 | 0 | 1 | \tfrac{1}{2}
B \cap C | 0 | 0 | 1 | 1 | \tfrac{1}{2}
A \cap B \cap C | 0 | 0 | 0 | 1 | \tfrac{1}{4}

In this example, C occurs if and only if at least one of A, B occurs. Unconditionally (that is, without reference to C), A and B are independent of each other because \operatorname{P}(A) (the sum of the probabilities associated with a 1 in row A) is \tfrac{1}{2}, while

\operatorname{P}(A \mid B) = \operatorname{P}(A \text{ and } B) / \operatorname{P}(B) = \tfrac{1/4}{1/2} = \tfrac{1}{2} = \operatorname{P}(A).

But conditional on C having occurred (the last three columns in the table), we have

\operatorname{P}(A \mid C) = \operatorname{P}(A \text{ and } C) / \operatorname{P}(C) = \tfrac{1/2}{3/4} = \tfrac{2}{3}

while

\operatorname{P}(A \mid C \text{ and } B) = \operatorname{P}(A \text{ and } C \text{ and } B) / \operatorname{P}(C \text{ and } B) = \tfrac{1/4}{1/2} = \tfrac{1}{2} < \operatorname{P}(A \mid C).

Since in the presence of C the probability of A is affected by the presence or absence of B, A and B are mutually dependent conditional on C.
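These ratios can also be checked mechanically. The following Python sketch (illustrative code, not part of the original example) enumerates the four equiprobable states from the table and computes each probability as a ratio of exact fractions:

from fractions import Fraction

# The four equiprobable states s1..s4, encoded as (A, B, C) indicator
# triples read off the columns of the tables above.
states = [(0, 0, 0),  # s1
          (1, 0, 1),  # s2
          (0, 1, 1),  # s3
          (1, 1, 1)]  # s4

def prob(event):
    # P(event) under the uniform distribution: the number of states
    # satisfying the predicate, divided by the total number of states.
    return Fraction(sum(1 for s in states if event(s)), len(states))

def cond(event, given):
    # Conditional probability P(event | given) = P(event and given) / P(given).
    return prob(lambda s: event(s) and given(s)) / prob(given)

A = lambda s: s[0] == 1
B = lambda s: s[1] == 1
C = lambda s: s[2] == 1

print(cond(A, B))                        # 1/2 = P(A): A and B are independent
print(cond(A, C))                        # 2/3
print(cond(A, lambda s: B(s) and C(s)))  # 1/2 < 2/3: explaining away

Using Fraction keeps the arithmetic exact, so the printed values match the fractions derived above.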

Notes and References

  1. Sebastian Thrun and Peter Norvig, Introduction to Artificial Intelligence, 2011, "Unit 3: Conditional Dependence".
  2. Dirk Husmeier, "Introduction to Learning Bayesian Networks from Data", http://www.bioss.sari.ac.uk/staff/dirk/papers/sbb_bnets.pdf
  3. A. P. Dawid, "Conditional Independence in Statistical Theory".
  4. "Probability: Applications of conditional probability: independence (equation 7)", Encyclopædia Britannica.
  5. Sebastian Thrun and Peter Norvig, Introduction to Artificial Intelligence, 2011, "Unit 3: Explaining Away".
  6. Remco R. Bouckaert, "Conditional dependence in probabilistic networks", in P. Cheeseman and R. W. Oldford (eds.), Selecting Models from Data: Artificial Intelligence and Statistics IV, Lecture Notes in Statistics 89, 1994, pp. 101–111, especially 104. ISBN 978-0-387-94281-0.
  7. A. P. Dawid, "Conditional Independence in Statistical Theory".