Boole's expansion theorem explained

Boole's expansion theorem, often referred to as the Shannon expansion or decomposition, is the identity:

F = x·Fx + x'·Fx'

where F is any Boolean function, x is a variable, x' is the complement of x, and Fx and Fx' are F with the argument x set equal to 1 and to 0 respectively.

The terms Fx and Fx' are sometimes called the positive and negative Shannon cofactors, respectively, of F with respect to x. These are functions, computed by the restrict operator as restrict(F, x, 1) and restrict(F, x, 0) respectively (see valuation (logic) and partial application).

It has been called the "fundamental theorem of Boolean algebra". Besides its theoretical importance, it paved the way for binary decision diagrams (BDDs), satisfiability solvers, and many other techniques relevant to computer engineering and formal verification of digital circuits. In such engineering contexts (especially in BDDs), the expansion is interpreted as an if-then-else, with the variable x being the condition and the cofactors being the branches (Fx when x is true and Fx' when x is false).[1]

Statement of the theorem

A more explicit way of stating the theorem is:

f(X1, X2, ..., Xn) = X1·f(1, X2, ..., Xn) + X1'·f(0, X2, ..., Xn)
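To make the statement concrete, here is a minimal Python sketch (not from the source; the majority-of-three function and the variable names are illustrative assumptions) that checks the expansion exhaustively on all input assignments:

    # Hypothetical example: check f(x1,...) = x1·f(1,...) + x1'·f(0,...)
    # for every assignment, using an arbitrary three-input function.
    from itertools import product

    def f(x1, x2, x3):
        # illustrative Boolean function: majority of the three inputs
        return int(x1 + x2 + x3 >= 2)

    for x1, x2, x3 in product((0, 1), repeat=3):
        expanded = (x1 & f(1, x2, x3)) | ((1 - x1) & f(0, x2, x3))
        assert expanded == f(x1, x2, x3)
    print("Shannon expansion holds for every input assignment")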

Variations and implications

XOR form: The statement also holds when the disjunction "+" is replaced by the XOR operator:

f(X1, X2, ..., Xn) = X1·f(1, X2, ..., Xn) ⊕ X1'·f(0, X2, ..., Xn)

Dual form: There is a dual form of the Shannon expansion (which does not have a related XOR form):

f(X1, X2, ..., Xn) = (X1 + f(0, X2, ..., Xn))·(X1' + f(1, X2, ..., Xn))
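As a quick check of both variations, a small sketch (same Boolean-functions-as-0/1-callables assumption; the example function is arbitrary) verifies the XOR form and the dual form against the original function:

    from itertools import product

    def f(x1, x2, x3):
        return int((x1 and not x2) or x3)        # arbitrary example function

    for x1, x2, x3 in product((0, 1), repeat=3):
        hi, lo = f(1, x2, x3), f(0, x2, x3)      # positive / negative cofactors
        xor_form  = (x1 & hi) ^ ((1 - x1) & lo)  # X1·f(1,...) XOR X1'·f(0,...)
        dual_form = (x1 | lo) & ((1 - x1) | hi)  # (X1 + f(0,...))·(X1' + f(1,...))
        assert xor_form == dual_form == f(x1, x2, x3)
    print("XOR form and dual form both reproduce f")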

Repeated application for each argument leads to the Sum of Products (SoP) canonical form of the Boolean function f. For example, for n = 2 that would be

f(X1, X2) = X1·f(1, X2) + X1'·f(0, X2)
          = X1·X2·f(1, 1) + X1·X2'·f(1, 0) + X1'·X2·f(0, 1) + X1'·X2'·f(0, 0)
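The repeated application can be phrased as a short recursion. The sketch below (an illustration under the usual 0/1-callable assumption, with made-up variable names X1 and X2) prints the product terms of the SoP form, one minterm for each assignment on which f evaluates to 1:

    def expand(f, names, fixed=()):
        # Recursively split f on its first remaining variable (Shannon expansion)
        # and keep the branches whose fully restricted function evaluates to 1.
        if not names:                                    # all variables assigned
            return [list(fixed)] if f() else []
        x, rest = names[0], names[1:]
        pos = expand(lambda *a: f(1, *a), rest, fixed + (x,))         # cofactor f_x
        neg = expand(lambda *a: f(0, *a), rest, fixed + (x + "'",))   # cofactor f_x'
        return pos + neg

    def f(x1, x2):
        return int(x1 or x2)                             # example: two-input OR

    for term in expand(f, ("X1", "X2")):
        print("·".join(term))                            # X1·X2, X1·X2', X1'·X2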

Likewise, application of the dual form leads to the Product of Sums (PoS) canonical form (using the distributivity law of + over ·):

f(X1, X2) = (X1 + f(0, X2))·(X1' + f(1, X2))
          = (X1 + X2 + f(0, 0))·(X1 + X2' + f(0, 1))·(X1' + X2 + f(1, 0))·(X1' + X2' + f(1, 1))
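A companion sketch (same assumptions; the two-input AND is an arbitrary example) builds the PoS form directly: each assignment on which f evaluates to 0 contributes one sum term, exactly as in the expansion above once the constant cofactor values are substituted:

    from itertools import product

    def f(x1, x2):
        return int(x1 and x2)                    # example: two-input AND

    maxterms = []
    for x1, x2 in product((0, 1), repeat=2):
        if not f(x1, x2):                        # each zero of f yields one sum term
            lits = [("X1'" if x1 else "X1"), ("X2'" if x2 else "X2")]
            maxterms.append("(" + " + ".join(lits) + ")")
    print("".join(maxterms))                     # (X1 + X2)(X1 + X2')(X1' + X2)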

Properties of cofactors

Linear properties of cofactors:
  • For a Boolean function F which is made up of two Boolean functions G and H, the following are true (each property is checked in the sketch after this list):
  • If F = H' then Fx = H'x
  • If F = G·H then Fx = Gx·Hx
  • If F = G + H then Fx = Gx + Hx
  • If F = G ⊕ H then Fx = Gx ⊕ Hx

Characteristics of unate functions:
  • If F is a unate function and...
  • If F is positive unate then F = x·Fx + Fx'
  • If F is negative unate then F = Fx + x'·Fx'
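A minimal sketch of the linear properties (the functions G and H below are arbitrary assumptions, and cofactors are taken with respect to the first argument) verifies each identity exhaustively:

    from itertools import product

    G = lambda x, y, z: x & (y | z)              # arbitrary example functions
    H = lambda x, y, z: (x ^ y) | z

    def cofactor(F, val):
        # cofactor with respect to x: positive for val = 1, negative for val = 0
        return lambda y, z: F(val, y, z)

    for val in (0, 1):
        for y, z in product((0, 1), repeat=2):
            gc, hc = cofactor(G, val)(y, z), cofactor(H, val)(y, z)
            assert cofactor(lambda *a: 1 - H(*a), val)(y, z) == 1 - hc       # F = H'
            assert cofactor(lambda *a: G(*a) & H(*a), val)(y, z) == gc & hc  # F = G·H
            assert cofactor(lambda *a: G(*a) | H(*a), val)(y, z) == gc | hc  # F = G+H
            assert cofactor(lambda *a: G(*a) ^ H(*a), val)(y, z) == gc ^ hc  # F = G⊕H
    print("all four linear cofactor properties hold for this G and H")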

Operations with cofactors

Boolean difference:
  • The Boolean difference or Boolean derivative of the function F with respect to the literal x is defined as:

    ∂F/∂x = Fx ⊕ Fx'

Universal quantification:
  • The universal quantification of F is defined as:

    ∀x F = Fx·Fx'

Existential quantification:
  • The existential quantification of F is defined as:

    ∃x F = Fx + Fx'
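All three operations reduce to simple bitwise combinations of the two cofactors. A short sketch (the function F is an arbitrary assumption) tabulates them for every assignment of the remaining variables:

    from itertools import product

    F = lambda x, y, z: (x & y) | z              # arbitrary example function

    for y, z in product((0, 1), repeat=2):
        fx, fx_ = F(1, y, z), F(0, y, z)         # positive / negative cofactors
        diff     = fx ^ fx_                      # Boolean difference ∂F/∂x
        forall_x = fx & fx_                      # universal quantification ∀x F
        exists_x = fx | fx_                      # existential quantification ∃x F
        print(f"y={y} z={z}: dF/dx={diff} forall={forall_x} exists={exists_x}")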

History

George Boole presented this expansion as his Proposition II, "To expand or develop a function involving any number of logical symbols", in his Laws of Thought (1854), and it was "widely applied by Boole and other nineteenth-century logicians".

Claude Shannon mentioned this expansion, among other Boolean identities, in a 1949 paper, and showed the switching network interpretations of the identity. In the literature of computer design and switching theory, the identity is often incorrectly attributed to Shannon.

Application to switching circuits

1. Binary decision diagrams follow from systematic use of this theorem.
2. Any Boolean function can be implemented directly in a switching circuit using a hierarchy of basic multiplexers by repeated application of this theorem, as sketched below.
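As an illustration of point 2, here is a minimal sketch (the three-input function and the two-level mux tree are assumptions for the example, not a circuit from the source) that implements f with 2:1 multiplexers obtained by expanding on X1 and then on X2:

    from itertools import product

    def mux(sel, hi, lo):
        # 2:1 multiplexer: output hi when sel is 1, lo when sel is 0
        return (sel & hi) | ((1 - sel) & lo)

    def f(x1, x2, x3):
        return int((x1 and x2) or x3)            # target function (arbitrary example)

    def f_mux(x1, x2, x3):
        # expand on x1, then expand each cofactor on x2: a tree of three muxes
        hi = mux(x2, f(1, 1, x3), f(1, 0, x3))   # cofactor f(1, x2, x3)
        lo = mux(x2, f(0, 1, x3), f(0, 0, x3))   # cofactor f(0, x2, x3)
        return mux(x1, hi, lo)

    assert all(f(*v) == f_mux(*v) for v in product((0, 1), repeat=3))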


Notes and References

1. G. D. Hachtel and F. Somenzi (1996), Logic Synthesis and Verification Algorithms, p. 234.