Decision rule explained

In decision theory, a decision rule is a function which maps an observation to an appropriate action. Decision rules play an important role in the theory of statistics and economics, and are closely related to the concept of a strategy in game theory.

In order to evaluate the usefulness of a decision rule, it is necessary to have a loss function detailing the outcome of each action under different states.

Formal definition

Given an observable random variable X over the probability space (𝒳, Σ, P_θ), determined by a parameter θ ∈ Θ, and a set A of possible actions, a (deterministic) decision rule is a function δ : 𝒳 → A.
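
To make the definition concrete, here is a minimal sketch (not taken from the source) of a deterministic decision rule written as an ordinary function: it maps each observation in the sample space to an action in A. The observation being a sample mean, the action set {"accept", "reject"}, and the threshold 0.5 are all illustrative assumptions.

```python
def delta(x: float) -> str:
    """Decision rule: map an observed sample mean x to an action in A = {"accept", "reject"}.

    The 0.5 threshold is an arbitrary illustrative choice.
    """
    return "accept" if x >= 0.5 else "reject"

print(delta(0.7))  # -> "accept"
print(delta(0.2))  # -> "reject"
```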

Examples of decision rules

When estimating a parameter θ, the domain of θ may extend over ℝ (all real numbers). An associated decision rule for estimating θ from some observed data might be: "choose the value of θ, say θ̂, that minimizes the sum of squared errors between the observed responses and the responses predicted from the corresponding covariates, given that you chose θ̂." Thus the cost function is the sum of squared errors, and one would aim to minimize this cost. Once the cost function is defined, θ̂ could be chosen, for instance, using some optimization algorithm.
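
As a rough illustration of that last step, the sketch below chooses θ̂ by minimizing the sum of squared errors with a general-purpose optimizer. The linear model y ≈ θ·x, the synthetic data, and the use of scipy.optimize.minimize are assumptions made for the example, not part of the source.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data from a simple linear model with true parameter 2.5 (illustrative).
rng = np.random.default_rng(0)
x = rng.normal(size=100)                        # covariates
y = 2.5 * x + rng.normal(scale=0.1, size=100)   # observed responses

def sum_squared_error(theta):
    """Cost function: sum of squared errors between observed and predicted responses."""
    predicted = theta[0] * x
    return np.sum((y - predicted) ** 2)

# The decision rule: choose theta_hat that minimizes the cost.
result = minimize(sum_squared_error, x0=[0.0])
theta_hat = result.x[0]
print(theta_hat)  # close to the true value 2.5
```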
