In calculus, the derivative of any linear combination of functions equals the same linear combination of the derivatives of the functions;[1] this property is known as linearity of differentiation, the rule of linearity,[2] or the superposition rule for differentiation.[3] It is a fundamental property of the derivative that encapsulates in a single rule two simpler rules of differentiation, the sum rule (the derivative of the sum of two functions is the sum of the derivatives) and the constant factor rule (the derivative of a constant multiple of a function is the same constant multiple of the derivative).[4][5] Thus it can be said that differentiation is linear, or the differential operator is a linear operator.[6]
Let f and g be functions, with α and β constants. Now consider

\frac{d}{dx}\left(\alpha \cdot f(x) + \beta \cdot g(x)\right).
By the sum rule in differentiation, this is

\frac{d}{dx}(\alpha \cdot f(x)) + \frac{d}{dx}(\beta \cdot g(x)),

and by the constant factor rule in differentiation, this reduces to

\alpha \cdot f'(x) + \beta \cdot g'(x).

Therefore,

\frac{d}{dx}(\alpha \cdot f(x) + \beta \cdot g(x)) = \alpha \cdot f'(x) + \beta \cdot g'(x).
Omitting the brackets, this is often written as:
(\alpha \cdot f + \beta \cdot g)' = \alpha \cdot f' + \beta \cdot g'.
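The identity can be checked symbolically for concrete functions; the following is a minimal sketch using SymPy, where the choices f = sin x and g = eˣ are arbitrary illustrations:

```python
import sympy as sp

x, alpha, beta = sp.symbols('x alpha beta')

# Arbitrary illustrative functions; any differentiable choices work.
f = sp.sin(x)
g = sp.exp(x)

# Derivative of the linear combination...
lhs = sp.diff(alpha * f + beta * g, x)
# ...equals the same linear combination of the derivatives.
rhs = alpha * sp.diff(f, x) + beta * sp.diff(g, x)

print(sp.simplify(lhs - rhs))  # 0, confirming linearity for this example
```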
We can prove the entire linearity principle at once, or we can prove the individual steps (the constant factor rule and the sum rule) separately. Here, both approaches will be shown.
Proving linearity directly also proves the constant factor rule, the sum rule, and the difference rule as special cases. The sum rule is obtained by setting both constant coefficients to 1. The difference rule is obtained by setting the first constant coefficient to 1 and the second constant coefficient to -1. The constant factor rule is obtained by setting either the second constant coefficient or the second function to 0.
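In symbols, these three specializations of the linearity rule read:

```latex
\begin{aligned}
\alpha = \beta = 1 &: & (f + g)' &= f' + g' && \text{(sum rule)} \\
\alpha = 1,\ \beta = -1 &: & (f - g)' &= f' - g' && \text{(difference rule)} \\
\beta = 0 \text{ (or } g = 0\text{)} &: & (\alpha \cdot f)' &= \alpha \cdot f' && \text{(constant factor rule)}
\end{aligned}
```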
Conversely, if we first prove the constant factor rule and the sum rule, we can prove linearity and the difference rule. Proving linearity is done by writing the first and second functions as two other functions multiplied by constant coefficients. Then, as shown in the derivation from the previous section, we can first use the sum rule while differentiating, and then use the constant factor rule, which reaches our conclusion for linearity. In order to prove the difference rule, the second function can be redefined as another function multiplied by the constant coefficient -1.
In the proofs/derivations below,[7][8] the coefficients a, b are used; they correspond to the coefficients α, β above.
To prove linearity directly, let a, b ∈ ℝ be constants and let f and g be functions. Let j be the function defined wherever both f and g are defined, and let x be a point in the domain of j, with

j(x) = a f(x) + b g(x).
We want to prove that

j'(x) = a f'(x) + b g'(x).
By definition, we can see that

j'(x) = \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} = \lim_{h \to 0} \frac{(a f(x+h) + b g(x+h)) - (a f(x) + b g(x))}{h} = \lim_{h \to 0} \left( a \cdot \frac{f(x+h) - f(x)}{h} + b \cdot \frac{g(x+h) - g(x)}{h} \right).

In order to use the limit law for the sum of limits, we need to know that \lim_{h \to 0} a \cdot \frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} b \cdot \frac{g(x+h) - g(x)}{h} both individually exist. For these smaller limits, we need to know that \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} \frac{g(x+h) - g(x)}{h} both individually exist in order to use the constant coefficient law for limits. By definition, f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and g'(x) = \lim_{h \to 0} \frac{g(x+h) - g(x)}{h}. So, if we know that the derivatives f'(x) and g'(x) both exist, then these two smaller limits both exist as well.
With this, we can go back and apply the limit law for the sum of limits, since we now know that the two limits individually exist and equal a f'(x) and b g'(x) respectively. From here, we can go directly back to the derivative we were working on. Finally, we have shown what we claimed in the beginning:

j'(x) = a f'(x) + b g'(x).
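The limit argument above can also be illustrated numerically: for a small step h, the difference quotient of j = a·f + b·g is already close to a f'(x) + b g'(x). A sketch with the arbitrarily chosen f = sin, g = cos, a = 2, b = -3, and x = 1:

```python
import math

# Arbitrary illustrative choices (any differentiable f, g work).
a, b = 2.0, -3.0
f, g = math.sin, math.cos
fprime, gprime = math.cos, lambda t: -math.sin(t)

def j(t):
    return a * f(t) + b * g(t)

x, h = 1.0, 1e-6
# Difference quotient of j, as in the definition of the derivative.
dq = (j(x + h) - j(x)) / h
# The same linear combination of the derivatives.
exact = a * fprime(x) + b * gprime(x)

print(abs(dq - exact) < 1e-4)  # True: the quotient approaches a*f'(x) + b*g'(x)
```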
Next, for the sum rule, let f and g be functions. Let j be the function defined wherever both f and g are defined, and let x be a point in the domain of j, with

j(x) = f(x) + g(x).
We want to prove that

j'(x) = f'(x) + g'(x).
By definition, we can see that

j'(x) = \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} = \lim_{h \to 0} \frac{(f(x+h) + g(x+h)) - (f(x) + g(x))}{h} = \lim_{h \to 0} \left( \frac{f(x+h) - f(x)}{h} + \frac{g(x+h) - g(x)}{h} \right).

In order to use the law for the sum of limits here, we need to show that the individual limits \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} \frac{g(x+h) - g(x)}{h} both exist. By definition, f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and g'(x) = \lim_{h \to 0} \frac{g(x+h) - g(x)}{h}, so the limits exist whenever the derivatives f'(x) and g'(x) exist.
Thus, we have shown what we wanted to show, that:
j'(x) = f'(x) + g'(x).
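The proof's limit manipulation can be replayed symbolically; the following SymPy sketch uses the illustrative choices f = x² and g = sin x:

```python
import sympy as sp

x, h = sp.symbols('x h')

# Illustrative choices; the sum rule holds for any differentiable f, g.
f = x**2
g = sp.sin(x)

# Difference quotient of j = f + g, as in the definition of the derivative.
quotient = ((f + g).subs(x, x + h) - (f + g)) / h
jprime = sp.limit(quotient, h, 0)

# The sum of the two smaller limits, i.e. f'(x) + g'(x).
sum_of_derivatives = sp.diff(f, x) + sp.diff(g, x)

print(sp.simplify(jprime - sum_of_derivatives))  # 0
```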
For the difference rule, let f and g be functions. Let j be the function defined wherever both f and g are defined, and let x be a point in the domain of j, with

j(x) = f(x) - g(x).
We want to prove that
j'(x) = f'(x) - g'(x).
By definition, we can see that:

j'(x) = \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} = \lim_{h \to 0} \frac{(f(x+h) - g(x+h)) - (f(x) - g(x))}{h} = \lim_{h \to 0} \left( \frac{f(x+h) - f(x)}{h} - \frac{g(x+h) - g(x)}{h} \right).

In order to use the law for the difference of limits here, we need to show that the individual limits \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and \lim_{h \to 0} \frac{g(x+h) - g(x)}{h} both exist. By definition, f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} and g'(x) = \lim_{h \to 0} \frac{g(x+h) - g(x)}{h}, so these limits exist whenever the derivatives f'(x) and g'(x) exist.
Thus, we have shown what we wanted to show, that:
j'(x) = f'(x) - g'(x).
Finally, for the constant factor rule, let f be a function and let a ∈ ℝ be a constant. Let j be the function defined wherever f is defined, and let x be a point in the domain of j, with

j(x) = a f(x).
We want to prove that
j'(x) = a f'(x).
By definition, we can see that:

j'(x) = \lim_{h \to 0} \frac{j(x+h) - j(x)}{h} = \lim_{h \to 0} \frac{a f(x+h) - a f(x)}{h} = \lim_{h \to 0} a \cdot \frac{f(x+h) - f(x)}{h}.

Now, in order to use the limit law for constant coefficients to show that

j'(x) = a \cdot \lim_{h \to 0} \frac{f(x+h) - f(x)}{h},

we need to show that \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} exists. However, f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}, by the definition of the derivative. So, if f'(x) exists, then \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} exists as well. Thus, if we assume that f'(x) exists, we can use the constant coefficient law and conclude that j'(x) = a \cdot f'(x).
Thus, we have proven that when j(x) = a f(x), we have j'(x) = a f'(x).
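As a concrete check of this last rule, the constant can be pulled out of the limit of the difference quotient; a SymPy sketch with the illustrative choices f = cos x and a = 5:

```python
import sympy as sp

x, h = sp.symbols('x h')
a = 5  # arbitrary illustrative constant

f = sp.cos(x)  # arbitrary differentiable function
j = a * f

# Limit of the difference quotient of j = a*f ...
jprime = sp.limit((j.subs(x, x + h) - j) / h, h, 0)

# ... equals a times the derivative of f.
print(sp.simplify(jprime - a * sp.diff(f, x)))  # 0
```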