In mathematics, constraint counting is the process of counting the number of constraints in order to compare it with the number of variables, parameters, etc. that are free to be determined; the idea is that in most cases the number of independent choices that can be made is the excess of the latter over the former.
For example, in linear algebra, if the number of constraints (independent equations) in a system of linear equations equals the number of unknowns, then precisely one solution exists; if there are fewer independent equations than unknowns, infinitely many solutions exist; and if the number of independent equations exceeds the number of unknowns, then no solution exists.
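This three-way classification can be sketched concretely by comparing matrix ranks with the number of unknowns. The following is a minimal illustration assuming NumPy is available; the helper name classify is introduced here for the example and is not standard terminology.

```python
import numpy as np

def classify(A, b):
    """Classify the linear system A x = b by constraint counting:
    compare the number of independent constraints (rank of A, and of
    the augmented matrix [A|b]) with the number of unknowns."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    r = np.linalg.matrix_rank(A)                      # independent constraints
    r_aug = np.linalg.matrix_rank(np.hstack([A, b]))  # constraints incl. right-hand side
    n = A.shape[1]                                    # number of unknowns
    if r_aug > r:
        return "no solution"            # the constraints are inconsistent
    if r == n:
        return "unique solution"        # as many independent constraints as unknowns
    return "infinitely many solutions"  # fewer independent constraints than unknowns

print(classify([[1, 1], [1, -1]], [2, 0]))  # unique solution
print(classify([[1, 1], [2, 2]], [2, 4]))   # infinitely many solutions
print(classify([[1, 1], [1, 1]], [2, 3]))   # no solution
```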
In the context of partial differential equations, constraint counting is a crude but often useful way of counting the number of free functions needed to specify a solution to a partial differential equation.
Consider a second order partial differential equation in three variables, such as the two-dimensional wave equation

u_{tt} = u_{xx} + u_{yy}.

It is often profitable to think of such an equation as a rewrite rule allowing us to rewrite arbitrary partial derivatives of the function u(t,x,y) using fewer partials than would be needed for an arbitrary function. For example, if u satisfies the wave equation, then since partial derivatives commute,

u_{tyt} = u_{tty} = u_{xxy} + u_{yyy}.
In the important special case of a linear partial differential equation, Einstein asked: how many of the partial derivatives of a solution can be linearly independent? It is convenient to record his answer using an ordinary generating function

s(\xi) = \sum_{k=0}^{\infty} s_k \xi^k,

where s_k is a natural number counting the number of linearly independent partial derivatives of order k of an arbitrary function in the solution space of the equation in question.
Whenever a function satisfies some partial differential equation, we can use the corresponding rewrite rule to eliminate some of these derivatives, because further mixed partials have necessarily become linearly dependent. Specifically, the power series counting the linearly independent partial derivatives of an arbitrary function of three variables (no constraints) is

f(\xi) = \frac{1}{(1-\xi)^3} = 1 + 3\xi + 6\xi^2 + 10\xi^3 + \cdots,

but the power series counting those of a function in the solution space of some second order partial differential equation is

g(\xi) = \frac{1-\xi^2}{(1-\xi)^3} = 1 + 3\xi + 5\xi^2 + 7\xi^3 + \cdots,

which records that we can eliminate one second order partial u_{tt}, three third order partials u_{ttt}, u_{ttx}, u_{tty}, and so forth.
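These coefficients are easy to check by machine. The following sketch, assuming SymPy is available, expands both generating functions and reads off the counts:

```python
from sympy import symbols, series

xi = symbols('xi')

# Unconstrained: 1/(1-xi)^3 counts the partials of an arbitrary
# function of three variables, order by order.
f = series(1/(1 - xi)**3, xi, 0, 4).removeO()

# One second order constraint: (1-xi^2)/(1-xi)^3 counts the partials
# that remain linearly independent for a solution of the p.d.e.
g = series((1 - xi**2)/(1 - xi)**3, xi, 0, 4).removeO()

print([f.coeff(xi, k) for k in range(4)])  # [1, 3, 6, 10]
print([g.coeff(xi, k) for k in range(4)])  # [1, 3, 5, 7]
```

The difference of the two lists (0, 0, 1, 3, ...) is exactly the count of eliminated partials noted above.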
More generally, the o.g.f. for an arbitrary function of n variables is

s[n](\xi) = \frac{1}{(1-\xi)^n} = 1 + n\xi + \binom{n+1}{2}\xi^2 + \binom{n+2}{3}\xi^3 + \cdots,

in which the coefficient of \xi^k is the binomial coefficient \binom{n+k-1}{k}, the number of linearly independent partial derivatives of order k of an arbitrary function of n variables. Similarly, the counting function for a single m-th order linear partial differential equation in n variables is

g(\xi) = \frac{1-\xi^m}{(1-\xi)^n}.
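Because the coefficient of \xi^k in 1/(1-\xi)^n is \binom{n+k-1}{k}, these general counts can also be tabulated without symbolic algebra. A short sketch follows; the helper names unconstrained and constrained are introduced here purely for illustration:

```python
from math import comb

def unconstrained(n, k):
    """Number of linearly independent k-th order partials of an
    arbitrary function of n variables: C(n+k-1, k)."""
    return comb(n + k - 1, k)

def constrained(n, m, k):
    """Coefficient of xi^k in (1 - xi^m)/(1 - xi)^n: the count that
    remains after imposing one m-th order linear p.d.e. in n variables."""
    return unconstrained(n, k) - (unconstrained(n, k - m) if k >= m else 0)

# n = 3 variables, one m = 2 (second order) equation:
print([constrained(3, 2, k) for k in range(4)])  # [1, 3, 5, 7]
```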
Next, observe that

\frac{1-\xi^2}{(1-\xi)^3} = \frac{1+\xi}{(1-\xi)^2},

which can be interpreted as predicting that specifying a solution of a second order linear partial differential equation in three variables amounts to specifying two freely chosen functions of two variables, one of which is used immediately, while the second is used only after taking a first derivative.
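The algebraic identity itself is a one-line check, here sketched with SymPy:

```python
from sympy import symbols, simplify

xi = symbols('xi')
lhs = (1 - xi**2)/(1 - xi)**3
rhs = (1 + xi)/(1 - xi)**2
# 1 - xi^2 factors as (1 - xi)(1 + xi), so the difference vanishes.
print(simplify(lhs - rhs))  # 0
```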
To verify this prediction, recall the solution of the initial value problem

u_{tt} = u_{xx} + u_{yy}, \quad u(0,x,y) = p(x,y), \quad u_t(0,x,y) = q(x,y).
Applying the Laplace transform with respect to t, u(t,x,y) \mapsto [Lu](\omega,x,y), gives

-\omega^2 [Lu] + \omega \, p(x,y) + q(x,y) + [Lu]_{xx} + [Lu]_{yy} = 0.

Applying the Fourier transform with respect to x and y, [Lu](\omega,x,y) \mapsto [FLu](\omega,m,n), gives

-\omega^2 [FLu] + \omega [Fp] + [Fq] - (m^2+n^2)[FLu] = 0.
Solving this algebraic equation for [FLu] gives

[FLu](\omega,m,n) = \frac{\omega \, [Fp](m,n) + [Fq](m,n)}{\omega^2 + m^2 + n^2}.

Applying the inverse Laplace transform gives

[Fu](t,m,n) = [Fp](m,n) \cos\left(\sqrt{m^2+n^2}\, t\right) + [Fq](m,n) \, \frac{\sin\left(\sqrt{m^2+n^2}\, t\right)}{\sqrt{m^2+n^2}}.
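As a sanity check on the transform algebra, one can verify symbolically that this expression satisfies the transformed equation and the initial conditions. The sketch below assumes SymPy, with the symbol a standing in for \sqrt{m^2+n^2} and P, Q for [Fp](m,n), [Fq](m,n):

```python
from sympy import symbols, cos, sin, diff, simplify

t, a, P, Q = symbols('t a P Q', positive=True)

# Candidate obtained by inverting the Laplace transform.
Fu = P*cos(a*t) + Q*sin(a*t)/a

# In the Fourier domain the wave equation becomes Fu_tt = -a**2 * Fu,
# with initial data Fu(0) = P and Fu_t(0) = Q.
assert simplify(diff(Fu, t, 2) + a**2*Fu) == 0
assert Fu.subs(t, 0) == P
assert diff(Fu, t).subs(t, 0) == Q
print("verified")
```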
Applying the inverse Fourier transform gives

u(t,x,y) = Q(t,x,y) + P_t(t,x,y),

where

P(t,x,y) = \frac{1}{2\pi} \int_{(x-x')^2+(y-y')^2 < t^2} \frac{p(x',y') \, dx' \, dy'}{\left[ t^2 - (x-x')^2 - (y-y')^2 \right]^{1/2}},

Q(t,x,y) = \frac{1}{2\pi} \int_{(x-x')^2+(y-y')^2 < t^2} \frac{q(x',y') \, dx' \, dy'}{\left[ t^2 - (x-x')^2 - (y-y')^2 \right]^{1/2}}.

As predicted, the solution is expressed by two freely chosen functions of two variables, p and q, one of which (q) is used immediately while the other (p) is used only after taking a first derivative with respect to t.
In the case of a nonlinear equation, it will only rarely be possible to obtain the general solution in closed form. However, if the equation is quasilinear (linear in the highest order derivatives), then we can still obtain approximate information similar to the above: specifying a member of the solution space will be "modulo nonlinear quibbles" equivalent to specifying a certain number of functions in a smaller number of variables. The number of these functions is the Einstein strength of the p.d.e. In the simple example above, the strength is two, although in this case we were able to obtain more precise information.