Orthonormal basis explained
In mathematics, particularly linear algebra, an orthonormal basis for an inner product space $V$ with finite dimension is a basis for $V$ whose vectors are orthonormal, that is, they are all unit vectors and orthogonal to each other.[1] [2] [3] For example, the standard basis for a Euclidean space $\mathbb{R}^n$ is an orthonormal basis, where the relevant inner product is the dot product of vectors. The image of the standard basis under a rotation or reflection (or any orthogonal transformation) is also orthonormal, and every orthonormal basis for $\mathbb{R}^n$ arises in this fashion.
For a general inner product space $V,$ an orthonormal basis can be used to define normalized orthogonal coordinates on $V.$ Under these coordinates, the inner product becomes a dot product of vectors. Thus the presence of an orthonormal basis reduces the study of a finite-dimensional inner product space to the study of $\mathbb{R}^n$ under the dot product. Every finite-dimensional inner product space has an orthonormal basis, which may be obtained from an arbitrary basis using the Gram–Schmidt process.
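As an illustration, the Gram–Schmidt process can be sketched in a few lines of NumPy. This is a minimal sketch, not an implementation from the article's references; the input vectors below are arbitrary examples:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors
    (classical Gram-Schmidt, normalizing at each step)."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        # subtract the projection of w onto each vector already in the basis
        for u in basis:
            w = w - np.dot(w, u) * u
        basis.append(w / np.linalg.norm(w))
    return basis

# an arbitrary (non-orthogonal) basis of R^3
b = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])

# the Gram matrix of an orthonormal set is the identity
G = np.array([[np.dot(u, v) for v in b] for u in b])
```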
In functional analysis, the concept of an orthonormal basis can be generalized to arbitrary (infinite-dimensional) inner product spaces.[4] Given a pre-Hilbert space $H,$ an orthonormal basis for $H$ is an orthonormal set of vectors with the property that every vector in $H$ can be written as an infinite linear combination of the vectors in the basis. In this case, the orthonormal basis is sometimes called a Hilbert basis for $H.$ Note that an orthonormal basis in this sense is not generally a Hamel basis, since infinite linear combinations are required. Specifically, the linear span of the basis must be dense in $H,$ although not necessarily the entire space.
If we go on to Hilbert spaces, a non-orthonormal set of vectors having the same linear span as an orthonormal basis may not be a basis at all. For instance, any square-integrable function on the interval $[-1, 1]$ can be expressed (almost everywhere) as an infinite sum of Legendre polynomials (an orthonormal basis), but not necessarily as an infinite sum of the monomials $x^n.$
A different generalisation is to pseudo-inner product spaces, finite-dimensional vector spaces $M$ equipped with a non-degenerate symmetric bilinear form known as the metric tensor. In such a basis, the metric takes the form $\operatorname{diag}(+1, \ldots, +1, -1, \ldots, -1)$ with $p$ positive ones and $q$ negative ones.
Examples

For $\mathbb{R}^3,$ the set of vectors
$\left\{ e_1 = \begin{pmatrix}1&0&0\end{pmatrix}, \; e_2 = \begin{pmatrix}0&1&0\end{pmatrix}, \; e_3 = \begin{pmatrix}0&0&1\end{pmatrix} \right\}$
is called the standard basis and forms an orthonormal basis of $\mathbb{R}^3$ with respect to the standard dot product. Note that both the standard basis and standard dot product rely on viewing $\mathbb{R}^3$ as the Cartesian product $\mathbb{R} \times \mathbb{R} \times \mathbb{R}.$
Proof: A straightforward computation shows that the inner products of these vectors equal zero, $\left\langle e_1, e_2 \right\rangle = \left\langle e_1, e_3 \right\rangle = \left\langle e_2, e_3 \right\rangle = 0,$ and that each of their magnitudes equals one, $\left\|e_1\right\| = \left\|e_2\right\| = \left\|e_3\right\| = 1.$ This means that $\{e_1, e_2, e_3\}$ is an orthonormal set. All vectors $(x, y, z) \in \mathbb{R}^3$ can be expressed as a sum of the basis vectors scaled, $(x, y, z) = x e_1 + y e_2 + z e_3,$ so $\{e_1, e_2, e_3\}$ spans $\mathbb{R}^3$ and hence must be a basis. It may also be shown that the standard basis rotated about an axis through the origin or reflected in a plane through the origin also forms an orthonormal basis of $\mathbb{R}^3$.
For $\mathbb{R}^n,$ the standard basis and inner product are similarly defined. Any other orthonormal basis is related to the standard basis by an orthogonal transformation in the group $O(n)$.
For pseudo-Euclidean space $\mathbb{R}^{p,q},$ an orthogonal basis $\{e_\mu\}$ with metric $\eta$ instead satisfies $\eta(e_\mu, e_\nu) = 0$ if $\mu \neq \nu,$ $\eta(e_\mu, e_\mu) = +1$ if $1 \le \mu \le p,$ and $\eta(e_\mu, e_\mu) = -1$ if $p + 1 \le \mu \le p + q.$ Any two orthonormal bases are related by a pseudo-orthogonal transformation. In the case $(p, q) = (1, 3),$ these are Lorentz transformations.
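The pseudo-orthogonal case can be illustrated numerically with the Minkowski metric $\operatorname{diag}(+1, -1, -1, -1)$ and a Lorentz boost along one axis. A minimal NumPy sketch; the rapidity value is an arbitrary choice:

```python
import numpy as np

# Minkowski metric on R^{1,3}: one positive and three negative ones
eta = np.diag([1.0, -1.0, -1.0, -1.0])

# a Lorentz boost along the x-axis with rapidity phi (arbitrary value)
phi = 0.5
L = np.eye(4)
L[0, 0] = L[1, 1] = np.cosh(phi)
L[0, 1] = L[1, 0] = np.sinh(phi)

# L is pseudo-orthogonal: L^T eta L = eta, so it maps bases that are
# orthonormal with respect to eta to other such bases
residual = L.T @ eta @ L - eta
```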
The set $\{f_n : n \in \mathbb{Z}\}$ with $f_n(x) = \exp(2 \pi i n x),$ where $\exp$ denotes the exponential function, forms an orthonormal basis of the space of functions with finite Lebesgue integrals, $L^2([0, 1]),$ with respect to the 2-norm. This is fundamental to the study of Fourier series.
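The orthonormality of this Fourier basis can be checked by approximating the $L^2([0,1])$ inner product on a uniform grid. A minimal NumPy sketch; the grid size and the particular frequencies are arbitrary choices:

```python
import numpy as np

# sample points of a uniform grid on [0, 1)
x = np.linspace(0.0, 1.0, 1000, endpoint=False)

def f(n):
    """The basis function f_n(x) = exp(2*pi*i*n*x), sampled on the grid."""
    return np.exp(2j * np.pi * n * x)

def inner(u, v):
    """Approximate the L^2([0,1]) inner product <u, v> = integral of u * conj(v)."""
    return np.mean(u * np.conj(v))

off_diagonal = abs(inner(f(1), f(2)))      # distinct frequencies: orthogonal
norm_squared = inner(f(3), f(3)).real      # each f_n is a unit vector
```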
The set $\{e_b : b \in B\}$ with $e_b(c) = 1$ if $b = c$ and $e_b(c) = 0$ otherwise forms an orthonormal basis of $\ell^2(B).$
Basic formula

If $B$ is an orthogonal basis of $H,$ then every element $x \in H$ may be written as
$x = \sum_{b \in B} \frac{\langle x, b \rangle}{\|b\|^2} b.$
When $B$ is orthonormal, this simplifies to
$x = \sum_{b \in B} \langle x, b \rangle b$
and the square of the norm of $x$ can be given by
$\|x\|^2 = \sum_{b \in B} |\langle x, b \rangle|^2.$
Even if $B$ is uncountable, only countably many terms in this sum will be non-zero, and the expression is therefore well-defined. This sum is also called the Fourier expansion of $x,$ and the formula is usually known as Parseval's identity.
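In finite dimensions these formulas are easy to verify numerically. A minimal NumPy sketch; the basis, test vector, and random seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
# an orthonormal basis of R^5: the columns of a random orthogonal matrix
B, _ = np.linalg.qr(rng.standard_normal((5, 5)))
x = rng.standard_normal(5)

# Fourier coefficients <x, b> for each basis vector b (one per column of B)
coeffs = B.T @ x

# the expansion sum_b <x, b> b recovers x ...
reconstructed = B @ coeffs
# ... and Parseval's identity: ||x||^2 = sum_b |<x, b>|^2
norm_sq = np.dot(x, x)
coeff_sq = np.sum(coeffs ** 2)
```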
If $B$ is an orthonormal basis of $H,$ then $H$ is isomorphic to $\ell^2(B)$ in the following sense: there exists a bijective linear map $\Phi : H \to \ell^2(B)$ such that $\langle \Phi(x), \Phi(y) \rangle = \langle x, y \rangle$ for all $x, y \in H.$
Incomplete orthogonal sets

Given a Hilbert space $H$ and a set $S$ of mutually orthogonal vectors in $H,$ we can take the smallest closed linear subspace $V$ of $H$ containing $S.$ Then $S$ will be an orthogonal basis of $V,$ which may of course be smaller than $H$ itself, being an incomplete orthogonal set, or be $H,$ when it is a complete orthogonal set.
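A small numerical illustration: expanding a vector over an incomplete orthonormal set yields its orthogonal projection onto the closed span of the set, not the vector itself. The vectors below are arbitrary examples:

```python
import numpy as np

# two orthonormal vectors in R^3 -- an incomplete orthonormal set
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

x = np.array([3.0, 4.0, 5.0])
# the expansion over the incomplete set projects x onto V = span{e1, e2}
proj = np.dot(x, e1) * e1 + np.dot(x, e2) * e2
# the residual x - proj is orthogonal to V: the set is an orthogonal
# basis of the subspace V, but not of the whole space
residual = x - proj
```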
Existence
Using Zorn's lemma and the Gram–Schmidt process (or more simply well-ordering and transfinite recursion), one can show that every Hilbert space admits an orthonormal basis;[5] furthermore, any two orthonormal bases of the same space have the same cardinality (this can be proven in a manner akin to the proof of the usual dimension theorem for vector spaces, with separate cases depending on whether the larger basis candidate is countable or not). A Hilbert space is separable if and only if it admits a countable orthonormal basis. (One can prove this last statement without using the axiom of choice. However, one would have to use the axiom of countable choice.)
Choice of basis as a choice of isomorphism
For concreteness we discuss orthonormal bases for a real, $n$-dimensional vector space $V$ with a positive definite symmetric bilinear form $\phi = \langle \cdot, \cdot \rangle.$

One way to view an orthonormal basis with respect to $\phi$ is as a set of vectors $\mathcal{B} = \{e_i\},$ which allow us to write $v = \sum_i v^i e_i$ for every $v \in V,$ with coefficients $v^i \in \mathbb{R}.$ With respect to this basis, the components of $\phi$ are particularly simple: $\phi(e_i, e_j) = \delta_{ij}$ (where $\delta_{ij}$ is the Kronecker delta).
We can now view the basis as a map $\psi_{\mathcal{B}} : V \to \mathbb{R}^n$ which is an isomorphism of inner product spaces: to make this more explicit we can write $\psi_{\mathcal{B}} : (V, \phi) \to (\mathbb{R}^n, \delta_{ij}).$ Explicitly we can write $(\psi_{\mathcal{B}}(v))^i = e^i(v) = \phi(e_i, v),$ where $e^i$ is the dual basis element to $e_i.$ The inverse is a component map $C_{\mathcal{B}} : \mathbb{R}^n \to V,$ $(v^i) \mapsto \sum_i v^i e_i.$ These definitions make it manifest that there is a bijection
$\{\text{space of orthonormal bases } \mathcal{B}\} \leftrightarrow \{\text{space of isomorphisms } V \leftrightarrow \mathbb{R}^n\}.$
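The coordinate map and its inverse can be illustrated numerically, writing the basis vectors $e_i$ as the columns of an orthogonal matrix. A minimal NumPy sketch; the dimension and random seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)
# an orthonormal basis {e_i} of (R^3, dot product), stored as the columns of E
E, _ = np.linalg.qr(rng.standard_normal((3, 3)))

def psi(v):
    """The coordinate map psi_B : v -> (phi(e_i, v))_i, with phi the dot product."""
    return E.T @ v

v = rng.standard_normal(3)
w = rng.standard_normal(3)

# psi is an isomorphism of inner product spaces: it preserves phi
lhs = np.dot(psi(v), psi(w))
rhs = np.dot(v, w)
# its inverse is the component map (v^i) -> sum_i v^i e_i, i.e. c -> E c
recovered = E @ psi(v)
```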
The space of isomorphisms admits actions of orthogonal groups at either the $V$ side or the $\mathbb{R}^n$ side. For concreteness we fix the isomorphisms to point in the direction $\mathbb{R}^n \to V,$ and consider the space of such maps, $\operatorname{Iso}(\mathbb{R}^n \to V).$

This space admits a left action by the group of isometries of $V,$ that is, $R \in \operatorname{GL}(V)$ such that $\phi(\cdot, \cdot) = \phi(R \cdot, R \cdot),$ with the action given by composition: $R \cdot C = R \circ C.$

This space also admits a right action by the group of isometries of $\mathbb{R}^n,$ that is, $R_{ij} \in O(n) \subset \operatorname{Mat}_{n \times n}(\mathbb{R}),$ with the action again given by composition: $C \cdot R_{ij} = C \circ R_{ij}.$
As a principal homogeneous space
See main article: Stiefel manifold.
The set of orthonormal bases for $\mathbb{R}^n$ with the standard inner product is a principal homogeneous space or G-torsor for the orthogonal group $G = O(n),$ and is called the Stiefel manifold $V_n(\mathbb{R}^n)$ of orthonormal $n$-frames.[6] In other words, the space of orthonormal bases is like the orthogonal group, but without a choice of base point: given the space of orthonormal bases, there is no natural choice of orthonormal basis, but once one is given, there is a one-to-one correspondence between bases and the orthogonal group. Concretely, a linear map is determined by where it sends a given basis: just as an invertible map can take any basis to any other basis, an orthogonal map can take any orthonormal basis to any other orthonormal basis.
The other Stiefel manifolds $V_k(\mathbb{R}^n)$ for $k < n$ of incomplete orthonormal bases (orthonormal $k$-frames) are still homogeneous spaces for the orthogonal group, but not principal homogeneous spaces: any $k$-frame can be taken to any other $k$-frame by an orthogonal map, but this map is not uniquely determined.
- The set of orthonormal bases for $\mathbb{R}^n$ is a G-torsor for $G = O(n)$.
- The set of orthonormal bases for $\mathbb{C}^n$ is a G-torsor for $G = U(n)$.
- The set of orthonormal bases for $\mathbb{H}^n$ is a G-torsor for $G = \operatorname{Sp}(n)$.
- The set of right-handed orthonormal bases for $\mathbb{R}^n$ is a G-torsor for $G = \operatorname{SO}(n)$.
References
- Roman, Steven. Advanced Linear Algebra. 3rd ed. Springer, 2008. ISBN 978-0-387-72828-5. (page 218, ch. 9)
External links
- This Stack Exchange post discusses why the set of Dirac delta functions is not a basis of $L^2([0,1])$.
Notes and References
- Lay, David C. Linear Algebra and Its Applications. 3rd ed. Addison–Wesley, 2006. ISBN 0-321-28713-4.
- Strang, Gilbert. Linear Algebra and Its Applications. 4th ed. Brooks Cole, 2006. ISBN 0-03-010567-6.
- Axler, Sheldon. Linear Algebra Done Right. 2nd ed. Springer, 2002. ISBN 0-387-98258-2.
- Rudin, Walter. Real and Complex Analysis. McGraw-Hill, 1987. ISBN 0-07-054234-1.
- Linear Functional Analysis. https://books.google.com/books?id=-m3jBwAAQBAJ
- Web site: CU Faculty. engfac.cooper.edu. Retrieved 2021-04-15.