Observability Gramian Explained

In control theory, we may need to find out whether or not a system such as

\begin{array}{c} \dot{\boldsymbol{x}}(t)=\boldsymbol{A}\boldsymbol{x}(t)+\boldsymbol{B}\boldsymbol{u}(t)\\ \boldsymbol{y}(t)=\boldsymbol{C}\boldsymbol{x}(t)+\boldsymbol{D}\boldsymbol{u}(t) \end{array}

is observable, where \boldsymbol{A}, \boldsymbol{B}, \boldsymbol{C} and \boldsymbol{D} are, respectively, n \times n, n \times p, q \times n and q \times p matrices.

One of the many ways to determine this is by use of the Observability Gramian.

Observability in LTI Systems

Linear Time Invariant (LTI) systems are those in which the parameters \boldsymbol{A}, \boldsymbol{B}, \boldsymbol{C} and \boldsymbol{D} are invariant with respect to time.

One can determine whether the LTI system is observable simply by looking at the pair (\boldsymbol{A},\boldsymbol{C}). The following statements are equivalent:

1. The pair (\boldsymbol{A},\boldsymbol{C}) is observable.

2. The n \times n matrix

\boldsymbol{W}_{o}(t)=\int_{0}^{t}e^{\boldsymbol{A}^{T}\tau}\boldsymbol{C}^{T}\boldsymbol{C}e^{\boldsymbol{A}\tau}d\tau

is nonsingular for any t>0.

3. The nq \times n observability matrix

\left[\begin{array}{c} \boldsymbol{C}\\ \boldsymbol{CA}\\ \boldsymbol{CA}^{2}\\ \vdots\\ \boldsymbol{CA}^{n-1}\end{array}\right]

has rank n.

4. The (n+q) \times n matrix

\left[\begin{array}{c} \boldsymbol{A}-\lambda\boldsymbol{I}\\ \boldsymbol{C} \end{array}\right]

has full column rank at every eigenvalue \lambda of \boldsymbol{A}.
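The rank test of statement 3 is straightforward to carry out numerically. A minimal sketch in Python with NumPy, using a hypothetical two-state system (the values of A and C below are illustrative, not taken from the text):

```python
import numpy as np

# Hypothetical example system (values chosen only for illustration).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # n x n, stable (eigenvalues -1 and -2)
C = np.array([[1.0, 0.0]])     # q x n

n = A.shape[0]

# Statement 3: stack C, CA, ..., CA^{n-1} and check that the rank is n.
O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])
observable = np.linalg.matrix_rank(O) == n
print(observable)  # True for this pair
```

For well-conditioned problems this rank check is a quick first test before computing the Gramian itself.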

If, in addition, all eigenvalues of \boldsymbol{A} have negative real parts (\boldsymbol{A} is stable) and the unique solution of

\boldsymbol{A}^{T}\boldsymbol{W}_{o}+\boldsymbol{W}_{o}\boldsymbol{A}=-\boldsymbol{C}^{T}\boldsymbol{C}

is positive definite, then the system is observable. This solution is called the Observability Gramian and can be expressed as

\boldsymbol{W}_{o}=\int_{0}^{\infty}e^{\boldsymbol{A}^{T}\tau}\boldsymbol{C}^{T}\boldsymbol{C}e^{\boldsymbol{A}\tau}d\tau

In the following section we are going to take a closer look at the Observability Gramian.

Observability Gramian

The Observability Gramian can be found as the solution of the Lyapunov equation given by

\boldsymbol{A}^{T}\boldsymbol{W}_{o}+\boldsymbol{W}_{o}\boldsymbol{A}=-\boldsymbol{C}^{T}\boldsymbol{C}

In fact, we can see that if we take

\boldsymbol{W}_{o}=\int_{0}^{\infty}e^{\boldsymbol{A}^{T}\tau}\boldsymbol{C}^{T}\boldsymbol{C}e^{\boldsymbol{A}\tau}d\tau

as a solution, we find that:

\begin{array}{ccccc} \boldsymbol{A}^{T}\boldsymbol{W}_{o}+\boldsymbol{W}_{o}\boldsymbol{A} & = & \int_{0}^{\infty}\boldsymbol{A}^{T}e^{\boldsymbol{A}^{T}\tau}\boldsymbol{C}^{T}\boldsymbol{C}e^{\boldsymbol{A}\tau}d\tau & + & \int_{0}^{\infty}e^{\boldsymbol{A}^{T}\tau}\boldsymbol{C}^{T}\boldsymbol{C}e^{\boldsymbol{A}\tau}\boldsymbol{A}d\tau\\ & = & \int_{0}^{\infty}\frac{d}{d\tau}\left(e^{\boldsymbol{A}^{T}\tau}\boldsymbol{C}^{T}\boldsymbol{C}e^{\boldsymbol{A}\tau}\right)d\tau & = & \left.e^{\boldsymbol{A}^{T}\tau}\boldsymbol{C}^{T}\boldsymbol{C}e^{\boldsymbol{A}\tau}\right|_{\tau=0}^{\infty}\\ & = & \boldsymbol{0}-\boldsymbol{C}^{T}\boldsymbol{C} & & \\ & = & -\boldsymbol{C}^{T}\boldsymbol{C}\end{array}

where we used the fact that e^{\boldsymbol{A}t}\rightarrow 0 as t\rightarrow\infty for stable \boldsymbol{A} (all its eigenvalues have negative real parts). This shows us that \boldsymbol{W}_{o} is indeed the solution of the Lyapunov equation under analysis.
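Numerically, the Gramian can be obtained directly from this Lyapunov equation. A minimal sketch using SciPy's `solve_continuous_lyapunov` on a hypothetical stable system (illustrative values, not from the text):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical stable system (eigenvalues of A are -1 and -2).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])

# solve_continuous_lyapunov(a, q) solves  a X + X a^T = q,
# so passing a = A^T, q = -C^T C yields  A^T Wo + Wo A = -C^T C.
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

# For a stable, observable pair, Wo is symmetric positive definite.
print(np.allclose(Wo, Wo.T))               # True
print(np.all(np.linalg.eigvalsh(Wo) > 0))  # True
```

Solving the Lyapunov equation this way avoids evaluating the improper integral directly.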

Properties

We can see that \boldsymbol{C}^{T}\boldsymbol{C} is a symmetric matrix; therefore, so is \boldsymbol{W}_{o}.

We can again use the fact that \boldsymbol{A} is stable (all its eigenvalues have negative real parts) to show that \boldsymbol{W}_{o} is unique. To prove this, suppose we have two different solutions of

\boldsymbol{A}^{T}\boldsymbol{W}_{o}+\boldsymbol{W}_{o}\boldsymbol{A}=-\boldsymbol{C}^{T}\boldsymbol{C}

given by \boldsymbol{W}_{o1} and \boldsymbol{W}_{o2}. Then we have:

\boldsymbol{A}^{T}(\boldsymbol{W}_{o1}-\boldsymbol{W}_{o2})+(\boldsymbol{W}_{o1}-\boldsymbol{W}_{o2})\boldsymbol{A}=\boldsymbol{0}

Multiplying by e^{\boldsymbol{A}^{T}t} on the left and by e^{\boldsymbol{A}t} on the right leads us to

e^{\boldsymbol{A}^{T}t}\left[\boldsymbol{A}^{T}(\boldsymbol{W}_{o1}-\boldsymbol{W}_{o2})+(\boldsymbol{W}_{o1}-\boldsymbol{W}_{o2})\boldsymbol{A}\right]e^{\boldsymbol{A}t}=\frac{d}{dt}\left[e^{\boldsymbol{A}^{T}t}(\boldsymbol{W}_{o1}-\boldsymbol{W}_{o2})e^{\boldsymbol{A}t}\right]=\boldsymbol{0}

Integrating from 0 to \infty:

\left.e^{\boldsymbol{A}^{T}t}(\boldsymbol{W}_{o1}-\boldsymbol{W}_{o2})e^{\boldsymbol{A}t}\right|_{t=0}^{\infty}=\boldsymbol{0}

Using the fact that e^{\boldsymbol{A}t}\rightarrow 0 as t\rightarrow\infty:

\boldsymbol{0}-(\boldsymbol{W}_{o1}-\boldsymbol{W}_{o2})=\boldsymbol{0}

In other words, \boldsymbol{W}_{o} has to be unique.

Also, we can see that

\boldsymbol{x}^{T}\boldsymbol{W}_{o}\boldsymbol{x}=\int_{0}^{\infty}\boldsymbol{x}^{T}e^{\boldsymbol{A}^{T}t}\boldsymbol{C}^{T}\boldsymbol{C}e^{\boldsymbol{A}t}\boldsymbol{x}\,dt=\int_{0}^{\infty}\left\Vert \boldsymbol{C}e^{\boldsymbol{A}t}\boldsymbol{x}\right\Vert _{2}^{2}dt

is positive for any nonzero \boldsymbol{x} (assuming the non-degenerate case where \boldsymbol{C}e^{\boldsymbol{A}t}\boldsymbol{x} is not identically zero), and that makes \boldsymbol{W}_{o} a positive definite matrix.
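The identity between the quadratic form and the output-energy integral can be checked numerically. A sketch on a hypothetical stable system (illustrative values), comparing \boldsymbol{x}^{T}\boldsymbol{W}_{o}\boldsymbol{x} against a quadrature approximation of the integral:

```python
import numpy as np
from scipy.linalg import expm, solve_continuous_lyapunov
from scipy.integrate import trapezoid

# Hypothetical stable system (eigenvalues of A are -1 and -2).
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])
Wo = solve_continuous_lyapunov(A.T, -C.T @ C)

x = np.array([1.0, 2.0])  # an arbitrary nonzero test vector

# Quadratic form from the Gramian ...
quad = x @ Wo @ x

# ... versus the output energy  int_0^inf ||C e^{At} x||^2 dt,
# approximated by the trapezoidal rule on a long finite horizon.
ts = np.linspace(0.0, 20.0, 4000)
norms = [np.linalg.norm(C @ expm(A * t) @ x) ** 2 for t in ts]
integral = trapezoid(norms, ts)

print(np.isclose(quad, integral, rtol=1e-3))  # True
```

The truncation of the horizon is harmless here because the integrand decays exponentially for stable A.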

More properties of observable systems can be found in [1], as well as proofs of the other equivalent statements of "The pair (\boldsymbol{A},\boldsymbol{C}) is observable" presented in the section Observability in LTI Systems.

Discrete Time Systems

For discrete time systems such as

\begin{array}{c} \boldsymbol{x}[k+1]\boldsymbol{=Ax}[k]+\boldsymbol{Bu}[k]\\ \boldsymbol{y}[k]=\boldsymbol{Cx}[k]+\boldsymbol{Du}[k] \end{array}

one can check that there are equivalences for the statement "The pair (\boldsymbol{A},\boldsymbol{C}) is observable" (the equivalences are much like those for the continuous time case).

We are interested in the equivalence that claims that, if the pair (\boldsymbol{A},\boldsymbol{C}) is observable and all eigenvalues of \boldsymbol{A} have magnitude less than 1 (\boldsymbol{A} is stable), then the unique solution of

\boldsymbol{A}^{T}\boldsymbol{W}_{do}\boldsymbol{A}-\boldsymbol{W}_{do}=-\boldsymbol{C}^{T}\boldsymbol{C}

is positive definite and given by

\boldsymbol{W}_{do}=\sum_{m=0}^{\infty}(\boldsymbol{A}^{T})^{m}\boldsymbol{C}^{T}\boldsymbol{C}\boldsymbol{A}^{m}

This is called the discrete Observability Gramian. We can easily see the correspondence with the continuous time case: if we can check that \boldsymbol{W}_{do} is positive definite, and all eigenvalues of \boldsymbol{A} have magnitude less than 1, then the pair (\boldsymbol{A},\boldsymbol{C}) is observable. More properties and proofs can be found in [2].
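A sketch of the discrete case using SciPy's `solve_discrete_lyapunov`, cross-checked against a truncated version of the series above (the system matrices are hypothetical, chosen for illustration):

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical discrete-time system; eigenvalues 0.5 and 0.3 lie
# strictly inside the unit circle, so A is (Schur) stable.
A = np.array([[0.5, 0.1],
              [0.0, 0.3]])
C = np.array([[1.0, 1.0]])

# solve_discrete_lyapunov(a, q) solves  a X a^T - X = -q,
# so passing a = A^T, q = C^T C yields  A^T Wdo A - Wdo = -C^T C.
Wdo = solve_discrete_lyapunov(A.T, C.T @ C)

# Cross-check against a truncated series  sum_m (A^T)^m C^T C A^m.
S = sum(np.linalg.matrix_power(A.T, m) @ C.T @ C @ np.linalg.matrix_power(A, m)
        for m in range(200))
print(np.allclose(Wdo, S))                  # True
print(np.all(np.linalg.eigvalsh(Wdo) > 0))  # True (the pair is observable)
```

The series converges quickly here because the spectral radius of A is well below 1.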

Linear Time Variant Systems

Linear time variant (LTV) systems are those in the form:

\begin{array}{c} \dot{\boldsymbol{x}}(t)=\boldsymbol{A}(t)\boldsymbol{x}(t)+\boldsymbol{B}(t)\boldsymbol{u}(t)\\ \boldsymbol{y}(t)=\boldsymbol{C}(t)\boldsymbol{x}(t) \end{array}

That is, the matrices \boldsymbol{A}, \boldsymbol{B} and \boldsymbol{C} have entries that vary with time. Again, as in the continuous time case and in the discrete time case, one may be interested in discovering whether the system given by the pair (\boldsymbol{A}(t),\boldsymbol{C}(t)) is observable. This can be done in a way very similar to that of the preceding cases.

The system (\boldsymbol{A}(t),\boldsymbol{C}(t)) is observable at time t_{0} if and only if there exists a finite t_{1}>t_{0} such that the n \times n matrix, also called the Observability Gramian and given by

\boldsymbol{W}_{o}(t_{0},t_{1})=\int_{t_{0}}^{t_{1}}\boldsymbol{\Phi}^{T}(\tau,t_{0})\boldsymbol{C}^{T}(\tau)\boldsymbol{C}(\tau)\boldsymbol{\Phi}(\tau,t_{0})d\tau

is nonsingular, where \boldsymbol{\Phi}(t,\tau) is the state transition matrix of \dot{\boldsymbol{x}}(t)=\boldsymbol{A}(t)\boldsymbol{x}(t).

Again, we have a similar method to determine whether or not a system is observable.
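One way to evaluate the LTV Gramian numerically is to propagate the state transition matrix and the Gramian integrand together as a single ODE. A sketch with SciPy's `solve_ivp`, for a hypothetical time-varying A(t) chosen only for illustration:

```python
import numpy as np
from scipy.integrate import solve_ivp

n = 2

# Hypothetical time-varying system matrices.
def A(t):
    return np.array([[0.0, 1.0],
                     [-2.0 - np.sin(t), -3.0]])

def C(t):
    return np.array([[1.0, 0.0]])

def rhs(t, z):
    """Jointly propagate Phi(t, t0) and accumulate the Gramian integrand."""
    Phi = z[:n * n].reshape(n, n)
    dPhi = A(t) @ Phi                 # dPhi/dt = A(t) Phi
    Ct = C(t)
    dW = Phi.T @ Ct.T @ Ct @ Phi      # integrand of W_o(t0, t1)
    return np.concatenate([dPhi.ravel(), dW.ravel()])

t0, t1 = 0.0, 5.0
z0 = np.concatenate([np.eye(n).ravel(), np.zeros(n * n)])  # Phi(t0,t0)=I, W=0
sol = solve_ivp(rhs, (t0, t1), z0, rtol=1e-8, atol=1e-10)
Wo = sol.y[n * n:, -1].reshape(n, n)

# Nonsingular (here positive definite) Gramian => observable at t0.
print(np.all(np.linalg.eigvalsh(Wo) > 0))  # True
```

This avoids computing \Phi(\tau,t_{0}) separately at every quadrature node.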

Properties of \boldsymbol{W}_{o}(t_{0},t_{1})

The Observability Gramian \boldsymbol{W}_{o}(t_{0},t_{1}) has the following property:

\boldsymbol{W}_{o}(t_{0},t_{1})=\boldsymbol{W}_{o}(t_{0},t)+\boldsymbol{\Phi}^{T}(t,t_{0})\boldsymbol{W}_{o}(t,t_{1})\boldsymbol{\Phi}(t,t_{0})

which can easily be seen from the definition of \boldsymbol{W}_{o}(t_{0},t_{1}) and from the property of the state transition matrix that claims that:

\boldsymbol{\Phi}(t_{1},t_{0})=\boldsymbol{\Phi}(t_{1},\tau)\boldsymbol{\Phi}(\tau,t_{0})
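This composition property can be verified numerically. A standalone sketch for the special case of constant A and C, where \Phi(t,\tau)=e^{\boldsymbol{A}(t-\tau)} (illustrative values, not from the text):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import trapezoid

# Constant (LTI) special case, so Phi(t, tau) = e^{A (t - tau)}.
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])

def gramian(ta, tb, num=2000):
    """Finite-horizon Gramian int_ta^tb Phi^T C^T C Phi dtau by quadrature."""
    ts = np.linspace(ta, tb, num)
    vals = np.array([expm(A * (t - ta)).T @ C.T @ C @ expm(A * (t - ta))
                     for t in ts])
    return trapezoid(vals, ts, axis=0)

t0, t, t1 = 0.0, 1.0, 3.0
Phi = expm(A * (t - t0))
lhs = gramian(t0, t1)
rhs = gramian(t0, t) + Phi.T @ gramian(t, t1) @ Phi

print(np.allclose(lhs, rhs, atol=1e-3))  # True
```

Splitting the integral at an intermediate time t and pulling \Phi(t,t_{0}) out of the second integral is exactly the derivation behind the property.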

More about the Observability Gramian can be found in [3].

References

  1. Chen, Chi-Tsong (1999). Linear System Theory and Design (3rd ed.). New York: Oxford University Press. ISBN 0-19-511777-8. p. 156.
  2. Chen, Chi-Tsong (1999). Linear System Theory and Design (3rd ed.). New York: Oxford University Press. ISBN 0-19-511777-8. p. 171.
  3. Chen, Chi-Tsong (1999). Linear System Theory and Design (3rd ed.). New York: Oxford University Press. ISBN 0-19-511777-8. p. 179.
