November 22, 2024

Introduction to (Direct) Lyapunov Stability Analysis With Examples


In this control engineering and control theory tutorial, we provide a brief introduction to Lyapunov stability analysis. The YouTube video accompanying this tutorial is given below.

Motivation for Lyapunov Stability Analysis

Consider the following nonlinear dynamics

(1)   \begin{align*}\dot{\mathbf{x}}=\mathbf{f}(\mathbf{x})\end{align*}

where \mathbf{x}\in \mathbb{R}^{n} is the state vector, and \mathbf{f} is a (nonlinear) vector function. Let \mathbf{x}^{*}=0 be an equilibrium point of (1). Here, for simplicity, we assume that 0 is an equilibrium point. However, everything stated in this tutorial can be generalized to the case of non-zero equilibrium points. This can be achieved with a simple change of state-space variables.
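
For example, if \mathbf{x}^{*}\neq 0 is an equilibrium point, that is, \mathbf{f}(\mathbf{x}^{*})=0, then by introducing the shifted state \mathbf{y}=\mathbf{x}-\mathbf{x}^{*}, we obtain

\begin{align*}\dot{\mathbf{y}}=\dot{\mathbf{x}}=\mathbf{f}(\mathbf{y}+\mathbf{x}^{*})=\tilde{\mathbf{f}}(\mathbf{y}),\;\;\; \tilde{\mathbf{f}}(0)=\mathbf{f}(\mathbf{x}^{*})=0\end{align*}

and consequently, \mathbf{y}^{*}=0 is an equilibrium point of the transformed dynamics.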

We want to provide answers to the following questions:

  1. Is the equilibrium point \mathbf{x}^{*}=0 stable?
  2. Is the equilibrium point \mathbf{x}^{*}=0 asymptotically stable?

In the case of linear systems, these questions can easily be answered by computing the eigenvalues of the linear system. However, in the case of nonlinear systems, these questions are far from trivial.
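
As a brief illustration, the eigenvalue test for a linear system \dot{\mathbf{x}}=A\mathbf{x} can be carried out numerically. The sketch below uses Python and a hypothetical example matrix A (not taken from this tutorial):

import numpy as np

# Hypothetical linear system x_dot = A x (example matrix for illustration only)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

eigenvalues = np.linalg.eigvals(A)
print("Eigenvalues:", eigenvalues)

# The origin of a linear system is asymptotically stable if and only if
# all eigenvalues have strictly negative real parts
if np.all(eigenvalues.real < 0):
    print("The equilibrium point x* = 0 is asymptotically stable.")
elif np.any(eigenvalues.real > 0):
    print("The equilibrium point x* = 0 is unstable.")
else:
    print("Eigenvalues on the imaginary axis: marginal case.")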

There are at least two methods that we can use to answer these questions:

  1. Lyapunov’s indirect method (also known as Lyapunov’s first method). The main idea of this approach is to linearize the dynamics of the nonlinear system around the equilibrium point and to investigate the stability of the linearized dynamics. If the linearized dynamics is asymptotically stable (all eigenvalues of the linearization have negative real parts), then the equilibrium point of the nonlinear system is also (locally) asymptotically stable. If the linearized dynamics is unstable (at least one eigenvalue has a positive real part), then the equilibrium point of the nonlinear system is also unstable. If the linearization is only marginally stable, this method is inconclusive. A small numerical sketch of this method is given after this list.
  2. Lyapunov’s direct method (also known as Lyapunov’s second method). This approach is based on finding a scalar function of the state that satisfies certain properties. Namely, this function has to be continuously differentiable and positive definite. If the first time derivative of this function along the state trajectories of the system is negative semidefinite, then we can conclude that the equilibrium point is stable. On the other hand, if the first time derivative along the state trajectories is negative definite, then the equilibrium point is asymptotically stable.
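
The following Python sketch illustrates the indirect method on a hypothetical nonlinear system (the function f below is only an illustrative example, not a system from this tutorial): the Jacobian at the origin is approximated by central finite differences and its eigenvalues are inspected.

import numpy as np

# Hypothetical nonlinear dynamics x_dot = f(x) with an equilibrium point at the origin
def f(x):
    x1, x2 = x
    return np.array([x2, -x1 - x2 + x1**3])

def numerical_jacobian(f, x0, eps=1e-6):
    # Central-difference approximation of the Jacobian of f at x0
    n = len(x0)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (f(x0 + e) - f(x0 - e)) / (2.0 * eps)
    return J

A = numerical_jacobian(f, np.zeros(2))
print("Linearized system matrix A:\n", A)
# If all eigenvalues have negative real parts, the equilibrium point of the
# nonlinear system is (locally) asymptotically stable
print("Eigenvalues of A:", np.linalg.eigvals(A))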

This tutorial is dedicated to the practical application of Lyapunov’s direct method. Before reading this tutorial, it is a very good idea to revise the concept of positive (negative) definite functions. This concept is explained in our previous tutorial, which can be found here.

Formal Statement of the Lyapunov Theorem for the Local Stability of an Equilibrium Point

Before we introduce examples, it is very important to state the Lyapunov local stability theorem for an equilibrium point of a dynamical system. This theorem is stated in a number of standard textbooks on nonlinear systems, such as:

  1. Nonlinear Systems, by Hassan K. Khalil (edition from 1992, page 101.)
  2. Applied Nonlinear Control, by Jean-Jacques E. Slotine and Weiping Li (edition from 1991, page 62.)

Lyapunov (local) Stability Theorem: Consider a dynamical system

(2)   \begin{align*}\dot{\mathbf{x}}=\mathbf{f}(\mathbf{x})\end{align*}

where \mathbf{x}\in \mathbb{R}^{n} is the state vector, and \mathbf{f} is a (nonlinear) vector function. Let \mathbf{x}^{*}=0 be an equilibrium point of (2). Let us assume that we are able to find a continuously differentiable scalar function V(\mathbf{x}) of the state vector, defined on a neighborhood D of \mathbf{x}^{*}, that satisfies the following two conditions:

  1. The function V(\mathbf{x}) is (locally) positive definite in D. That is, V(0)=0 and V(\mathbf{x})>0 for all \mathbf{x} in D-\{0\}.
  2. The time derivative of V(\mathbf{x}) along state trajectories of the system is (locally) negative semi-definite in D. That is, \dot{V}(\mathbf{x})\le 0 for all \mathbf{x} in D.

Then, \mathbf{x}^{*}=0 is stable.

In addition, if the first time derivative of V(\mathbf{x}) along the state trajectories of the system is (locally) negative definite in D-\{0\}, that is, \dot{V}(\mathbf{x}) < 0 for all \mathbf{x} in D-\{0\}, then \mathbf{x}^{*}=0 is asymptotically stable.
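
In both conditions, the time derivative of V(\mathbf{x}) along the state trajectories is computed by using the chain rule and the system dynamics (2):

\begin{align*}\dot{V}(\mathbf{x})=\frac{\partial V(\mathbf{x})}{\partial \mathbf{x}}\dot{\mathbf{x}}=\frac{\partial V(\mathbf{x})}{\partial \mathbf{x}}\mathbf{f}(\mathbf{x})=\sum_{i=1}^{n}\frac{\partial V(\mathbf{x})}{\partial x_{i}}f_{i}(\mathbf{x})\end{align*}

where f_{i}(\mathbf{x}) is the i-th entry of \mathbf{f}(\mathbf{x}).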

In this brief tutorial, we give an intuitive explanation of Lyapunov’s stability theorem for a one-dimensional system. This example is also explained in the accompanying video tutorial.

A positive definite function is shown below.

The function V(\mathbf{x}) that satisfies the conditions of this theorem is called a Lyapunov function.

First Example of Stability Analysis

We consider the pendulum system shown in the figure below.

In this figure, \theta is the rotation angle, l is the length of the rod, m is the mass of the ball, g is the gravitational acceleration constant, and F is the control force (external input). In our previous tutorial, which can be found here, we derived the equation of motion of this system. The equation has the following form

(3)   \begin{align*}\ddot{\theta}+\frac{g}{l}\sin(\theta)=F\end{align*}

For stability analysis, we always assume that the control force F is equal to zero. That is, we study the dynamical behavior of the system without any external control actions. Under this assumption, the dynamics is given by the following equation

(4)   \begin{align*}\ddot{\theta}+\frac{g}{l}\sin(\theta)=0\end{align*}

Next, let us transform this equation into the state-space form by assigning the state variables

(5)   \begin{align*}x_{1}=\theta, \;\; x_{2}=\dot{\theta} \end{align*}

The state-space model has the following form

(6)   \begin{align*}\begin{bmatrix}\dot{x}_{1} \\ \dot{x}_{2} \end{bmatrix}=\begin{bmatrix} x_{2}  \\ -\frac{g}{l}\sin(x_{1})  \end{bmatrix}\end{align*}
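
For later use in simulations, the state-space model (6) can be transcribed directly into code. Below is a minimal Python sketch, where the numerical values of g and l are placeholders:

import numpy as np

g = 9.81   # gravitational acceleration [m/s^2]
l = 1.0    # rod length [m], placeholder value

def pendulum_dynamics(t, x):
    # State-space model (6): x[0] = theta, x[1] = theta_dot
    return [x[1], -(g / l) * np.sin(x[0])]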

Next, let us simulate the state trajectories of this system. In our previous tutorial, which can be found here, we explained how to simulate state trajectories in MATLAB. By entering the system dynamics (6) in the developed MATLAB code, we obtain a phase portrait of the system. The phase portrait is given below. It is generated for the initial conditions x_{1}(0)=\pi/6 and x_{2}(0)=0.

Figure 2: Phase portrait of the system. The circle denotes the initial state, and the square denotes the state after 1.5 seconds.

As can be observed from the figure above, the equilibrium point is stable; however, it is NOT asymptotically stable.
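
For readers who prefer Python over the MATLAB code from our previous tutorial, a comparable trajectory simulation can be obtained with the sketch below (the parameter values of g and l are placeholders):

import numpy as np
from scipy.integrate import solve_ivp
import matplotlib.pyplot as plt

g, l = 9.81, 1.0   # placeholder parameter values

# Simulate the pendulum dynamics (6) from x1(0) = pi/6, x2(0) = 0 for 1.5 seconds
solution = solve_ivp(lambda t, x: [x[1], -(g / l) * np.sin(x[0])],
                     [0.0, 1.5], [np.pi / 6, 0.0], max_step=0.01)

plt.plot(solution.y[0], solution.y[1])
plt.xlabel('x1 [rad]')
plt.ylabel('x2 [rad/s]')
plt.title('Phase portrait: a single trajectory of the pendulum')
plt.show()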

Let us now construct a Lyapunov function that will be used to formally prove that the equilibrium point is stable. For that purpose, let us consider the figure shown below

A good initial choice of a Lyapunov function candidate is the total energy of the system. This is because if the system is asymptotically stable, then the energy of the system should decrease over time, and if the system is (only) stable, then the total energy should remain constant or bounded (not increasing). Consequently, let us define the total energy of the system. The total energy is the sum of the kinetic and potential energy:

(7)   \begin{align*}V(\mathbf{x})=E_{k}+E_{p}=\frac{1}{2}m \cdot v^{2}+mgh\end{align*}

where v is the velocity of the mass, and h is the distance of the mass from the reference plane for calculating the potential energy. Since v=l\dot{\theta}= lx_{2} and h=l-l\cos(\theta)=l-l\cos(x_{1}), the total energy becomes:

(8)   \begin{align*}V(\mathbf{x})=\frac{1}{2}m  l^{2}x_{2}^{2}+mg\big( l-l\cos(x_{1}) \big)=\frac{1}{2}ml^2 x_{2}^{2}+mgl\big(1-\cos(x_{1}) \big)\end{align*}

For V(\mathbf{x}) to be a Lyapunov function candidate, it should be positive definite, with V(0)=0. Since \cos(0)=1 and the kinetic energy is zero for x_{2}=0, it is obvious that V(0)=0. Next, let us investigate the positive definiteness of this function. The kinetic energy term is positive for x_{2}\ne 0. The potential energy term is positive for x_{1}\ne 2k\pi, where k=0,\pm 1,\pm 2,\ldots. Consequently, we conclude that the function (8) is positive definite over the domain defined by

(9)   \begin{align*}-2\pi < x_{1} < 2 \pi,\;\; -\infty <x_{2}<\infty\end{align*}
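
As a quick sanity check, the positive definiteness of (8) over the domain (9) can also be verified numerically on a grid. The Python sketch below uses placeholder values for m, l, and g, and truncates the x_{2} range to a finite interval:

import numpy as np

m, l, g = 1.0, 1.0, 9.81   # placeholder parameter values

def V(x1, x2):
    # Total energy (8) used as the Lyapunov function candidate
    return 0.5 * m * l**2 * x2**2 + m * g * l * (1.0 - np.cos(x1))

# Grid over the domain (9); x2 is truncated to a finite range for the check
x1 = np.linspace(-2 * np.pi + 0.01, 2 * np.pi - 0.01, 401)
x2 = np.linspace(-10.0, 10.0, 401)
X1, X2 = np.meshgrid(x1, x2)
values = V(X1, X2)

print("V(0, 0) =", V(0.0, 0.0))
# V should be strictly positive everywhere on the grid except at the origin
away_from_origin = (X1**2 + X2**2) > 1e-12
print("Minimum of V away from the origin:", values[away_from_origin].min())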

Next, let us compute the first time derivative of the function V(\mathbf{x}) along the state trajectories of the system:

(10)   \begin{align*}\dot{V}(\mathbf{x}) &  = \frac{dV(\mathbf{x})}{dt}= \frac{1}{2} 2 ml^{2}x_{2}\dot{x}_{2}+mgl\sin(x_{1})\dot{x}_{1} \\&= ml^{2}x_{2} \dot{x}_{2}+mgl\sin(x_{1})\dot{x}_{1} \end{align*}

Next, we substitute \dot{x}_{1} and \dot{x}_{2} from (6) into (10). As a result, we obtain

(11)   \begin{align*}\dot{V}(\mathbf{x}) & = ml^{2}x_{2} \dot{x}_{2}+mgl\sin(x_{1})\dot{x}_{1} \\& = ml^{2}x_{2}( -\frac{g}{l}\sin(x_{1})) +mgl\sin(x_{1})x_{2} \\& = - ml^{2}\frac{g}{l}\sin(x_{1})x_{2}+mgl\sin(x_{1})x_{2} \\& = - mgl\sin(x_{1})x_{2}+mgl\sin(x_{1})x_{2} =0\end{align*}

Consequently, the time derivative \dot{V}(\mathbf{x})=0 is negative semi-definite (\dot{V}(\mathbf{x})\le 0), and according to the Lyapunov local stability theorem, the equilibrium point \mathbf{x}^{*}=0 is stable. However, since \dot{V}(\mathbf{x}) is not negative definite, asymptotic stability cannot be concluded from this theorem.

The physical interpretation of this result is that the energy stays constant along the trajectories of the system! This is actually the law of energy conservation. That is, the potential energy transforms into kinetic energy and vice versa. However, the total energy remains constant!
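
Finally, the computation in (10) and (11) can also be verified symbolically. Below is a minimal Python sketch based on sympy, with m, l, and g kept as symbolic parameters:

import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
m, l, g = sp.symbols('m l g', positive=True)

# Lyapunov function candidate (8)
V = sp.Rational(1, 2) * m * l**2 * x2**2 + m * g * l * (1 - sp.cos(x1))

# Pendulum dynamics (6)
x1_dot = x2
x2_dot = -(g / l) * sp.sin(x1)

# Time derivative of V along the state trajectories, as in (10)
V_dot = sp.diff(V, x1) * x1_dot + sp.diff(V, x2) * x2_dot
print(sp.simplify(V_dot))   # prints 0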