Dynamic Behavior

From FBSwiki

Prev: Examples | Chapter 4 - Dynamic Behavior | Next: Linear Systems

In this chapter we give a broad discussion of the behavior of dynamical systems, focused on systems modeled by nonlinear differential equations. This allows us to discuss equilibrium points, stability, limit cycles and other key concepts of dynamical systems. We also introduce some methods for analyzing global behavior of solutions.

Textbook Contents

Dynamic Behavior (pdf, 10Aug12)

  • 1. Solving Differential Equations
  • 2. Qualitative Analysis
  • 3. Stability
  • 4. Lyapunov Stability
  • 5. Parametric and Non-Local Behavior
  • 6. Further Reading
  • Exercises

Lecture Materials

Supplemental Information

Chapter Summary

This chapter introduces the basic concepts and tools of dynamical systems.

  1. We say that $x(t)$ is a solution of a differential equation on the time interval $t_0$ to $t_f$ with initial value $x_0$ if it satisfies

        $x(t_0) = x_0 \quad\text{and}\quad \dot x(t) = F(x(t)) \quad\text{for all}\quad t_0 \leq t \leq t_f.$

    We will usually assume $t_0 = 0$. For most differential equations we will encounter, there is a unique solution for a given initial condition. Numerical tools such as MATLAB and Mathematica can be used to obtain numerical solutions for $x(t)$ given the function $F(x)$.
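    As a minimal sketch (not from the text), the same computation can be done in Python with SciPy's solve_ivp; the vector field F used here is just an illustrative choice:

      # Sketch: numerically solve dx/dt = F(x) with x(0) = x0 on [t0, tf].
      import numpy as np
      from scipy.integrate import solve_ivp

      def F(t, x):
          # Example vector field (a damped oscillator); replace with the F(x) of interest.
          return [x[1], -x[0] - 0.25 * x[1]]

      x0 = [1.0, 0.0]                       # initial condition x(t0) = x0
      sol = solve_ivp(F, (0.0, 20.0), x0)   # integrate from t0 = 0 to tf = 20
      print(sol.t[-1], sol.y[:, -1])        # final time and state x(tf)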

  2. An equilibrium point for a dynamical system represents a point $x_e$ such that if $x(0) = x_e$ then $x(t) = x_e$ for all $t$. Equilibrium points represent stationary conditions for the dynamics of a system. A limit cycle for a dynamical system is a solution $x(t)$ which is periodic with some period $T$, so that $x(t + T) = x(t)$ for all $t$.
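    Since an equilibrium point is a stationary solution, it satisfies $F(x_e) = 0$ and can be computed with a numerical root finder. A minimal sketch in Python (the vector field and the initial guess are assumed examples):

      # Sketch: find an equilibrium point x_e by solving F(x_e) = 0 numerically.
      import numpy as np
      from scipy.optimize import fsolve

      def F(x):
          # Example vector field; equilibria satisfy x2 = 0 and sin(x1) = 0.
          return [x[1], -np.sin(x[0]) - 0.5 * x[1]]

      x_e = fsolve(F, [3.0, 0.1])   # initial guess near the expected equilibrium
      print(x_e)                    # approximately [pi, 0]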

  3. An equilibrium point is (locally) stable if initial conditions that start near an equilibrium point stay near that equilibrium point. An equilibrium point is (locally) asymptotically stable if it is stable and, in addition, the state of the system converges to the equilibrium point as time increases. An equilibrium point is unstable if it is not stable. Similar definitions can be used to define the stability of a limit cycle.

  4. Phase portraits provide a convenient way to understand the behavior of 2-dimensional dynamical systems. A phase portrait is a graphical representation of the dynamics obtained by plotting the state $x(t) = (x_1(t), x_2(t))$ in the plane. This portrait is often augmented by plotting an arrow in the plane corresponding to $F(x)$, which shows the rate of change of the state. The following diagrams illustrate the basic features of dynamical systems:

    [Figures (left to right): Doscpp.png, Oscpp.png, Stablepp.png]
    Left: an asymptotically stable equilibrium point at $x = (0, 0)$. Middle: a limit cycle of radius one, with an unstable equilibrium point at $x = (0, 0)$. Right: a stable equilibrium point at $x = (0, 0)$ (nearby initial conditions stay nearby).
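    A minimal Python/Matplotlib sketch of a phase portrait for a 2D system, combining a few simulated trajectories with arrows for $F(x)$ (the vector field is an assumed example, not the one used for the figures above):

      # Sketch: phase portrait with trajectories and arrows showing F(x).
      import numpy as np
      import matplotlib.pyplot as plt
      from scipy.integrate import solve_ivp

      def F(t, x):
          # Example 2D vector field (damped oscillator).
          return [x[1], -x[0] - 0.5 * x[1]]

      # Arrows showing the rate of change of the state on a grid.
      x1, x2 = np.meshgrid(np.linspace(-2, 2, 15), np.linspace(-2, 2, 15))
      dx1, dx2 = F(0, [x1, x2])
      plt.quiver(x1, x2, dx1, dx2, color='gray')

      # A few trajectories from different initial conditions.
      for x0 in [(1.5, 0.0), (-1.0, 1.5), (0.0, -1.8)]:
          sol = solve_ivp(F, (0, 15), x0, t_eval=np.linspace(0, 15, 500))
          plt.plot(sol.y[0], sol.y[1])

      plt.xlabel('$x_1$'); plt.ylabel('$x_2$')
      plt.show()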

  5. A linear system

        $\frac{dx}{dt} = A x$

    is asymptotically stable if and only if all eigenvalues of $A$ have strictly negative real part, and is unstable if any eigenvalue of $A$ has strictly positive real part. A nonlinear system can be approximated by a linear system around an equilibrium point by using the relationship

        $\dot x = F(x_e) + \left.\frac{\partial F}{\partial x}\right|_{x_e} (x - x_e) + \text{higher order terms in } (x - x_e).$

    Since $F(x_e) = 0$, we can approximate the system by choosing a new state variable $z = x - x_e$ and writing the dynamics as $\dot z = A z$, where $A = \left.\partial F/\partial x\right|_{x_e}$. The stability of the nonlinear system can be determined in a local neighborhood of the equilibrium point through its linearization.
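    A sketch of this check in Python: approximate the Jacobian $A = \left.\partial F/\partial x\right|_{x_e}$ by finite differences (the vector field is an assumed example) and look at the real parts of the eigenvalues of $A$:

      # Sketch: linearize dx/dt = F(x) about x_e and check the eigenvalues of A.
      import numpy as np

      def F(x):
          # Example vector field with an equilibrium at x_e = (0, 0).
          return np.array([x[1], -np.sin(x[0]) - 0.5 * x[1]])

      def jacobian(F, x_e, eps=1e-6):
          # Central finite-difference approximation of A = dF/dx at x_e.
          n = len(x_e)
          A = np.zeros((n, n))
          for j in range(n):
              dx = np.zeros(n)
              dx[j] = eps
              A[:, j] = (F(x_e + dx) - F(x_e - dx)) / (2 * eps)
          return A

      A = jacobian(F, np.array([0.0, 0.0]))
      eig = np.linalg.eigvals(A)
      print(eig)                       # eigenvalues of the linearization
      print(np.all(eig.real < 0))      # True => x_e is locally asymptotically stable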

  6. A Lyapunov function is an energy-like function $V: R^n \to R$ that can be used to reason about the stability of an equilibrium point. We define the derivative of $V$ along the trajectory of the system as

        $\dot V(x) = \frac{\partial V}{\partial x} \dot x = \frac{\partial V}{\partial x} F(x).$

    Assuming $x_e = 0$ and $V(0) = 0$, the following conditions hold:

    Condition on $V$            | Condition on $\dot V$              | Stability
    $V(x) > 0$, $x \neq 0$      | $\dot V(x) \leq 0$ for all $x$     | $x_e$ stable
    $V(x) > 0$, $x \neq 0$      | $\dot V(x) < 0$, $x \neq 0$        | $x_e$ asymptotically stable

    Stability of limit cycles can also be studied using Lyapunov functions.
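    For linear (or linearized) dynamics $\dot x = A x$, a standard way to construct a quadratic Lyapunov function $V(x) = x^T P x$ is to solve the Lyapunov equation $A^T P + P A = -Q$ for some $Q > 0$. A sketch using SciPy (the matrix $A$ is an assumed example):

      # Sketch: build V(x) = x^T P x for dx/dt = A x by solving A^T P + P A = -Q.
      import numpy as np
      from scipy.linalg import solve_continuous_lyapunov

      A = np.array([[0.0, 1.0],
                    [-1.0, -0.5]])       # example stable system matrix
      Q = np.eye(2)                      # any positive definite Q

      # solve_continuous_lyapunov(a, q) solves a X + X a^T = q; take a = A^T, q = -Q.
      P = solve_continuous_lyapunov(A.T, -Q)

      print(np.linalg.eigvals(P))        # all positive => V(x) > 0 for x != 0
      # Along trajectories, dV/dt = x^T (A^T P + P A) x = -x^T Q x < 0 for x != 0.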

  7. The Krasovskii-LaSalle Principle allows one to reason about asymptotic stability even if the time derivative of $V$ is only negative semi-definite ($\leq 0$ rather than $< 0$). Let $V: R^n \to R$ be a positive definite function, $V(x) > 0$ for all $x \neq 0$ and $V(0) = 0$, such that

        $\dot V(x) \leq 0$ on the compact set $\Omega_c = \{x \in R^n : V(x) \leq c\}$.

    Then as $t \to \infty$, the trajectory of the system will converge to the largest invariant set inside

        $S = \{x \in \Omega_c : \dot V(x) = 0\}$.

    In particular, if $S$ contains no invariant sets other than $x = 0$, then 0 is asymptotically stable.
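    As a brief worked illustration (not from the text above): for $\dot x_1 = x_2$, $\dot x_2 = -x_1 - x_2$ with $V(x) = \tfrac{1}{2}(x_1^2 + x_2^2)$, we get $\dot V = x_1 x_2 + x_2 (-x_1 - x_2) = -x_2^2 \leq 0$, which is only negative semi-definite. On the set where $\dot V = 0$ (that is, $x_2 = 0$), the dynamics give $\dot x_2 = -x_1$, so a trajectory can remain in this set only if $x_1 = 0$. The largest invariant set in $S$ is therefore the origin, and the Krasovskii-LaSalle principle shows that the origin is asymptotically stable.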

  8. The global behavior of a nonlinear system refers to dynamics of the system far away from equilibrium points. The region of attraction of an asymptotically stable equilibrium point refers to the set of all initial conditions that converge to that equilibrium point. An equilibrium point is said to be globally asymptotically stable if all initial conditions converge to that equilibrium point. Global stability can be checked by finding a Lyapunov function that is globally positive definite with time derivative globally negative definite.
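    One simple (brute-force) way to explore a region of attraction numerically is to simulate from many initial conditions and record which ones converge to the equilibrium point. A rough sketch in Python, using an assumed example system with several equilibria:

      # Sketch: sample initial conditions and test convergence to the origin.
      import numpy as np
      from scipy.integrate import solve_ivp

      def F(t, x):
          # Example system with multiple equilibria (pendulum with damping).
          return [x[1], -np.sin(x[0]) - 0.25 * x[1]]

      def converges_to_origin(x0, T=100.0, tol=0.05):
          sol = solve_ivp(F, (0.0, T), x0)
          return np.linalg.norm(sol.y[:, -1]) < tol

      grid = [[a, b] for a in np.linspace(-4, 4, 9) for b in np.linspace(-4, 4, 9)]
      attracted = [x0 for x0 in grid if converges_to_origin(x0)]
      print(len(attracted), "of", len(grid), "sampled initial conditions converge")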

Exercises

Frequently Asked Questions

Errata

Additional small typos:

  • In Example 4.4, second simplifying assumption should be l/J_t = 1 (remove extra factor of m)
  • On page 105, line -3: On right hand side of displayed equation, x should be x_j
  • In Example 4.10, \dot V = 0 should be treated as negative semidefinite, not positive

Additional Information