Week 1 - Introduction to ODE

Differential Equation Def:

  • A differential equation is any relation involving the derivatives of an unknown function, the function itself, and known quantities.

Application

Applications of ODEs include:

  • Chemistry:
    • models for chemical reactions
  • Physics:
    • arise naturally in Newton’s second law, planetary motion, and much more

Applications of PDE include:

  • Physics:
    • electricity and magnetism, heat dissipation

For example:

  • Question:

    (image: image-20200909000838216)

  • Solution:

    \[p(t) = Ae^{rt+b}\]
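
The question image is not included in these notes; assuming it asks for the growth model \(p'(t) = r\,p(t)\), here is a minimal numerical check of the closed form (the values of \(r\) and \(p(0)\) are made up for illustration):

```python
# Sketch assuming the question is the growth model p'(t) = r * p(t); r and p0 are made up.
import numpy as np
from scipy.integrate import solve_ivp

r, p0 = 0.5, 2.0
t = np.linspace(0, 5, 50)

sol = solve_ivp(lambda t, p: r * p, (0, 5), [p0], t_eval=t, rtol=1e-9, atol=1e-12)
closed_form = p0 * np.exp(r * t)               # p(t) = A e^{rt} with A = p(0)

print(np.max(np.abs(sol.y[0] - closed_form)))  # tiny: the closed form matches the solver
```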

Another example:

  • Question:

    (image: image-20200909001129716)

  • Solution:


Goal of Studying ODE

The main goal is, of course, finding all solutions of a given ODE. Ideally, we would want

  • a closed-form expression
    • a solution containing standard functions such as \(\cos\)
  • possible cases include:
    • linear equations
    • separable equations

However, if a closed-form solution cannot be found, we can still

  • analyze the existence and uniqueness of solutions
  • study its qualitative behavior, such as steady states or run-away growth

Linear ordinary differential equations and the integrating factor method

First consider the example of:

\[x^\prime (t) = ax(t)\]

Now consider the more general

\[x^\prime (t) = a(t)x(t) + f(t)\]

whose solution (entry 1 below) is written in terms of a primitive \(\phi(t) = \int a(t)\, dt\) of \(a(t)\), where:

  • primitive means antiderivative
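
A numerical sanity check of that formula (a sketch; the choices \(a(t)=\cos t\), \(f(t)=1\), \(x(0)=1\) are made up for illustration):

```python
# Check x(t) = e^{phi(t)} ( int_0^t e^{-phi(s)} f(s) ds + x(0) ) against a generic ODE solver,
# for the illustrative choices a(t) = cos(t), f(t) = 1, x(0) = 1 (so phi(t) = sin(t), phi(0) = 0).
import numpy as np
from scipy.integrate import solve_ivp, quad

a = np.cos
f = lambda t: 1.0
phi = np.sin                 # a primitive (antiderivative) of a(t)
x0 = 1.0

def x_formula(t):
    integral, _ = quad(lambda s: np.exp(-phi(s)) * f(s), 0, t)
    return np.exp(phi(t)) * (integral + x0)   # constant c = x(0) since phi(0) = 0

ts = np.linspace(0, 4, 9)
sol = solve_ivp(lambda t, x: a(t) * x + f(t), (0, 4), [x0], t_eval=ts,
                rtol=1e-9, atol=1e-12)

print(np.max(np.abs(sol.y[0] - [x_formula(t) for t in ts])))   # tiny
```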

ODE Equations

Each numbered entry below lists its Condition/Definition, Key Steps, and the resulting Theorem/Solution.

1. Variable coefficient and inhomogeneous
  • Condition/Definition: \(x'(t)=a(t)x(t)+f(t)\), with \(\phi(t)=\int a(t)\,dt\)
  • Key Steps: multiplying by \(e^{-\phi(t)}\) leaves an RHS that no longer contains \(x(t)\): \(\frac{d}{dt}(e^{-\phi(t)}x(t))=e^{-\phi(t)}(x'(t)-\phi'(t)x(t))=e^{-\phi(t)}f(t)\)
  • Theorem/Solution: \(x(t)=e^{\phi(t)}\left(\int e^{-\phi(t)}f(t)\, dt + c\right)\)

2. Separable
  • Condition/Definition: \(x'(t)=f(x(t))\,g(t)\), with \(\phi'(x) = \frac{1}{f(x)}\)
  • Key Steps: after dividing by \(f(x)\), the RHS no longer contains \(x(t)\): \(\frac{d}{dt}\phi (x) = \phi'(x)x'(t)=\frac{x'(t)}{f(x)}=g(t)\)
  • Theorem/Solution: \(x(t)=\phi^{-1}\left(\int g(t)\,dt +c\right)\)

3. Linear System
  • Condition/Definition: \(\vec{x}'(t) = A \vec{x}(t)\)
  • Key Steps: \(e^{At} = \sum\limits_{k=0}^{\infty}\frac{t^k}{k!}A^k\)
  • Theorem/Solution: \(\vec{x}(t)=e^{At}\vec{x}_0\)

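A minimal numerical sketch of entry 3 (the matrix \(A\) and vector \(\vec{x}_0\) below are arbitrary illustrative choices, not from the notes):

```python
# Solve x'(t) = A x(t) with the matrix exponential and compare against a generic ODE solver.
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])
t = 1.5

x_expm = expm(t * A) @ x0                             # x(t) = e^{tA} x_0
x_num = solve_ivp(lambda t, x: A @ x, (0, t), x0,
                  rtol=1e-10, atol=1e-12).y[:, -1]

print(np.allclose(x_expm, x_num, atol=1e-6))          # True
```
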
4. Euclidean and Operator Norm of a Matrix
  • Condition/Definition: Euclidean norm \(\Vert A\Vert = \sqrt{\sum\limits_{i,j=1}^{n}a_{ij}^2}\); operator norm \(\Vert A\Vert_{op} = \max\{\Vert A\hat{x}\Vert : \Vert \hat{x}\Vert =1\}\)
  • Theorem/Solution: \(\Vert A\Vert_{op} \le \Vert A\Vert \le \sqrt{n}\,\Vert A\Vert_{op}\)

5. Existence of \(e^A\)
  • Condition/Definition: the partial sums \(S_m\) form a Cauchy sequence, so that \(\Vert S_m - S_l\Vert \le \epsilon\) for large \(l\) and \(m \ge l\)
  • Key Steps: \(\left\Vert\sum\limits_{k=l+1}^{\infty}\frac{1}{k!}A^k\right\Vert \le \sum\limits_{k=l+1}^{\infty}\frac{1}{k!}\left\Vert A^k\right\Vert_{op} \le \sum\limits_{k=l+1}^{\infty}\frac{1}{k!}\left\Vert A\right\Vert_{op}^k\), which is small for large \(l\)
  • Theorem/Solution: the limit defining \(e^A\) exists

6. Property of \(e^{A+B}\)
  • Condition/Definition: \(A\) commutes with \(B\)
  • Theorem/Solution: \(e^{A+B}=e^A e^B\)

7. Alternative formula for \(e^A\)
  • Theorem/Solution: \(e^A=\lim\limits_{m\to \infty} \left(I+\frac{1}{m}A\right)^m\)

8. Derivative of \(e^{tA}\)
  • Key Steps: \(e^{hA}-I=hA+\sum\limits_{k=0}^{\infty}\frac{h^{k+2}}{(k+2)!}A^{k+2}\)
  • Theorem/Solution: \(\frac{d}{dt}e^{tA}=Ae^{tA}\)

9. Inhomogeneous Linear System
  • Condition/Definition: \(\vec{x}'(t)=A\vec{x}(t)+\vec{f}(t)\)
  • Theorem/Solution: \(\vec{x}(t)=e^{tA}\left(\int\limits_0^t e^{-sA}\vec{f}(s)\, ds + \vec{x}_0\right)\)

10. Diagonal Matrix Exponential
  • Condition/Definition: \(e^{tD}\), where \(D\) is diagonal with entries \(\lambda_1,\dots,\lambda_n\)
  • Theorem/Solution: \(e^{tD}= \begin{bmatrix}e^{t\lambda_1} & & & \\ & e^{t\lambda_2} & & \\ & & \ddots & \\ & & & e^{t\lambda_n}\end{bmatrix}\)

11. Diagonalizable Matrix Exponential
  • Condition/Definition: \(e^{tA}\), where \(A=SDS^{-1}\)
  • Theorem/Solution: \(e^{tA}=Se^{tD}S^{-1}\)

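A quick check of entries 10-11 in code (the diagonalizable matrix \(A\) is an arbitrary illustrative choice):

```python
# Verify e^{tA} = S e^{tD} S^{-1} numerically for a diagonalizable A.
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 2.0],
              [0.0, -3.0]])            # distinct eigenvalues => diagonalizable
t = 0.7

lam, S = np.linalg.eig(A)              # columns of S are eigenvectors, A = S diag(lam) S^{-1}
e_tD = np.diag(np.exp(t * lam))        # e^{tD} is diagonal with entries e^{t*lambda_i}

print(np.allclose(S @ e_tD @ np.linalg.inv(S), expm(t * A)))   # True
```
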
12. Nilpotent Matrix
  • Condition/Definition: \(N^m=0\), i.e. \(N\) is nilpotent
  • Key Steps: for an eigenpair \((\lambda,\vec{x})\), \(N^m\vec{x}=\lambda^m \vec{x} = 0\)
  • Theorem/Solution: all eigenvalues of \(N\) are 0

13. LN Decomposition
  • Condition/Definition: \(A=L+N\) for any \(A\), where \(L\) and \(N\) commute
  • Key Steps: \(L\vec{x}=\lambda_i \vec{x}\) for \(\vec{x}\in ker(A-\lambda_iI)^{\nu_i}\), and \(N^{max(\nu_i)}\vec{x}=(A-\lambda_i I)^{max(\nu_i)}\vec{x}=0\) for \(\vec{x}\in ker(A-\lambda_iI)^{\nu_i}\)

14. Asymptotic Behavior of a Linear System
  • Condition/Definition: \(\vec{x}'=A\vec{x}\), all \(Re(\lambda)<0\), where the \(\lambda\) are the generalized eigenvalues and \(\vec{x}\) lies in a generalized eigenspace
  • Key Steps: \(e^{tA}\vec{x}=e^{t(\lambda_i I)}e^{t(A-\lambda_i I)}\vec{x}=e^{t(\lambda_i I)}\sum\limits_{k=0}^{\nu_i-1}\frac{t^k}{k!}(A-\lambda_i I)^k \vec{x}\); the factor \(e^{t(\lambda_i I)}\to 0\) much faster than the polynomial factor grows, since all \(Re(\lambda)<0\)
  • Theorem/Solution: \(\lim\limits_{t \to\infty} e^{tA}\vec{x}_0 = 0\)

15. Asymptotic Behavior of an Inhomogeneous System (periodic forcing)
  • Condition/Definition: \(\vec{x}'=A\vec{x}+f(t)\), all \(Re(\lambda)<0\) (generalized eigenvalues), and \(f(t)\) has period \(\tau\)
  • Key Steps: impose \(\bar{x}(\tau)=\bar{x}(0)\) on the formula of entry 9 to get \((e^{-\tau A}-I)\bar{x}_0 = \int\limits_0^\tau e^{-sA}f(s)\,ds\); for any other solution, \(\vec{x}'(t)-\bar{x}'(t)=A(\vec{x}(t)-\bar{x}(t))\), hence \(\vec{x}(t)-\bar{x}(t) = e^{tA}(\vec{x}(0)-\bar{x}(0))\), and \(e^{tA}\to 0\) since all \(Re(\lambda)<0\)
  • Theorem/Solution: there is a unique solution \(\bar{x}(t)\) that is also \(\tau\)-periodic; other solutions \(\vec{x}(t)\) can exist, but they tend to \(\bar{x}(t)\) as \(t \to \infty\)

16. Asymptotic Behavior and Stable/Unstable Subspaces
  • Condition/Definition: applies to both the homogeneous and the inhomogeneous case, since solutions can be written in the form \(\vec{x}=e^{tA}\vec{x}_0\); assumes no purely imaginary eigenvalues; \(ker(p_{-}(A))\) is the generalized eigenspace for generalized eigenvalues with negative real part, and \(ker(p_{+}(A))\) the one for positive real part
  • Key Steps: same as entry 14, but take case 1 to be \(\vec{x}_0\in ker(p_{-}(A))\) and case 2 the opposite
  • Theorem/Solution: \(e^{tA}\vec{x}_0 \to 0\) if \(\vec{x}_0 \in ker(p_{-}(A))\) and \(t\to \infty\); \(e^{tA}\vec{x}_0 \to 0\) if \(\vec{x}_0 \in ker(p_{+}(A))\) and \(t \to -\infty\)

17. Asymptotic Behavior with Purely Imaginary Eigenvalues
  • Condition/Definition: accounts for the \(Re(\lambda)=0\) case
  • Key Steps: split the total space into three subspaces, one each for \(Re(\lambda)<0\), \(Re(\lambda)=0\), and \(Re(\lambda)>0\), then run the same analysis as entries 14 and 16
  • Theorem/Solution: \(e^{tA}\vec{x}\to 0\) if \(\vec{x}\in ker(p_{-}(A))=\bigoplus_i ker(A-\lambda_i I)^{\nu_i}\) over the \(\lambda_i\) with \(Re(\lambda_i)<0\); \(e^{tA}\vec{x}\) is bounded if \(\vec{x}\in ker(p_{-}(A))\bigoplus ker(A-\lambda_jI)\) with \(Re(\lambda_j)=0\)

18. Cauchy-Lipschitz Theorem
  • Condition/Definition: \(\vec{x}'(t)=\vec{F}(\vec{x})=\begin{bmatrix}f_1(\vec{x})\\ f_2(\vec{x})\\ \vdots \\ f_n(\vec{x})\end{bmatrix}\)
  • Key Steps: bound \(\Vert \vec{F}(\vec{x})\Vert \le M\) and \(\Vert D\vec{F}(\vec{x})\Vert_{op}\le L\), where \(D\vec{F}\) is the Jacobian of \(\vec{F}\)
  • Theorem/Solution: on a small time interval \(\delta\) away from the initial condition at \(t_0\), there exists a solution; that solution is the limit of the Picard iterates \(\vec{x}_k(t)=\vec{x}_0+\int\limits_0^t \vec{F}(\vec{x}_{k-1}(s))\,ds\)

19. Convergence of Picard Iterates
  • Condition/Definition: \(\Vert \vec{x}_{k+1}-\vec{x}_k\Vert \le 2^{-k}r\); define \(\vec{x}=\lim\limits_{k\to\infty}\vec{x}_k\)
  • Key Steps: compute \(\Vert \vec{x}_m-\vec{x}_l\Vert \le \sum\limits_{k=l+1}^m \Vert \vec{x}_{k}-\vec{x}_{k-1}\Vert \le \sum\limits_{k=l+1}^m 2^{-k+1}r \le 2^{-l+1}r\); sending \(m\to \infty\), the difference stays bounded by \(2^{-l+1}r\); therefore \(\vec{F}(\vec{x})=\lim\limits_{k\to\infty}\vec{F}(\vec{x}_k)\), hence \(\lim\limits_{k\to\infty}\vec{x}_k=\lim\limits_{k\to\infty}\left(\vec{x}_0+\int\limits_0^t \vec{F}(\vec{x}_{k-1}(s))\,ds\right)=\vec{x}_0+\int\limits_0^t \vec{F}(\vec{x}(s))\,ds=\vec{x}\)
  • Theorem/Solution: the solution is obtained as the limit of the Picard iterates, \(\vec{x}=\lim\limits_{k\to\infty}\vec{x}_k\)

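A sketch of the Picard iteration itself, on the illustrative scalar problem \(x'=x\), \(x(0)=1\) (not from the notes), whose exact solution is \(e^t\):

```python
# Picard iteration x_{k+1}(t) = x_0 + int_0^t F(x_k(s)) ds on a grid, for x' = x, x(0) = 1.
import numpy as np

ts = np.linspace(0.0, 1.0, 2001)       # fine grid on [0, 1]
x0 = 1.0
F = lambda x: x

x_k = np.full_like(ts, x0)             # x_0(t) = x_0, the constant first iterate
for k in range(10):
    integrand = F(x_k)
    increments = (integrand[1:] + integrand[:-1]) / 2 * np.diff(ts)   # trapezoid rule
    integral = np.concatenate(([0.0], np.cumsum(increments)))         # int_0^t F(x_k(s)) ds
    x_k = x0 + integral

print(np.max(np.abs(x_k - np.exp(ts))))    # small: the iterates converge to e^t
```
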
20. Uniqueness of Solution
  • Condition/Definition: \(\vec{x}'(t)=\vec{F}(\vec{x})\), with \(\vec{F}\) Lipschitz continuous, i.e. limited in how fast it can change: \(\Vert \vec{F}(\vec{x})-\vec{F}(\vec{y})\Vert \le L\Vert \vec{x}-\vec{y}\Vert\)
  • Key Steps: suppose there is another solution \(\vec{y}\); compute \(\frac{d}{dt}\Vert \vec{x}-\vec{y}\Vert^2 \le 2L\Vert \vec{x}-\vec{y}\Vert^2\), hence \(\frac{d}{dt}\left(e^{-2Lt}\Vert \vec{x}-\vec{y}\Vert^2\right)\le 0\); therefore, if \(\vec{x}(0)-\vec{y}(0)=0\), then \(\vec{x}-\vec{y}=0\) for all \(t\)
  • Theorem/Solution: the solution \(\vec{x}\) is unique if \(\vec{F}\) is Lipschitz continuous and \(\vec{x}\) is continuously differentiable

21. Unique Maximal Domain of the Solution
  • Condition/Definition: \(\begin{cases}\vec{x}'(t)=\vec{F}(\vec{x}) \\ \vec{x}(0)=\vec{x}_0\end{cases}\)
  • Key Steps: this complements entry 20, saying the solution uniquely solves the ODE up to some time \(\beta\); proof by contradiction: if \(\beta < \infty\) yet the solution stays bounded in a set \(U \subset \mathbb{R}^n\), then by entry 18 it could be extended by a further time \(\delta\), a contradiction
  • Theorem/Solution: if \(\beta < \infty\), then it must be that \(\Vert \vec{x}\Vert \to \infty\) as \(t\to \beta\)

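A standard illustration of entry 21 (not from the notes) is \(x'=x^2\), \(x(0)=1\), whose exact solution \(x(t)=\frac{1}{1-t}\) blows up at \(\beta=1\); a numerical solver stalls there:

```python
# Finite-time blow-up: x' = x^2, x(0) = 1 has solution x(t) = 1/(1 - t), so beta = 1.
import numpy as np
from scipy.integrate import solve_ivp

sol = solve_ivp(lambda t, x: x**2, (0, 2.0), [1.0], rtol=1e-10, atol=1e-12)

print(sol.t[-1], sol.y[0, -1])   # the integration stops just below t = 1, with |x| enormous
print(sol.status)                # the solver gives up before reaching t = 2
```
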
22. Definition of Flow
  • Condition/Definition: for each point \((x_0,t_0)\in \Omega\)
  • Theorem/Solution: \(\Phi(x_0,t_0)=\varphi_{t_0}(x_0)=x(t_0)\), meaning the solution of the ODE starting from the point \(x_0\) and defined up to time \(t_0\)

23. Bounded Neighborhood Solution (Part 1)
  • Condition/Definition: another initial condition \(y_0\) with \(\Vert x_0 - y_0\Vert \le e^{-Lt_1}r\), where \(t_1\) is the closed upper bound of the maximal time interval on which the solution is defined; \(\Phi(x_0, t)\) and \(\Phi(y_0,t)\) are two solutions up to the same time, starting from different initial conditions
  • Theorem/Solution: \(\Vert \Phi(x_0, t)-\Phi(y_0, t)\Vert \le 2e^{Lt}\Vert x_0 - y_0\Vert\) for \(0\le t\le t_1\); this also hints at possible chaotic behavior, since \(e^{Lt}\) explodes as \(t\to \infty\)

24. Interval for the Neighborhood Solution
  • Condition/Definition: same hypothesis as entry 23: \(\Vert x_0 - y_0\Vert \le e^{-Lt_1}r\)
  • Theorem/Solution: \(\Phi(y_0,t)\) is defined at least on the same interval as \(\Phi(x_0,t)\), i.e. for \(0 \le t \le t_1\)

25. Bounded Neighborhood Solution (Part 2)
  • Condition/Definition: same hypothesis as entry 23, but now we only have an initial point \((x_0,t_0)\) and we bound another solution up to a time \(t\in[0,t_1]\)
  • Key Steps: \(\Vert \Phi(x_0, t)-\Phi(x_0, t_0)\Vert \le M\vert t-t_0\vert\)
  • Theorem/Solution: \(\Vert \Phi(x_0, t_0)-\Phi(y_0, t)\Vert \le 2e^{Lt}\Vert x_0 - y_0\Vert +M\vert t_0-t\vert\)

26. Differentiability of the Flow and the Linearized Solution
  • Condition/Definition: an ODE \(\vec{x}'(t)=\vec{F}(\vec{x})\) with flow \(\vec{x}(t)=\Phi(x_0,t)\); let \(A(t)\equiv D\vec{F}(\vec{x}(t))\)
  • Key Steps: the goal is to show \(\lim\limits_{y \to x_0} \frac{\vert \varphi_{t_0}(x_0)-\varphi_{t_0}(y)-M(t_0)(x_0-y)\vert }{\vert x_0-y\vert } = 0\), where \(M(t_0)\) is the derivative with respect to the initial condition \(x_0\) (the only other variable is time, whose derivative is trivially given by the ODE itself)
  • Theorem/Solution: \(\vec{x}(t)=\Phi(x_0,t)\) is differentiable at \(\vec{x}_0\), with \(D\varphi_{t_0}(x_0)=M(t_0)\), where \(M\) solves the ODE \(M'(t)=A(t)M(t)\), \(M(0)=I\)

27. Existence of \(M\)
  • Condition/Definition: let \(A(t)\equiv D\vec{F}(\vec{x}(t)):[0,T]\to \mathbb{R}^{n\times n}\) be continuous, with \(\Vert A(t)\Vert_{op}\le L\) for all \(t\in[0,T]\)
  • Key Steps: Picard iterates again: set \(M_0(t)=I\), \(M_{k+1}(t) = I + \int\limits_0^t A(s) M_k(s) \, ds\), and construct \(M(t) := \lim\limits_{k \to \infty} M_k(t)\); by induction \(\Vert M_k(t)-M_{k-1}(t)\Vert_{op} \leq \frac{(Lt)^k}{k!}\), hence convergence: \(\Vert M(t)-M_l(t)\Vert_{op} \leq \sum\limits_{k=l+1}^\infty \Vert M_k(t)-M_{k-1}(t)\Vert_{op} \leq \sum\limits_{k=l+1}^\infty \frac{(Lt)^k}{k!}\)
  • Theorem/Solution: there exists an \(M\) that solves the ODE \(M'(t)=A(t)M(t)\) with \(M(0)=I\)

28. Determinant of a Matrix Exponential
  • Condition/Definition: true in general
  • Key Steps: make \(A\) upper-triangular via \(A=SBS^{-1}\), so \(e^{tA}=Se^{tB}S^{-1}\) and \(e^{tB}\) is upper-triangular with diagonal entries \(e^{tb_{ii}}\); then \(\det(e^{tA})=\det(e^{tB})=\prod_i e^{tb_{ii}}=e^{t\cdot tr(B)}=e^{t\cdot tr(A)}\)
  • Theorem/Solution: \(\det(e^{tA}) = e^{t\cdot tr(A)}\)

29. Derivative of \(\det(M(t))\)
  • Theorem/Solution: \(\frac{d}{dt}\det(M(t))\big\vert_{t_0} = tr(A(t_0))\det(M(t_0))\)

30. Formula for \(\det(M(t))\)
  • Condition/Definition: \(M\) is differentiable
  • Key Steps: follows from entry 29
  • Theorem/Solution: \(\det(M(t))=\det(M(0))\cdot e^{\int_0^t tr(A(s))\,ds}\)

31. Liouville Theorem
  • Condition/Definition: \(F\) is a divergence-free vector field (for the nonlinear differential system)
  • Key Steps: \(\det D\varphi_{t_0}(x_0)=\det M(t_0)=\det M(0)=1\)
  • Theorem/Solution: the flow \(\varphi_{t_0}\) preserves volume (a set of initial conditions and its image under the flow occupy the same volume in space)

32. Linearization at an Equilibrium Point
  • Condition/Definition: an equilibrium point \(\bar{x}\)
  • Key Steps: at that equilibrium point, \(M'(t)=AM(t)\) with \(A=D\vec{F}(\bar{x})\) independent of \(t\)
  • Theorem/Solution: for the solution \(\bar{x}\), \(M(t)=e^{At}\)

33. Stability of an Equilibrium Point
  • Condition/Definition: \(DF(\bar{x})\) has only eigenvalues with negative real part
  • Theorem/Solution: \(\bar{x}\) is asymptotically stable

34. Stability of an Equilibrium Point
  • Condition/Definition: \(DF(\bar{x})\) has at least one eigenvalue with positive real part
  • Theorem/Solution: \(\bar{x}\) is not stable

35. Stability of an Equilibrium Point
  • Condition/Definition: \(DF(\bar{x})\) has at least one purely imaginary eigenvalue
  • Theorem/Solution: inconclusive

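A sketch of the eigenvalue test in entries 33-35, applied to an assumed example (a damped pendulum, linearized at the origin; not from the notes):

```python
# Eigenvalue test on x'' = -sin(x) - 0.5 x', written as (x, v)' = F(x, v); equilibrium (0, 0).
import numpy as np

def F(z):
    x, v = z
    return np.array([v, -np.sin(x) - 0.5 * v])

def jacobian(z, h=1e-6):
    # numerical Jacobian DF(z) by central differences
    n = len(z)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (F(z + e) - F(z - e)) / (2 * h)
    return J

eigvals = np.linalg.eigvals(jacobian(np.array([0.0, 0.0])))
print(eigvals.real)                    # all negative => asymptotically stable (entry 33)
```
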
36. Lyapunov’s Criterion for Stability
  • Condition/Definition: we can find a function \(L\) such that \(\bar{x}\) is a strict local minimum of \(L\), and \(\nabla L(x) \cdot F(x) \le 0\) for all \(x\) near \(\bar{x}\)
  • Key Steps: show that \(L(\varphi_t(x_0))\) is monotonically non-increasing by showing its derivative is \(\le 0\)
  • Theorem/Solution: \(\bar{x}\) is stable

37. Lyapunov’s Criterion for Asymptotic Stability
  • Condition/Definition: we can find a function \(L\) such that \(\bar{x}\) is a strict local minimum of \(L\), and \(\nabla L(x) \cdot F(x) < 0\) for all \(x\) near \(\bar{x}\) with \(x\ne\bar{x}\)
  • Theorem/Solution: \(\bar{x}\) is asymptotically stable

38. Stability of a Gradient System
  • Condition/Definition: \(F=-\nabla V\), and \(\bar{x}\) is a local minimum of \(V\)
  • Key Steps: show that \(V\) is a Lyapunov function, since \(\langle\nabla V(x), F(x)\rangle = -\Vert \nabla V(x)\Vert^2 \le 0\)
  • Theorem/Solution: \(\bar{x}\) is a stable equilibrium point; if moreover \(\nabla V(x)\cdot F(x) < 0\) for \(x\ne\bar{x}\), then \(\bar{x}\) is an asymptotically stable equilibrium point

39. Stability of a Hamiltonian System
  • Condition/Definition: \(F=J \nabla H\), and \(\bar{x}\) is a local minimum of \(H\), where \(J\) is the block-diagonal matrix with \(2\times 2\) blocks \(\begin{bmatrix}0 & 1\\ -1 & 0\end{bmatrix}\)
  • Key Steps: same as entry 38, but now \(\langle\nabla H(x), F(x)\rangle = 0\)
  • Theorem/Solution: \(\bar{x}\) is only a stable equilibrium point (not asymptotically stable)

40. Conservation Property of a Hamiltonian System
  • Key Steps: show that \(\frac{d}{dt}H(x(t))=0\); if stuck, first show that \(\vec{w} \cdot J\vec{w}=0\) for any \(\vec{w}\)
  • Theorem/Solution: the quantity \(H\) is preserved along solutions

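A sketch of entry 40 for the illustrative Hamiltonian \(H(q,p)=\frac{1}{2}(q^2+p^2)\) (the harmonic oscillator \(q'=p\), \(p'=-q\); an assumed example, not from the notes):

```python
# Check numerically that H stays constant along a solution of the Hamiltonian system.
import numpy as np
from scipy.integrate import solve_ivp

H = lambda q, p: 0.5 * (q**2 + p**2)
sol = solve_ivp(lambda t, z: [z[1], -z[0]], (0, 20.0), [1.0, 0.0],
                t_eval=np.linspace(0, 20, 200), rtol=1e-10, atol=1e-12)

H_along_solution = H(sol.y[0], sol.y[1])
print(np.max(np.abs(H_along_solution - H(1.0, 0.0))))   # ~0: H is preserved
```
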

Tricks

Tricks for Computing Matrix Exponentials with Inhomogeneous Term

Basically, when you are dealing with terms such as \(e^{tA}\vec{f}\) and \(A\) is diagonalizable:

  • you should decompose the vector \(\vec{f}\) into the eigenvectors of \(A\) (as sketched below)
  • this can always be used, since for a non-diagonalizable \(A\) you can write \(A=L+N\) (entry 13), where \(L\) is diagonalizable and \(N\) is nilpotent
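
A sketch of this trick in code (the matrix \(A\) and vector \(\vec{f}\) below are arbitrary illustrative choices, not the HW4 data):

```python
# For diagonalizable A, expand f = sum_i c_i v_i in eigenvectors of A,
# so that e^{tA} f = sum_i c_i e^{t lambda_i} v_i.
import numpy as np
from scipy.linalg import expm

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
f = np.array([1.0, -3.0])
t = 0.4

lam, V = np.linalg.eig(A)           # columns of V are eigenvectors v_i
c = np.linalg.solve(V, f)           # coefficients of f in the eigenbasis
result = V @ (c * np.exp(t * lam))  # sum_i c_i e^{t lambda_i} v_i

print(np.allclose(result, expm(t * A) @ f))   # True
```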

For example, in HW4:

  • Question:

(image: image-20201019172311501)

  • Trick:

(image: image-20201019171831983)