Due 11am on Friday, December 2.
Suppose we are solving the ODE $\frac{dx}{dt} = Ax$ with a given initial condition $x(0)$, where $A$ is an $m \times m$ matrix.
(a) Show that the solutions $x(\Delta t), x(2\Delta t), x(3\Delta t), \ldots$ at the sequence of discrete times $n\Delta t$ are given by $Bx(0), B^2x(0), B^3 x(0), \ldots$, i.e. that each solution is obtained from the preceding one by multiplying by some matrix $B$. (For example, we might use this to plot the solution at a sequence of points.) What is $B$?
(b) We know from class that $x(t)$ must be exponentially decaying if all of the eigenvalues of $A$ have negative real parts. Correspondingly, $B^n x(0)$ decays as $n \to \infty$ if all of the eigenvalues of $B$ have ____________. Why must this follow if the eigenvalues of $A$ have negative real parts?
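A quick numerical sanity check (a sketch, not part of the required solution): the code below assumes the matrix exponential $B = e^{A\Delta t}$ is the answer to (a) (verify that yourself), and checks that iterating $B$ reproduces the exact solution and that $B^n x(0)$ decays when the eigenvalues of $A$ have negative real parts.

```julia
using LinearAlgebra

# Sketch: assumes B = exp(Δt*A) (the matrix exponential) is the answer
# to part (a); check that claim yourself before trusting this!
A  = [-1.0  2.0;
       0.0 -3.0]          # eigenvalues -1 and -3: negative real parts
Δt = 0.1
x0 = [1.0, 1.0]

B = exp(Δt * A)           # candidate one-step propagator

# Bⁿ x(0) should equal the exact solution x(nΔt) = exp(nΔt*A) x(0):
@assert B^3 * x0 ≈ exp(3 * Δt * A) * x0

# (b): when Re λ(A) < 0, the eigenvalues of B should make Bⁿ x(0) decay:
@assert all(abs.(eigvals(B)) .< 1)
```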
Suppose that $A$ is a $3 \times 3$ real-symmetric matrix. (Recall from class that such a matrix has real eigenvalues and orthogonal eigenvectors.) Suppose its eigenvalues are $\lambda_1 = 1, \lambda_2 = -1, \lambda_3 = -2$, and corresponding eigenvectors are $x_1,x_2,x_3$. You are given that $x_1 = [1,0,1]$ (denoting a column vector, as in Julia).
(a) Give an approximate solution at $t=100$ to $\frac{dx}{dt}=Ax$ for $x(0) = [1,1,0]$. (Give a specific quantitative vector, even if the vector is very big or very small; an answer like "$\approx 0$" or "$\approx \infty$" is not acceptable.)
(b) If $x_2 = [0,1,0]$, give a possible $x_3$. (You should not use these $x_2, x_3$ to solve part (a).)
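The following Julia sketch (using a hypothetical random symmetric matrix, not this problem's $A$) illustrates the facts the problem relies on: a real-symmetric matrix has orthonormal eigenvectors, so the coefficients of any $x(0)$ in the eigenbasis are plain dot products.

```julia
using LinearAlgebra

# Illustration with a random 3×3 real-symmetric matrix (not this problem's A):
S = randn(3, 3)
A = Symmetric((S + S') / 2)
λ, X = eigen(A)              # real eigenvalues, orthonormal eigenvector columns

@assert X' * X ≈ Matrix(1.0I, 3, 3)   # eigenvectors are orthonormal
@assert A * X[:, 2] ≈ λ[2] * X[:, 2]  # each column is an eigenvector

# Expanding x(0) in the eigenbasis needs only dot products, no matrix inverse:
x0 = [1.0, 1.0, 0.0]
c  = X' * x0
@assert X * c ≈ x0
```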
Let $X = \begin{pmatrix} x_1 & x_2 & \cdots & x_m \end{pmatrix}$ denote the matrix whose columns are eigenvectors of the diagonalizable $m \times m$ complex matrix $A$, with corresponding eigenvalues $\lambda_1, \ldots, \lambda_m$.
(a) The eigenvalues of $A^H$ must be ________? Check your answer for a random complex matrix in Julia, computed with A = rand(ComplexF64, 5,5); note that $A^H$ in Julia is A'.
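The suggested check, written out as a runnable sketch (it prints both spectra; the relationship to spot is left for you):

```julia
using LinearAlgebra

A = rand(ComplexF64, 5, 5)
λ = eigvals(A)      # eigenvalues of A
μ = eigvals(A')     # eigenvalues of A^H (A' is the conjugate transpose in Julia)

# Sort both by real part and compare them entry by entry:
display(sort(λ, by = real))
display(sort(μ, by = real))
```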
(b) If $A$ is real (so that $A^H = A^T$), why is your answer in (a) consistent with the statement in class that $A$ and $A^T$ have identical eigenvalues?
(c) From $A = X \Lambda X^{-1}$, derive a relationship between eigenvectors $y_1, \ldots, y_m$ of $A^H$ (the "left eigenvectors" of $A$) and the rows or columns of $X^{-1}$.
(d) Using the eigenvectors $y_k$ from (c), what must be true of $y_1^H x_2$ (and similarly for other dot products)?
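If you want to sanity-check your derivations for (c) and (d) numerically, here is one possible experiment; the specific formula for Y below is a candidate to test against your answer, not a given.

```julia
using LinearAlgebra

A = rand(ComplexF64, 4, 4)
λ, X = eigen(A)          # A = X Λ X⁻¹

# Candidate to test against your (c): columns of Y built from X⁻¹.
Y = inv(X)'              # i.e. Y^H = X⁻¹

# Is each column of Y an eigenvector of A^H?
@assert A' * Y[:, 2] ≈ conj(λ[2]) * Y[:, 2]

# (d): dot products y_j^H x_k for j ≠ k and for j = k:
@assert abs(Y[:, 1]' * X[:, 2]) < 1e-8
@assert Y[:, 1]' * X[:, 1] ≈ 1
```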
Suppose that $A$ is an $m \times m$ real-symmetric matrix ($A = A^T$). Consider the function:
$$ f(x) = \frac{x^T A x}{x^T x}, $$ which takes as input a real nonzero vector $x \in \mathbb{R}^m$ and returns a real number.
(a) Compute the gradient $\nabla f$ (with respect to $x$).
(b) Show that $\nabla f = 0$ if and only if $x$ is some eigenvector of $A$, in which case $f(x)$ is equal to the eigenvalue!
(c) If the (real) eigenvalues of $A$ are $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_m$, then what are the minimum and maximum possible values of $f(x)$?
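A numerical sketch of how this quotient behaves, using a hypothetical random symmetric $A$ (the parts above still need derivations; this only spot-checks them):

```julia
using LinearAlgebra

S = randn(5, 5)
A = Symmetric((S + S') / 2)
f(x) = (x' * A * x) / (x' * x)   # the quotient from this problem

λ = eigvals(A)                   # ascending order for a Symmetric matrix

# (c): f(x) of any nonzero x lands between the extreme eigenvalues:
for _ in 1:100
    x = randn(5)
    @assert λ[1] - 1e-12 ≤ f(x) ≤ λ[end] + 1e-12
end

# (b): at an eigenvector, f returns the corresponding eigenvalue:
v = eigen(A).vectors[:, 1]
@assert f(v) ≈ λ[1]
```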