18.06 Spring2019 pset 10
due Friday 5/10 at 11:59pm
Comment about the Markov matrix definition: Strang's book glosses over this issue, but on page 474 he says that a Markov matrix has positive elements and columns that add to 1. Many authors, including the Wikipedia page on stochastic matrices, allow a Markov matrix to have 0's. Also, Strang focuses on left stochastic matrices (columns add to 1), while some authors use right stochastic matrices, where rows add to 1.
While this inconsistency is enough to drive anyone crazy, from now on I will say "positive Markov matrix" when I wish to imply that all entries are positive. A positive Markov matrix has one eigenvalue equal to 1, and the rest of its eigenvalues have absolute value less than 1. A general Markov matrix can have more than one eigenvalue with absolute value 1.
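If you would like to see the positive case numerically, here is a quick Julia illustration (the 4×4 size and the random matrix are just for demonstration, not part of the pset):

using LinearAlgebra
B = rand(4,4) .+ 0.1
M = B ./ sum(B, dims=1)           # a random positive Markov matrix: positive entries, columns sum to 1
sort(abs.(eigvals(M)), rev=true)  # the largest absolute value is 1; the rest are strictly less than 1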
(1) (GS prob 1, page 480) What is the steady state eigenvector of
$$A = \begin{pmatrix} 0.9 & 0.15 \\ 0.1 & 0.85 \end{pmatrix}$$
You should do this by hand; if you like, you can check your answer with Julia:
using LinearAlgebra
A = [.9 .15; .1 .85]
λ, X = eigen(A)                # eigenvalues and eigenvectors of A
v = X[:, argmax(real.(λ))]     # column belonging to the eigenvalue 1 (don't rely on the ordering of λ)
v / sum(v)                     # rescale so the entries sum to 1: the steady state
(2) Show that the square of a positive Markov matrix is a positive Markov matrix. Show also that the square of a general Markov matrix is Markov. (Hint: there is an easy way and a more difficult way.)
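A numerical sanity check, if you want one, might look like the sketch below (it is not a proof, and the random 3×3 matrix is just for illustration):

B = rand(3,3) .+ 0.1
M = B ./ sum(B, dims=1)        # a random positive Markov matrix (columns sum to 1)
M2 = M^2
sum(M2, dims=1), minimum(M2)   # column sums ≈ 1 and the smallest entry is > 0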
(3) What can you say about the steady state of a symmetric positive Markov matrix?
(4) Show that if $M$ is positive definite, so is $M^{-1}$.
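Again not a proof, but here is one way you might check the claim numerically on a random positive definite matrix:

using LinearAlgebra
B = randn(4,4)
M = Symmetric(B'B + I)          # B'B is positive semi-definite, so B'B + I is positive definite
eigvals(M), eigvals(inv(M))     # both lists of eigenvalues should be strictly positive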
(5a) We have mentioned that the image of the unit sphere under $A$ is an ellipsoid. This means that the set of all $y = Ax$ such that $\|x\| = 1$ is an ellipsoid. Show that if $A$ is square and invertible, the equation of this ellipsoid is $$y^T(AA^T)^{-1}y=1.$$
(5b) Turning this around, if $M$ is positive definite, the equation $y^T M y = 1$ describes an ellipsoid. Show that its semi-axes have lengths equal to the square roots of the reciprocals of the eigenvalues of $M$, and that their directions are the eigenvectors of $M$. (It might be useful to see page 355 of GS.)
(This shows up in many areas, including the axes of inertia in physics. History has classified this as an eigen-fact, but you all know that this is really about the SVD of $A$, as the ellipsoid is most directly related to multiplying the unit sphere by $\Sigma$.)
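If you want to see (5a), (5b), and this remark fit together numerically, here is a small check (the random 3×3 matrix is just for illustration): the singular values of $A$ should equal the semi-axis lengths $1/\sqrt{\lambda_i}$ computed from $M = (AA^T)^{-1}$.

using LinearAlgebra
A = randn(3,3)
M = inv(A*A')                                          # the matrix in the ellipsoid equation y' M y = 1
semiaxes = sort(1 ./ sqrt.(eigvals(Symmetric(M))), rev=true)
[svdvals(A) semiaxes]                                  # the two columns should agree up to roundoff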
(6) For which values of $c$ is the matrix below positive definite?
$$\begin{pmatrix} c & 1 & 1 \\ 1 & c & 1 \\ 1 & 1 & c \end{pmatrix}$$
If Interact happens to work for you, you can play with this in Julia if you like:
using Interact, LinearAlgebra
@manipulate for c = .5:.01:5
    round.(eigvals([c 1 1; 1 c 1; 1 1 c]), digits=3)   # eigenvalues as a function of the slider value c
end
The eigenvalues of this 3x3 matrix are readily found by first getting the eigenvalues of the 3x3 ones matrix and then doing what? This method requires almost no pencil and paper and, more importantly, helps you understand why the eigenvalues are what they are.
(7) In class and in the textbook, positive-definiteness was defined only for symmetric matrices. However, it can be extended to arbitrary square matrices as follows.
The matrix $A$ is positive-definite if $A + A^T$ is positive-definite (in the sense defined in class and in the book: $A + A^T$ has positive eigenvalues, or equivalently $x^T(A+A^T)x \gt 0$ for all $x \ne 0$, or equivalently $A^T + A = B^T B$ where $B$ has full column rank). Show that if $A + A^T$ is positive definite, then the eigenvalues of $A$ have positive real parts.
(Hint: consider an eigenvector $x$ with $Ax = \lambda x$, keeping in mind that $x$ and $\lambda$ may be complex. How can you use this in one of the positive-definite conditions for $A+A^T$ above?)
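As a numerical sanity check of the claim (definitely not a substitute for the proof), you might try something like the following, where adding a skew-symmetric part is just one convenient way to make a non-symmetric example:

using LinearAlgebra
B = randn(4,4)
S = randn(4,4); S = (S - S')/2          # a random skew-symmetric matrix
A = B'B + I + S                         # its symmetric part A + A' = 2(B'B + I) is positive definite
minimum(eigvals(Symmetric(A + A')))     # > 0, confirming A + A' is positive definite
minimum(real.(eigvals(A)))              # the claim: this should also be > 0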
(8) Suppose a symmetric positive definite matrix $A$ has a unique largest eigenvalue. While pure mathematicians would say that $A^{2019}$ has full rank, applied folks would say that it is very likely rank $1$ for all intents and purposes. Explain this with words or a Julia example (2019 may be too large on a computer).
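One possible way to set up such a Julia experiment (the rescaling is just to dodge overflow, and the random 5×5 matrix is only for illustration):

using LinearAlgebra
B = randn(5,5)
A = B'B                     # symmetric positive definite (almost surely), with a unique largest eigenvalue
A = A / opnorm(A)           # rescale so the largest eigenvalue is 1 and A^2019 does not overflow
Ak = A^2019
svdvals(Ak)                 # one singular value near 1, the rest essentially zero
rank(Ak)                    # numerically 1 with overwhelming probability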
(9) Consider the function $f(x, y) = x^3 - 12xy + 8y^3$. Show that the gradient vanishes at exactly two points, $(0,0)$ and $(2,1)$. One of these is a saddle point, and one a local minimum. Which is which, and why?
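If you want a numerical check of the two critical points before classifying them (this assumes the ForwardDiff package is available; the by-hand work is the actual exercise):

using ForwardDiff
f(v) = v[1]^3 - 12*v[1]*v[2] + 8*v[2]^3
ForwardDiff.gradient(f, [0.0, 0.0]), ForwardDiff.gradient(f, [2.0, 1.0])   # both should be ≈ [0, 0]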
(10) Use the full SVD to derive the so-called "polar factorization" of a matrix: every square matrix $A$ can be factored as $A = QS$, where $Q$ is orthogonal and $S$ is positive semi-definite. If $A$ is invertible, further show that $S$ is positive definite. (Hint: if $A = U\Sigma V^T$, the orthogonal matrix $Q$ will be $UV^T$. What do we need to complete the story?) (Hint: if you don't yet see the answer, the polar decomposition is on page 394 of Strang.)
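Here is a numerical sketch of the construction the hints point to, for a random 4×4 matrix (the exercise is to prove the stated properties of $Q$ and $S$; the last line only spot-checks them):

using LinearAlgebra
A = randn(4,4)
U, σ, V = svd(A)                    # A = U * Diagonal(σ) * V'
Q = U*V'                            # candidate orthogonal factor from the hint
S = V*Diagonal(σ)*V'                # candidate positive semi-definite factor
norm(A - Q*S), norm(Q'Q - I), eigvals(Symmetric(S))   # ≈ 0, ≈ 0, and all eigenvalues ≥ 0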