This problem set is based entirely on the lectures and your own ingenuity. No foreshadowing. You do not need to use Juliabox at all, though using Julia may enhance the experience; this entire set can be done with pencil and paper if you prefer. There should be no mention of AAᵀ, AᵀA, or eigenvalues. You should not use Google or Wikipedia, just what you learned in lecture.
Consider the matrix A defined below.
A = [1 4 2;2 8 4;-1 -4 -2]
3×3 Array{Int64,2}:
  1   4   2
  2   8   4
 -1  -4  -2
You should be able to describe the column space of A without any fancy svd's, just by common sense.
1a. Please describe the column space.
This matrix can be expressed as an outer product, A=xyᵀ.
1b. Find an x and y such that A=xyᵀ.
Without a computer you should be able to write down a compact (rank r format) svd of A.
1c. What are U, Σ, and V? Write an exact answer, not a decimal.
If you wish to check your work, or are not sure what an exact answer would look like, one option (nothing required here!) is to execute the Julia code, but you will get decimals which you will have to think a bit about to recognize as exact numbers.
using LinearAlgebra
r = rank(A)
U,s,V = svd(A)
display(U[:,1:r])
display(s[1:r])
display(V[:,1:r])
3×1 Array{Float64,2}:
 -0.40824829046386313
 -0.8164965809277259
  0.40824829046386296
1-element Array{Float64,1}:
 11.22497216032183
3×1 Array{Float64,2}:
 -0.21821789023599225
 -0.8728715609439696
 -0.43643578047198484
1a. Notice that the second column is 4 times the first column and the third column is 2 times the first column. Thus, the column space is the set of all real multiples of the vector
$$v=\begin{pmatrix}1\\2\\-1\end{pmatrix}$$in $\mathbb{R}^3$.
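(Optional: a quick Julia check of this observation, reusing the matrix A defined above; both comparisons should return true.)
A[:, 2] == 4 * A[:, 1]   # second column is 4 times the first
A[:, 3] == 2 * A[:, 1]   # third column is 2 times the first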
1b. With $v$ as in part 1a, the matrix is
$$A=\begin{pmatrix}1&4&2\\2&8&4\\-1&-4&-2\end{pmatrix}=\begin{pmatrix}v&4v&2v\end{pmatrix}=v\begin{pmatrix}1&4&2\end{pmatrix}.$$Therefore, $A=xy^T$, where
$$x=v=\begin{pmatrix}1\\2\\-1\end{pmatrix}, \hspace{2cm} y=\begin{pmatrix}1\\4\\2\end{pmatrix}.$$

1c. By part 1a, the rank of $A$ is $r=1$. By part 1b, we know $A=xy^T=x \begin{pmatrix}1\end{pmatrix} y^T$, which looks like an SVD, but $x$ and $y$ are not unit vectors, so they cannot serve as the columns of $U$ and $V$. To fix this, we divide $x$ and $y$ by their norms:
$$A=\left(\frac{x}{|x|}\right) \begin{pmatrix}|x||y|\end{pmatrix} \left(\frac{y}{|y|}\right)^T.$$Thus, our answer is $$U=\frac{x}{|x|}=\frac{1}{\sqrt{6}}\begin{pmatrix}1\\2\\-1\end{pmatrix},$$
$$\Sigma=\begin{pmatrix}|x||y|\end{pmatrix}=\begin{pmatrix}\sqrt{6}\sqrt{21}\end{pmatrix}=\begin{pmatrix}3\sqrt{14}\end{pmatrix},$$$$V=\frac{y}{|y|}=\frac{1}{\sqrt{21}}\begin{pmatrix}1\\4\\2\end{pmatrix}.$$Note that the $U$ and $V$ we give here are actually the negatives of the $U$ and $V$ given in the output of the Julia code above, but both are valid SVDs.
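(Optional: a sketch of how one might verify 1b and 1c in Julia, using the x, y and the exact U, Σ, V written above; the names u1, v1, σ1 are just temporary labels for this check. Each comparison should return true.)
x = [1, 2, -1]; y = [1, 4, 2]
x * y' == A                    # 1b: the outer product reproduces A
u1 = x / √6; v1 = y / √21; σ1 = 3 * √14
u1 * σ1 * v1' ≈ A              # 1c: the compact SVD reconstructs A
σ1 ≈ svdvals(A)[1]             # 3√14 matches Julia's computed singular value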
Now consider the 2×2 matrix A = UΣVᵀ, where U, Σ, and V are defined below.
U = [1 -3;3 1] / √10
Σ = [√45 0; 0 √5]
V = [1 -1;1 1] / √2
2×2 Array{Float64,2}:
 0.707107  -0.707107
 0.707107   0.707107
2a. Is A invertible? (Do not multiply through) Why or why not?
2b. What is the column space of A?
2c. What is the nullspace of A?
2d. In factored form (no need to multiply out) write the rank 1 approximation to A corresponding to the image compression notebook we saw in class.
2e. Find an SVD of the matrix AV without multiplying out AV.
2f. Find an SVD of the matrix AVV (you can multiply out V*V using pencil and paper).
You can optionally check your work using Julia.
2a. $A$ is invertible. Because $U$ and $V^T$ are orthogonal square matrices, they are invertible (their inverses are $U^T$ and $V$ respectively). The matrix $\Sigma$ is invertible because it is a diagonal square matrix with nonzero entries on the diagonal. Therefore, $A^{-1}=(U\Sigma V^T)^{-1}=V\Sigma^{-1}U^T$.
2b. The column space of $A$ is all of $\mathbb{R}^2$. To see this, recall
$$\operatorname{col}(A)=\{Ax~:~x \in \mathbb{R}^2\}.$$For any $y \in \mathbb{R}^2$, we have $y=A(A^{-1}y)$, so $y \in \operatorname{col}(A)$.
2c. The nullspace of $A$ is just the zero vector. If $Ax=0$, then $x=A^{-1}0=0$.
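(Optional Julia check of 2a–2c, assuming A is built as UΣVᵀ from the U, Σ, V defined above; each line should return true.)
A = U * Σ * V'
rank(A) == 2                     # 2a/2b: full rank, so A is invertible and its column space is all of R²
inv(A) ≈ V * inv(Σ) * U'         # 2a: the inverse predicted by the factorization
size(nullspace(A), 2) == 0       # 2c: the nullspace contains only the zero vector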
2d. To get the rank 1 approximation to $A$, we keep the larger singular value and throw away the smaller:
$$\text{rank 1 approx.}=\begin{pmatrix}1/\sqrt{10}\\3/\sqrt{10}\end{pmatrix}\begin{pmatrix}\sqrt{45}\end{pmatrix}\begin{pmatrix}1/\sqrt{2}\\1/\sqrt{2}\end{pmatrix}^T.$$

2e. We have $AV=U\Sigma V^TV=U \Sigma I_2$, where $I_2$ is the $2\times 2$ identity matrix. $I_2$ is orthogonal, so $U\Sigma I_2^T$ is an SVD for $AV$.
2f. We have $AVV=U\Sigma I_2 V=U\Sigma V=U\Sigma(V^T)^T$. This is an SVD for $AVV$.
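(Optional Julia check of 2d–2f, again assuming A = UΣVᵀ; each comparison should return true.)
A = U * Σ * V'
F = svd(A)
U[:, 1] * Σ[1, 1] * V[:, 1]' ≈ F.U[:, 1] * F.S[1] * F.V[:, 1]'   # 2d: agrees with Julia's truncated SVD
A * V ≈ U * Σ                    # 2e: AV = UΣ
A * V * V ≈ U * Σ * V            # 2f: AVV = UΣV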
3a. Are the singular values of 2A always double those of A? If yes, why? If not, give a counterexample.
3b. Are the singular values of -2A always -2 times those of A? If yes, why? If not, give a counterexample. (Be careful.)
3a. Yes. If $A=U\Sigma V^T$, then $2A=U(2\Sigma)V^T$. If the diagonal entries of $\Sigma$ are $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r > 0$, then the diagonal entries of $2\Sigma$ are $2\sigma_1 \ge 2\sigma_2 \ge \cdots \ge 2\sigma_r > 0$.
3b. No. The reason is that singular values are never negative, so they cannot be $-2$ times the (positive) singular values of a nonzero matrix. For a counterexample, let
$$A=I_2=\begin{pmatrix}1&0\\0&1\end{pmatrix}.$$The singular values of $A$ are $1$ and $1$. As for $-2A$, we have
$$-2A=\begin{pmatrix}-2&0\\0&-2\end{pmatrix}=\begin{pmatrix}-1&0\\0&-1\end{pmatrix}\begin{pmatrix}2&0\\0&2\end{pmatrix}\begin{pmatrix}1&0\\0&1\end{pmatrix}^T.$$Thus, the singular values of $-2A$ are $2$ and $2$. Really, any nonzero matrix $A$ works as a counterexample.
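(Optional Julia check of 3a and 3b on a freshly generated random 3×3 matrix, which is an illustration rather than a proof; both lines should return true.)
A = randn(3, 3)
svdvals(2A) ≈ 2 * svdvals(A)     # 3a: doubling A doubles the singular values
svdvals(-2A) ≈ 2 * svdvals(A)    # 3b: the minus sign is absorbed; singular values stay nonnegative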
4a. Let A=[1 2 3]. Describe the nullspace of A precisely: which geometric object is it, and with what normal?
4b. Let A=[1 2 3]ᵀ. What is the nullspace of A?
4a. The nullspace of $A$ is the set $$\operatorname{null}(A)=\{x \in \mathbb{R}^3~:~Ax=0\}=\{(x,y,z) \in \mathbb{R}^3~:~x+2y+3z=0\}.$$ Thus, we see that the nullspace is the plane in $\mathbb{R}^3$ containing the point $(0,0,0)$ and with normal vector $$n=\begin{pmatrix}1\\2\\3\end{pmatrix}.$$
4b. The nullspace of $A$ is the set $$\operatorname{null}(A)=\{x \in \mathbb{R}~:~Ax=0\}=\left\{x \in \mathbb{R}~:~\begin{pmatrix}x\\2x\\3x\end{pmatrix}=\begin{pmatrix}0\\0\\0\end{pmatrix}\right\}=\{0\}.$$
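(Optional Julia check of 4a and 4b: nullspace returns an orthonormal basis for the nullspace, so the two cases should give a 2-dimensional basis orthogonal to the normal (1, 2, 3) and an empty basis, respectively. Each line should return true.)
A = [1 2 3]                      # 4a: a 1×3 matrix
N = nullspace(A)                 # 3×2 orthonormal basis of the plane x + 2y + 3z = 0
size(N) == (3, 2)
norm(A * N) < 1e-10              # both basis vectors are orthogonal to the normal (1, 2, 3)
B = [1 2 3]'                     # 4b: the 3×1 transpose, called B here to avoid reusing A
size(nullspace(B), 2) == 0       # nullspace is just {0}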