(1) Warmup: this should perhaps be second nature by now:
(A) If $r=m=n$ (square), $Ax=b$ has ________ solution(s).
(B) If $m>r=n$ (tall skinny, full column rank), $Ax=b$ has _____ or ______ solution(s).
(C) If $m=r<n$ (short wide, full row rank), $Ax=b$ has ____________ solution(s).
(D) If $r<m$, $r<n$, $Ax=b$ has ________ or ____________ solutions.
Solution
(A) If $r=m=n$ (square), $Ax=b$ has $\boxed{1}$ solution.
(B) If $m>r=n$ (tall skinny, full column rank), $Ax=b$ has $\boxed{0}$ or $\boxed{1}$ solution.
(C) If $m=r<n$ (short wide, full row rank), $Ax=b$ has $\boxed{\text{infinitely many}}$ solutions.
(D) If $r<m$, $r<n$, $Ax=b$ has $\boxed{0}$ or $\boxed{\text{infinitely many}}$ solutions.
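As an optional numerical illustration of case (B) (a sketch of ours, not part of the original problem), the following Julia snippet builds a random tall-skinny matrix; the names b_good and b_bad are our own, chosen so that one right-hand side lies in the column space and the other almost surely does not.

# Case (B): tall skinny, full column rank -- 0 or 1 solutions
using LinearAlgebra
A = randn(5, 3)                  # full column rank with probability 1
b_good = A * [1.0, 2.0, 3.0]     # constructed to lie in col(A)
norm(A * (A \ b_good) - b_good)  # ≈ 0: the system has exactly one solution
b_bad = randn(5)                 # a generic b is almost surely not in col(A)
norm(A * (A \ b_bad) - b_bad)    # > 0: no exact solution, only a least-squares fit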
(2) Warmup: If a matrix $A$ is $m \times n$, what are the possible values of the following?
(A) dim(col(A))?
(B) dim(row(A))+dim(null(A))?
(C) the sum of the dimensions of the four fundamental subspaces?
(D) dim(col(A)) + dim(row(A))?
Solution
For the solution, recall the Fundamental Theorem of Linear Algebra: $\dim(\text{col}(A)) = \dim(\text{row}(A)) = r$, $\dim(\text{nul}(A)) = n-r$, and $\dim(\text{nul}(A^T)) = m-r$.
(A) dim(col(A)) can be any integer from $0$ up to $\min(m,n)$.
(B) dim(row(A))+dim(nul(A)) = dim(col(A))+dim(nul(A)) = $n$.
(C) The sum of the dimensions of the four fundamental subspaces is $r+r+(n-r)+(m-r) = n+m$.
(D) dim(col(A)) + dim(row(A)) = $r+r = 2r$.
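If you would like to see these counts on a concrete matrix, here is an optional Julia check on a random $5\times4$ matrix of rank 2 (the construction via a product of rank-2 factors is our own choice):

# Verifying (A)-(D) numerically on a random rank-2 matrix
using LinearAlgebra
m, n, r = 5, 4, 2
A = randn(m, r) * randn(r, n)       # a product of rank-r factors has rank r
rank(A)                             # 2, within the bound min(m,n) from (A)
rank(A) + size(nullspace(A), 2)     # r + (n-r) = n = 4, as in (B)
size(nullspace(A), 2) + size(nullspace(A'), 2) + 2*rank(A)  # n + m = 9, as in (C)
rank(A) + rank(A')                  # 2r = 4, as in (D)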
(3) Consider the vector space Poly4 of polynomials of the form $P(x)=a+bx+cx^2+dx^3+ex^4$.
The operator $d/dx$ takes $P(x)$ to $P'(x)$.
(A) What kinds of functions should we say live in the nullspace of $d/dx$?
(B) What kinds of functions are in the analog of the column space? (If the column space is $\{Av\}$ for column vectors, then perhaps it should be $\{P' \text{ for } P \text{ in Poly4}\}$.)
(C) What should we call the rank of $d/dx$ acting on Poly4?
(D) If $P(x)$ is in the "column space", what is the analog of the solution to $Ax=b$?
(E) Are there 0, 1, or infinitely many solutions?
Solution
(A) The nullspace consists of all polynomials $P(x)$ such that the operator sends $P(x)$ to zero, i.e. $\frac{d}{dx} P(x) = 0$. The only polynomials that have zero derivative are constants, so nul$(\frac{d}{dx}) = \left\{ P(x) = a \mid a \in \mathbb{R} \right\}$.
(B) col$(\frac{d}{dx}) = \left\{ \frac{d}{dx} P(x) \mid P(x) \in \mathrm{Poly}4 \right\} = \left\{ P(x) = a + bx + cx^2 + dx^3 \mid a, b, c, d \in \mathbb{R} \right\} = \mathrm{Poly}3$, because we can only obtain polynomials of degree at most three by differentiating polynomials of degree four, and we can get any polynomial of degree three this way.
(C) Since the column space of $\frac{d}{dx}$ is generated by the monomials $1$, $x$, $x^2$ and $x^3$, and since these are linearly independent, they form a basis for the column space. Therefore the dimension of the column space, which is the rank of the operator, is four.
(D) We can rewrite the equation in the form $\frac{d}{dx} Q(x) = P(x)$, where $P(x)$ is a function in the column space and $Q(x)$ is the solution we are trying to find. This is an easy differential equation: by the Fundamental Theorem of Calculus, $Q(x) = \int_{0}^{x} P(t)\, dt + \text{const}$.
(E) If $P(x)$ is in the column space (that is, if $\deg P(x) \leq 3$), then there are infinitely many solutions, because adding a constant to a solution yields another solution. If $P(x)$ is not in the column space, then there are no solutions.
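To make this concrete, here is an optional sketch (not part of the original solution) that represents $\frac{d}{dx}$ as a $5\times5$ matrix in the basis $1, x, x^2, x^3, x^4$; column $j$ holds the coefficients of the derivative of the $j$-th basis monomial.

# d/dx on Poly4 as a matrix in the basis 1, x, x^2, x^3, x^4
using LinearAlgebra
D = Float64[0 1 0 0 0;
            0 0 2 0 0;
            0 0 0 3 0;
            0 0 0 0 4;
            0 0 0 0 0]
rank(D)        # 4, the rank found in part (C)
nullspace(D)   # one basis vector, proportional to e1: the constant polynomials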
(4) This is GS p. 179 problem 39 with hints. (no need to see the original)
Suppose A is 5 by 4 with rank 4.
(A) The columns of A are necessarily/possibly/never independent? (circle the best answer)
(B) The columns of A necessarily/possibly/never form a basis for $R^5$.
(C) Suppose there is a vector $b$ which makes $[A \ b]$ an invertible square matrix. Then the columns of $[A \ b]$ are necessarily/possibly/never independent?
(D) Under the assumption in (C), $Ax=b$ cannot be solved. Why not?
(E) What is the possible rank of $[A \ b]$ if $[A \ b]$ is not invertible? Why?
(F) Show that if $[A \ b]$ is singular (not invertible), then $Ax=b$ must be solvable.
Solution
(A) The columns of $A$ are necessarily independent, because the dimension of the column space is four by assumption, and there are exactly four columns.
(B) The columns of $A$ are never a basis for $\mathbb{R}^5$, because four vectors cannot span a five-dimensional space.
(C) Since the matrix $B = \left[A \, b\right]$ is invertible, the system $Bx = c$ has exactly one solution for every right-hand side $c$; in particular $Bx = 0$ has only the trivial solution, which happens exactly when the columns of $B$ are independent. So the columns of $\left[A \, b\right]$ are necessarily independent.
(D) Since $b$ is linearly independent from the columns of $A$, it does not lie in the linear span of those, which means that $b$ is not in the column space of $A$. Hence the system $Ax = b$ never has a solution.
(E) By assumption, rank of $A$ is four, so rank of $\left[A \, b\right]$ is at least four. If the matrix is not invertible, it means that the columns of $\left[A \, b\right]$ are linearly dependent. This only happens when $b$ lies in the column space of $A$, so by adding $b$, we get the same column space, and hence rank of $\left[A \, b\right]$ is still four.
(F) By previous argument, $b$ lies in the column space of $A$, hence $Ax = b$ is necessarily solvable.
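Here is an optional Julia illustration of (A)-(F) on a random instance (our own construction; a random $5\times4$ matrix has rank 4 with probability 1):

# A is 5x4 with rank 4; compare [A b] for b outside and inside col(A)
using LinearAlgebra
A = randn(5, 4)
rank(A)                      # 4: columns independent, as in (A)
b_out = nullspace(A')[:, 1]  # a vector not in col(A)
rank([A b_out])              # 5: [A b] invertible, so Ax = b_out is unsolvable, as in (C)-(D)
b_in = A * randn(4)          # a vector in col(A)
rank([A b_in])               # 4: [A b] singular and Ax = b_in is solvable, as in (E)-(F)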
(5) This is exactly GS p. 191 problem 17. This problem is completely doable by eyeballing with pencil and paper, but you may use Julia if you like.
Describe the four subspaces of $R^3$ associated with
# (5A)
A = [ 0 1 0;0 0 1;0 0 0]
3×3 Array{Int64,2}:
 0  1  0
 0  0  1
 0  0  0
# and (5B)
using LinearAlgebra
I + A
3×3 Array{Int64,2}:
 1  1  0
 0  1  1
 0  0  1
Solution
Let us introduce the following notation: $$ e_1 = \begin{pmatrix} 1\\0\\0 \end{pmatrix} \mbox{, } e_2 = \begin{pmatrix} 0\\1\\0 \end{pmatrix} \mbox{, } e_3 = \begin{pmatrix} 0\\0\\1 \end{pmatrix} \mbox{.}$$
(A) $\text{col}(A) = \text{Span}\left( e_1, e_2 \right)$, so it is the $xy$-plane;
$\text{nul}(A) = \text{Span}(e_1)$;
$\text{row}(A) = \text{Span}(e_2, e_3)$;
$\text{nul}(A^T) = \text{Span}(e_3)$.
(B) $\text{col}(A) = \text{Span}(e_1, e_1+e_2, e_2+e_3) = \text{Span}(e_1,e_2,e_3) = \mathbb{R}^3$;
$\text{nul}(A) = 0$;
$\text{row}(A) = \text{Span}(e_1+e_2,e_2+e_3,e_3) = \text{Span}(e_1,e_2,e_3) = \mathbb{R}^3$;
$\text{nul}(A^T) = 0$.
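These claims are easy to confirm in Julia; the following optional check uses nullspace(), which returns an orthonormal basis for the nullspace.

# Checking the four subspaces of (5A) and (5B)
using LinearAlgebra
A = [0 1 0; 0 0 1; 0 0 0]
rank(A)         # 2: col(A) and row(A) are planes
nullspace(A)    # spanned by e1
nullspace(A')   # spanned by e3
rank(I + A)     # 3: col and row of I+A are all of R^3, both nullspaces are {0}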
(6) Suppose we have $A=[u \ w][v \ x]^T$ without any assumptions whatsoever on the vectors $u,v,w,x$ other than what is required by block notation.
(A) What are the possible ranks of $A$ assuming $A$ is $m \times n$?
(B) Under what conditions is the rank of $A$ 0?
(C) Under what conditions is the rank of $A$ 1?
Solution
First, let's multiply $A$ out: $A = uv^T + wx^T$. From the dimensions of $A$, we can see that $u$ and $w$ are $m$-vectors and $v$ and $x$ are $n$-vectors. Let us write $A$ in block notation, writing $u$ and $w$ as single blocks and $v$ and $x$ in components: $$ A = u \begin{pmatrix} v_1 & \cdots & v_n \end{pmatrix} + w \begin{pmatrix} x_1 & \cdots & x_n \end{pmatrix} = \begin{pmatrix} v_1u + x_1w & \cdots & v_nu + x_nw \end{pmatrix} \mbox{.} $$
(A) $A$ can be of rank 0, 1 or 2, because every column of $A$ is a linear combination of $u$ and $w$.
(B) The rank of $A$ is zero exactly when $A$ is itself the zero matrix, that is, when $v_iu = -x_iw$ for each $i$. One way this happens is for each term to vanish on its own: $uv^T = 0$ exactly when $u = 0$ or $v = 0$, and likewise $wx^T = 0$ exactly when $w = 0$ or $x = 0$. Otherwise both terms are nonzero, and for some $i$ the coefficients $v_i$ and $x_i$ are both nonzero, so $u$ is parallel to $w$: writing $w = \lambda u$ for some nonzero $\lambda \in \mathbb{R}$, the ratio $\frac{v_i}{x_i}$ must be independent of $i$ and equal to $-\lambda$, that is, $x = -\frac{1}{\lambda} v$ and $v$ is parallel to $x$.
To sum up, $A$ has zero rank in one of the following cases: either both terms vanish individually ($u = 0$ or $v = 0$, and $w = 0$ or $x = 0$), or $w = \lambda u$ and $x = -\frac{1}{\lambda} v$ for some $\lambda \neq 0$, so that $wx^T = -uv^T$.
(C) The matrix $A$ has rank one when it can be written as a product $A = yz^T$ of a nonzero column vector and a nonzero row vector. Similarly to part (B), writing things out in components gives the following cases: exactly one of the terms $uv^T$ and $wx^T$ is nonzero; or both are nonzero with $u$ parallel to $w$, say $w = \lambda u$, so that $A = u(v + \lambda x)^T$, which has rank one provided $v + \lambda x \neq 0$; or both are nonzero with $v$ parallel to $x$, say $v = \mu x$, so that $A = (\mu u + w)x^T$, which has rank one provided $\mu u + w \neq 0$.
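As an optional sanity check, here are random instances of our own choosing that illustrate (rather than prove) the case analysis:

# Rank possibilities of A = u*v' + w*x' for m = 5, n = 4
using LinearAlgebra
u, w = randn(5), randn(5)
v, x = randn(4), randn(4)
rank(u*v' + w*x')           # 2 for generic vectors, as in (A)
rank(u*v' + (2u)*x')        # w parallel to u: the rank drops to 1, as in (C)
rank(u*v' + (2u)*(-v/2)')   # w = 2u, x = -v/2, so wx' = -uv' and the rank is 0, as in (B)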
(7) Write the unique matrices $U_1$ and $U_{-1}$ that make these equalities hold: (the row vectors contain functions of $x$)
(A) $[1 \ \ (x+1) \ \ (x+1)^2 \ \ (x+1)^3 \ \ (x+1)^4] \ = \ [1 \ \ x \ \ x^2 \ \ x^3 \ \ x^4] \ U_{1}$
(B) $[1 \ \ (x-1) \ \ (x-1)^2 \ \ (x-1)^3 \ \ (x-1)^4] \ = \ [1 \ \ x \ \ x^2 \ \ x^3 \ \ x^4] \ U_{-1}$
(C) Suppose $p(x)$ is a polynomial of degree at most 4 written as $a+bx+cx^2+dx^3+ex^4$. The polynomial whose coefficients are represented by $$U_1 \begin{pmatrix} a \\ b \\ c \\ d \\ e \end{pmatrix}$$ is (choose one)
a) $p(2x)$ b) $p(x+1)$ c) $p(x-1)$ d) $p(x^2)$
(D) Argue without any calculation or mention of the entries of $U_1$ and $U_{-1}$ that $U_1$ and $U_{-1}$ must be inverses of each other. (Remark: Isn't it cool that you can understand an inverse rather than compute it?)
(E) What is the polynomial obtained by applying $U_1^3$ to a vector of coefficients? (no computation please, just brain power)
a) $p(x+3)$ b) $p(x-3)$ c) $p(x)^3$ d) $p(x^3)$
Solution
(A) \begin{align*}&\begin{pmatrix} 1 & x+1 & (x+1)^2 & (x+1)^3 & (x+1)^4 \end{pmatrix} \\ &= \begin{pmatrix} 1 & x+1 & x^2 + 2x + 1 & x^3 + 3x^2 + 3x + 1 & x^4 + 4x^3 + 6x^2 + 4x + 1 \end{pmatrix} \\ &= \begin{pmatrix} 1 & x & x^2 & x^3 & x^4 \end{pmatrix} \begin{pmatrix} 1&1&1&1&1 \\ &1&2&3&4 \\ &&1&3&6 \\ &&&1&4 \\ &&&&1 \\ \end{pmatrix}\end{align*}.
(B) \begin{align*}&\begin{pmatrix} 1 & x-1 & (x-1)^2 & (x-1)^3 & (x-1)^4 \end{pmatrix} \\ &= \begin{pmatrix} 1 & x-1 & x^2 - 2x + 1 & x^3 - 3x^2 + 3x - 1 & x^4 - 4x^3 + 6x^2 - 4x + 1 \end{pmatrix} \\ &= \begin{pmatrix} 1 & x & x^2 & x^3 & x^4 \end{pmatrix} \begin{pmatrix} 1&-1&1&-1&1 \\ &1&-2&3&-4 \\ &&1&-3&6 \\ &&&1&-4 \\ &&&&1 \\ \end{pmatrix}\end{align*}.
(C) Note that $$ p(x) = \begin{pmatrix} 1 & x & x^2 & x^3 & x^4 \end{pmatrix} \begin{pmatrix} a\\b\\c\\d\\e \end{pmatrix} \mbox{,} $$ so the new polynomial is $$ \begin{pmatrix} 1 & x & x^2 & x^3 & x^4 \end{pmatrix} U_1 \begin{pmatrix} a\\b\\c\\d\\e \end{pmatrix} = \begin{pmatrix} 1 & x+1 & (x+1)^2 & (x+1)^3 & (x+1)^4 \end{pmatrix} \begin{pmatrix} a\\b\\c\\d\\e \end{pmatrix} \mbox{.} $$ Hence, by performing the matrix multiplication, we get the answer $\boxed{p(x+1)}$.
(D) $U_1$ and $U_{-1}$ can both be viewed as linear transformations of the space $\mathrm{Poly}4$ when we multiply the vector of coefficients by $U_1$ or $U_{-1}$ as in part (C). The former takes a polynomial $p(x)$ and produces $p(x+1)$, while the latter takes $p(x)$ to $p(x-1)$. These operations are inverses of each other, and therefore the corresponding matrices are inverses of each other.
(E) Applying $U_1$ thrice corresponds to performing the operation $p(x) \mapsto p(x+1)$ thrice, therefore the answer is $\boxed{p(x+3)}$.
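For the skeptical, here is an optional Julia check; the construction of $U_1$ and $U_{-1}$ from binomial coefficients is our own shortcut, reading the coefficients of $(x \pm 1)^j$ off the binomial theorem rather than copying the matrices above.

# Column j holds the coefficients of (x+1)^(j-1), resp. (x-1)^(j-1)
using LinearAlgebra
U1  = [binomial(j, i) for i in 0:4, j in 0:4]
Um1 = [(iseven(j - i) ? 1 : -1) * binomial(j, i) for i in 0:4, j in 0:4]
U1 * Um1 == I             # true: they are inverses of each other, as in (D)
coeffs = [1, 2, 0, 0, 1]  # p(x) = 1 + 2x + x^4
U1^3 * coeffs             # the coefficients of p(x+3), as in (E)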
(8) (Exact Copy of Strang p.178 problem 29) What subspace of 3x3 matrices is spanned (take all combinations) by
(A) The invertible matrices?
(B) The rank one matrices?
(C) The identity matrix?
Solution
(A) Among the invertible matrices are the diagonal matrices $M_1 = \text{diag}(2,1,1)$, $M_2 = \text{diag}(1,2,1)$, $M_3 = \text{diag}(1,1,2)$ and $I = \text{diag}(1,1,1)$. Taking the linear combinations $M_i - I$ shows that the diagonal matrices $\text{diag}(1,0,0)$, $\text{diag}(0,1,0)$ and $\text{diag}(0,0,1)$ lie in the span. Moreover, the matrices $M_{ij}$, $i \neq j$, which have ones on the diagonal, a one in the $(i,j)$-th position and zeroes everywhere else, are invertible, so the matrices $M_{ij} - I$ also lie in the span of invertible matrices. Thus every matrix unit -- a matrix with a 1 in one place and 0 everywhere else -- is spanned by invertible matrices. But the matrix units form a basis of the space of all $3\times3$ matrices, so the invertible matrices span the whole space.
(B) Matrix units are rank one, so again, all matrices are spanned.
(C) A single nonzero element always spans a line: the span of $I$ is the one-dimensional subspace of multiples of the identity, which are called \textit{scalar matrices}.
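An optional flatten-and-rank check in the spirit of the Julia code in problem (9) (the snippet is ours): the nine matrix units have rank one, and stacking their vectorizations gives a matrix of rank 9, so they span the whole nine-dimensional space, confirming (A) and (B).

# The nine matrix units (each of rank one) span all 3x3 matrices
using LinearAlgebra
units = [[Int(i == a && j == b) for i in 1:3, j in 1:3] for a in 1:3, b in 1:3]
rank(hcat(vec.(units)...))   # 9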
(9)
(A) Are the six three by three permutation matrices linearly dependent or linearly independent?
(B) Do they span the nine dimensional space of 3x3 matrices or a subspace (if so, what dimension?)
You can do this with pencil and paper, or optionally you can use the Julia code below to help you (explanations of all the syntax are provided).
(C) Describe the subspace of 3x3 matrices spanned by the permutation matrices.
Note: learning Julia is not required for this course. You can ignore the Julia for this problem, execute it without understanding it, or try to understand the syntax -- your choice, depending on your interests and background. The minimal-time option for many of you may be the second choice (look at the output without trying to fully understand the syntax); the maximal educational opportunity is to do it by hand and also learn the Julia syntax.
# The following code displays the permutation matrices for your convenience
Permutations = [[1,2,3],[1,3,2],[2,1,3],[2,3,1],[3,1,2],[3,2,1]] # all permutations of 1:3, as a 6-vector of 3-vectors
Identity = [1 0 0;0 1 0;0 0 1] # 3x3 identity matrix
PermutationMatrices = [Identity[p,:] for p in Permutations] # 6-vector of permutation matrices (rows of I permuted by p)
display.(PermutationMatrices); # apply the display function elementwise just to see the permutation matrices
3×3 Array{Int64,2}:
 1  0  0
 0  1  0
 0  0  1

3×3 Array{Int64,2}:
 1  0  0
 0  0  1
 0  1  0

3×3 Array{Int64,2}:
 0  1  0
 1  0  0
 0  0  1

3×3 Array{Int64,2}:
 0  1  0
 0  0  1
 1  0  0

3×3 Array{Int64,2}:
 0  0  1
 1  0  0
 0  1  0

3×3 Array{Int64,2}:
 0  0  1
 0  1  0
 1  0  0
# Compute a 9x6 matrix of squashed permutation matrices
# vec flattens a 3x3 matrix to a 9-vector and hcat stores these as the columns of a matrix. The three dots are called splat; they spread the 6-vector of flattened matrices into separate arguments of hcat.
A = hcat(vec.(PermutationMatrices)...)
9×6 Array{Int64,2}:
 1  1  0  0  0  0
 0  0  1  0  1  0
 0  0  0  1  0  1
 0  0  1  1  0  0
 1  0  0  0  0  1
 0  1  0  0  1  0
 0  0  0  0  1  1
 0  1  0  1  0  0
 1  0  1  0  0  0
# Compute the rank of the above 9x6 matrix
using LinearAlgebra
rank(A)
5
# Compute the full SVD, to identify the nullspace vector
U,s,V = svd(A,full=true)
V
6×6 Adjoint{Float64,Array{Float64,2}}:
 -0.408248   4.61882e-17    0.0        -0.0        0.816497     -0.408248
 -0.408248   0.71506        0.362882    0.153858  -1.60247e-17   0.408248
 -0.408248  -0.484619      -0.225347    0.617276   1.66112e-16   0.408248
 -0.408248  -0.316804       0.63191    -0.018032  -0.408248     -0.408248
 -0.408248   0.316804      -0.63191     0.018032  -0.408248     -0.408248
 -0.408248  -0.230441      -0.137535   -0.771134   1.38357e-16   0.408248
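Since the rank is 5, the sixth singular value is zero and the last column of V spans the nullspace of A. Rescaling that column (an optional extra step of ours) reveals the integer dependence used in the solution below:

# The last column of V pairs with the zero singular value, so it spans nul(A)
round.(Int, sqrt(6) * V[:, end])   # [-1, 1, 1, -1, -1, 1]: -P1 + P2 + P3 - P4 - P5 + P6 = 0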
Solution
(A) They are linearly dependent with the following nontrivial relation: $$ \begin{pmatrix} 1&& \\ &1& \\ &&1 \end{pmatrix} + \begin{pmatrix} &1&\\&&1\\1&& \end{pmatrix} + \begin{pmatrix} &&1\\1&&\\&1& \end{pmatrix} = \begin{pmatrix} &&1\\&1&\\1&& \end{pmatrix} + \begin{pmatrix} 1&&\\&&1\\&1& \end{pmatrix} + \begin{pmatrix} &1&\\1&&\\&&1 \end{pmatrix} \mbox{.} $$
(B) Since there are only six of them, they span a proper subspace of the nine-dimensional space. The dimension of this subspace is 5, because any five of the six are linearly independent. For example: $$ a\begin{pmatrix} 1&&\\&1&\\&&1 \end{pmatrix} + b\begin{pmatrix} &1&\\&&1\\1&& \end{pmatrix} + c\begin{pmatrix} &&1\\1&&\\&1& \end{pmatrix} + d\begin{pmatrix} &&1\\&1&\\1&& \end{pmatrix} + e\begin{pmatrix} 1&&\\&&1\\&1& \end{pmatrix} = 0 $$ implies $b = c = 0$ (look at the $(1,2)$ and $(2,1)$ entries), then $c + d = 0$ and $b + e = 0$ (the $(1,3)$ and $(2,3)$ entries) give $d = e = 0$, and finally $a + e = 0$ (the $(1,1)$ entry) gives $a = 0$. So these five matrices are linearly independent and hence span a 5-dimensional subspace.
(C) The matrices in the linear span of the permutation matrices can alternatively be described by the following conditions: the three row sums are all equal (two independent conditions, $r_1 = r_2$ and $r_2 = r_3$) and the three column sums are all equal (two more, $c_1 = c_2$ and $c_2 = c_3$).
These four independent conditions cut the nine-dimensional space of $3\times3$ matrices down to a five-dimensional subspace, matching part (B). They also imply another fact about these matrices: the common row sum equals the common column sum. To see this, notice that the sum of all nine entries can be expressed either as three times the common row sum or as three times the common column sum, so the two sums give the same value.
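To test this description numerically, here is an optional snippet (ours) reusing the 9×6 matrix A built above: a matrix with equal row sums and equal column sums should add nothing new to the column space.

# M has all row sums and all column sums equal to 6
M = [1 2 3; 3 2 1; 2 2 2]
rank(hcat(A, vec(M))) == rank(A)   # true: vec(M) already lies in col(A)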
(10) Suppose y₁(x),y₂(x),y₃(x),y₄(x) are four non-zero polynomials of degree at most 2. (This means the functions have the form ax²+bx+c, where at least one of the coefficients is nonzero.) What possibilities are there for the dimension of the vector space spanned by y₁(x),y₂(x),y₃(x),y₄(x)? Give examples for each possibility and explain briefly why no other dimension can happen.
Solution
The space $\mathrm{Poly}2$ of polynomials of degree at most two is three-dimensional, so the dimension of the span is at most three. Since all four polynomials are nonzero, the dimension is at least one. Each of the dimensions 1, 2 and 3 does occur: $x, 2x, 3x, 4x$ span a one-dimensional space; $1, x, 1+x, 2x$ span a two-dimensional space; and $1, x, x^2, 1+x+x^2$ span all of $\mathrm{Poly}2$. No other dimension can happen: dimension 0 would force all four polynomials to be zero, and a dimension of four or more is impossible inside a three-dimensional space.
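As an optional Julia check of the three examples above, put each polynomial's coefficients (in the basis $1, x, x^2$) into a column and compute the rank:

# Columns are coefficient vectors in the basis 1, x, x^2
using LinearAlgebra
rank([0 0 0 0; 1 2 3 4; 0 0 0 0])   # 1: the span of x, 2x, 3x, 4x
rank([1 0 1 0; 0 1 1 2; 0 0 0 0])   # 2: the span of 1, x, 1+x, 2x
rank([1 0 0 1; 0 1 0 1; 0 0 1 1])   # 3: the span of 1, x, x^2, 1+x+x^2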