Due Friday Feb 11 at 11am on Canvas. Submit PDF(s) of your handwritten solutions along with your (clearly labeled) Julia code.
Suppose the matrix $A$ is a product of 3 matrices (as usual, blanks denote zeros): $$ A = BCD = \underbrace{\begin{pmatrix} 1 & 3 & & 1 \\ & 1 & & 2 \\ & & 1 & 1 \\ & & & 1 \end{pmatrix}}_B \underbrace{\begin{pmatrix} -1 & & & \\ & 2 & & \\ & & -3 & \\ & & & -2 \end{pmatrix}}_C \underbrace{ \begin{pmatrix} & & & 1 \\ & & 1 & -1 \\ & 1 & & 3 \\ 1 & 1 & & 1 \end{pmatrix}}_D $$
(a)
Solve $$ Ax = \begin{pmatrix} 12 \\ 6 \\ 32 \\ 2 \end{pmatrix} $$ for $x$ without using Gaussian elimination.
(Hint: think about how we use LU factorization to solve equations.)
(b)
Check your answer in Julia. The matrices `B`, `C`, and `D`, along with the right-hand side `b`, are entered below for your convenience.
B = [1 3 0 1; 0 1 0 2; 0 0 1 1; 0 0 0 1]
C = [-1 0 0 0; 0 2 0 0; 0 0 -3 0; 0 0 0 -2]
D = [0 0 0 1; 0 0 1 -1; 0 1 0 3; 1 1 0 1]
b = [12, 6, 32, 2]
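Once you have a candidate solution from (a), one quick way to verify it is sketched below (the `x = ...` line is a placeholder for your hand-computed answer):
A = B * C * D   # assemble A from its three factors
x = ...         # ← your hand-computed solution from (a)
A * x == b      # all entries are integers, so this exact check should print "true"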
Consider the $5 \times 5$ matrix: $$ A = \begin{pmatrix} 1 & -1 & & & \\ 2 & -1 & 1 & & \\ & 3 & 4 & -2 & \\ & & -2 & 5 & -2 \\ & & & -1 & 3 \end{pmatrix} $$
(a)
Compute (by hand) its LU factorization $A = LU$ via Gaussian elimination (i.e. give both $L$ and $U$).
Do you notice any special pattern in the nonzero entries?
(b)
The pattern of nonzero entries in this matrix $A$ is called tridiagonal, and you should have noticed that Gaussian elimination is a lot easier for a tridiagonal matrix than it would be on an arbitrary $5 \times 5$ matrix.
Julia has a special `Tridiagonal` matrix type in its `LinearAlgebra` library to exploit this structure. You enter it something like this:
using LinearAlgebra # load the LinearAlgebra package
# almost right:
A = Tridiagonal([2, 3, -2, -1], [0,0,0,0,0], [-1, 1, -2, -2])
The above code is not quite right for the matrix $A$ in this problem. Fix it. (Check the `Tridiagonal` documentation if it isn't obvious when you run the code.)
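If it helps, here is a small example (unrelated to this problem's $A$) sketching how the constructor's three vector arguments are laid out in the matrix:
# Tridiagonal(subdiagonal, diagonal, superdiagonal) for a 3×3 example:
Tridiagonal([4, 5], [1, 2, 3], [6, 7])
# 3×3 Tridiagonal:
#  1  6  ⋅
#  4  2  7
#  ⋅  5  3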
Once it's fixed, check your $L$ and $U$ factors from (a) using the Julia `lu` function on `A`:
lu(A, NoPivot()) # compute the LU factorization, without row swaps ("pivoting")
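The returned factorization object bundles both triangular factors; one way to pull them out for a side-by-side comparison (storing the object in a variable, here called `F`):
F = lu(A, NoPivot())  # store the factorization object
F.L                   # lower-triangular factor: compare with your L from (a)
F.U                   # upper-triangular factor: compare with your U from (a)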
(c)
Suppose you carried out arithmetic at the same rate, but $A$ was 10 times larger: a $50\times 50$ tridiagonal matrix.
About how much longer would part (a) have taken? 10 times, 100 times, 1000 times?
Consider the following matrices:
$$ U = \begin{pmatrix} 1 & 1 & -1 \\ 0 & 1 & 2 \\ 0 & 0 & 1 \end{pmatrix}, \; L = \begin{pmatrix} 1 & 0 & 0 \\ -1 & 1 & 0 \\ -2 & 1 & 1 \end{pmatrix}, \; B = \begin{pmatrix} 1 & 2 & 3 \\ 3 & 2 & 1 \\ 1 & 0 & 1 \end{pmatrix} $$
Let $A = U B^{-1} L$.
(a) Compute the second column of $A^{-1}$. (If you think about it, you can do it without inverting any matrices.)
(b) Check your answer by explicitly computing $A^{-1}$ in Julia.
# fill these in:
U = ...
L = ...
B = ...
A = U * (B \ L) # computes UB⁻¹L
A^-1 # computes A⁻¹
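As an optional extra check aimed at (a) specifically, you can compare just the second column:
(A^-1)[:, 2]   # second column of A⁻¹: compare with your answer to (a)
A \ [0, 1, 0]  # the same column, obtained by solving Ay = e₂ instead of inverting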
From Strang, section 2.6. Consider $$L = \begin{pmatrix} 1 & 0 & 0 \\ a & 1 & 0 \\ b & c & 1 \end{pmatrix}$$ for some numbers $a,b,c$.
(a) When you apply the usual Gaussian elimination steps to $L$, what matrix will you obtain?
(b) If you apply the same row operations to $I$, what matrix will you get?
(c) If you apply the same row operations to $LB$ for some $3\times n$ matrix $B$, what will you get?
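All three parts can be sanity-checked numerically. Here is an optional sketch (not part of the assigned problem) that picks arbitrary values for $a$, $b$, $c$ and applies the elimination steps as explicit matrix multiplications:
a, b, c = 2, 3, 4             # arbitrary sample values
L = [1 0 0; a 1 0; b c 1]
E21 = [1 0 0; -a 1 0; 0 0 1]  # subtract a × (row 1) from row 2
E31 = [1 0 0; 0 1 0; -b 0 1]  # subtract b × (row 1) from row 3
E32 = [1 0 0; 0 1 0; 0 -c 1]  # subtract c × (row 2) from row 3
E32 * E31 * E21 * L           # compare with your answer to (a)
E32 * E31 * E21               # the same operations applied to I give this product, cf. (b)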
Suppose that $A$ and $B$ are $100 \times 100$ matrices, and we break each of them into 4 equal blocks ("sub-matrices"): $$ A = \begin{pmatrix} A_1 & A_2 \\ A_3 & A_4 \end{pmatrix}, \qquad B = \begin{pmatrix} B_1 & B_2 \\ B_3 & B_4 \end{pmatrix} $$ where each of the blocks $A_k$ and $B_k$ is a $50 \times 50$ matrix.
Let's write the product $C = AB$, another $100 \times 100$ matrix, as a collection of four $50 \times 50$ sub-matrices: $$ C = AB = \begin{pmatrix} C_1 & C_2 \\ C_3 & C_4 \end{pmatrix} $$
(a)
Write a formula for $C_1$ (the upper-left $50 \times 50$ block of $C$) in terms of matrix additions and multiplications of the blocks $A_1,\ldots,A_4$ and $B_1,\ldots,B_4$.
(For example "$C_1 = A_1 B_4 - A_4 B_1 + A_3 B_2$" is the wrong answer, but is the type of thing I'm looking for.)
(If you get confused, maybe draw a picture of how just one element of $C_1$ is computed from a row of $A$ times a column of $B$, and then think about how that dot product relates to sub-matrix products like $A_1 B_1$. Alternatively, try it with a 4×4 matrix and 2×2 blocks.)
(b)
Check your answer with a random matrix in Julia:
# 4 random blocks of A
A1, A2, A3, A4 = rand(50,50), rand(50,50), rand(50,50), rand(50,50)
# 4 random blocks of B
B1, B2, B3, B4 = rand(50,50), rand(50,50), rand(50,50), rand(50,50)
# make A and B out of these blocks:
A = [A1 A2
A3 A4]
B = [B1 B2
B3 B4]
# make C = AB
C = A*B
# extract C₁ from the upper-left 50×50 block of C:
C1 = C[1:50, 1:50]
# finally, check that C1 equals (up to roundoff error) your formula from (a):
your_C1 = # ← YOUR (a) HERE ...some formula in terms of A1,A2,A3,A4,B1,B2,B3,B4
C1 ≈ your_C1 # should print "true"
Suppose that we did something similar to Gaussian elimination on a matrix $A$, but with column operations rather than row operations, to turn it into the following matrix $B$:
$$ \underbrace{ \begin{pmatrix} 2 & 4 & 6 \\ 3 & 1 & 10 \\ & -1 & 3 \end{pmatrix}}_A \longrightarrow \underbrace{\begin{pmatrix} 2 & & \\ 3 & -5 & 1 \\ & -1 & 3 \end{pmatrix}}_B $$
For example, we subtracted twice the first column of $A$ from the second column of $A$ to get the second column of $B$.
(a)
How did we get the third column of $B$ from columns of $A$?
(b)
You can represent these column operations as a matrix multiplying $A$ on the _________ (left or right).
(c)
Following your answer from (b), write $B$ as a matrix product of $A$ with some other matrix $E$.
(d)
Enter your `E` in Julia and check that `B` is indeed given by the product from your previous part.
A = [2 4 6; 3 1 10; 0 -1 3]
B = [2 0 0; 3 -5 1; 0 -1 3]
E = # YOUR ANSWER HERE
# check:
B == # A*E or E*A ?