Due Monday September 26 (since Friday Sep 23 is a holiday) at 11am on GradeScope. Submit PDF of your handwritten solutions along with your (clearly labeled) Julia code.
Consider the following $5 \times 5$ matrix, which is almost upper triangular: $$ A = \begin{pmatrix} 2 & -2 & 0 & 1 & 1\\ 2 & -1 & 1 & -2 & 1\\ & 3 & 4 & -3 & 1\\ & & -2 & -5 & -3\\ & & & 7 & 3\\ \end{pmatrix} $$
In Julia, this corresponds to:
A = [2 -2 0 1 1; 2 -1 1 -2 1; 0 3 4 -3 1; 0 0 -2 -5 -3; 0 0 0 7 3]
(a)
Compute (by hand) its LU factorization $A = LU$ via Gaussian elimination (i.e. give both $L$ and $U$).
Do you notice any special pattern in the nonzero entries of $L$ and/or $U$?
(b)
Why doesn't your hand calculation match the output of Julia's lu(A) function? Try it; it's not even close! (Do using LinearAlgebra in order to access the lu function.)
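If you want to try it right away, a minimal sketch (assuming the matrix A defined in the Julia code above) might look like:

using LinearAlgebra   # needed to access lu
F = lu(A)             # Julia's factorization object; F.L and F.U are its triangular factors
F.L * F.U             # multiply Julia's factors back together and compare with A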
(c)
Suppose you carried out arithmetic at the same rate, but $A$ was 10 times larger: a $50\times 50$ nearly upper triangular matrix (upper triangular + nonzeros just below the diagonal).
About how much longer should part (a) take? 10 times, 100 times, or 1000 times?
Is this different from how it would scale for a general matrix with no special pattern of zero entries?
Consider the three $4 \times 4$ matrices $$ A = \begin{pmatrix} 1 & -3 & 0 & 3\\ & 1 & 1 & 2\\ & & 1 & -1\\ & & & 1\\ \end{pmatrix}, \; B = \begin{pmatrix} -3 & -1 & 1 & -2\\ -3 & 0 & 0 & -3\\ 1 & 3 & 3 & -3\\ 2 & 3 & 2 & 0\\ \end{pmatrix}, \; C = A (AB)^{-1} $$
(a) Compute the third column of $C^{-1}$ without computing the whole inverse of any matrix. Think before plugging-and-chugging!
(b) Check your answer by explicitly computing $C^{-1}$ in Julia.
# fill these in:
A = ??????
B = ??????
C = A / (A*B) # computes A(AB)⁻¹
C^-1 # computes C⁻¹
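Once the ?????? are filled in with the matrices from the problem statement, you could also pull out just the third column to compare directly with your part (a) answer, e.g.:

(C^-1)[:, 3] # third column of C⁻¹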
(Based on Strang problems.)
Which of the following subsets of $\mathbb{R}^3$ are actually subspaces? (Recall that $[x_1,x_2,x_3]$, with commas, denotes a column vector.) If it is not a subspace, give a rule that it violates.
Nowadays, numerical linear algebra programs deal with small pivots by swapping rows, leading to the $PA = LU$ factorization. Prof. Edelman mentioned that, in the elder days, people would occasionally swap columns instead, leading to a factorization $AP = LU$ for some (different) permutation $P$ (and different $L$ and $U$ matrices).
Given a factorization $AP = LU$ for a nonsingular matrix $A$, outline how you would use it to solve $Ax=b$ for $x$ (analogous to how we outlined the use of $PA=LU$ in class).
(You can exploit the fact, explained in section 2.7 of the textbook, that you can invert permutations "for free": $P^{-1} = P^T$ for any permutation matrix $P$, where the transpose $P^T$ of a matrix is simply formed by swapping the rows of $P$ with the columns of $P$. We will spend more time on this later in 18.06.)
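As a quick numerical illustration of the $P^{-1} = P^T$ fact (this particular $P$ is just a made-up example, not part of the answer):

P = [0 1 0; 0 0 1; 1 0 0]  # an example 3×3 permutation matrix
P' * P                     # Pᵀ P gives the identity, so P⁻¹ = Pᵀ
P * [10, 20, 30]           # multiplying by P just re-orders the entries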
Suppose we have a $3 \times 3$ matrix $A$. We will transform this into a new matrix $B$ by doing operations on the rows or columns of $A$ as follows. For each part, explain how to compute $B$ as $B=EA$ or $B=AE$ (say which!) and give the matrix $E$.
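Before the individual parts, here is a generic Julia sketch (with made-up matrices, not tied to any particular part) of the row-versus-column principle at work:

A = [1 2 3; 4 5 6; 7 8 9]   # a throwaway 3×3 example
E = [1 0 0; -2 1 0; 0 0 1]  # elementary matrix: subtract 2×(row 1) from row 2
E * A                       # B = EA: E acts on the rows of A
A * E                       # B = AE: the same E now acts on the columns of A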