There is of course much more to linear algebra; to really learn it we strongly recommend the texts by MIT Prof. G. Strang and the subject 18.06. In 10.10 we will rapidly survey a few of the most important and relevant results, which will help us to understand how the computer solves the equations, why it doesn’t always work, and (if that happens) how to diagnose the problem.
The vector-vector and matrix-vector inner products follow the simple distributive laws:
(V1 + V2)*(V3+V4) = V1*V3 + V1*V4 + V2*V3 + V2*V4
(A + B) * (V1 + V2) = A*V1 + A*V2 + B*V1 + B*V2
(The V’s are vectors; A and B are matrices.) However, matrix multiplication does not follow the commutative law: in general A*B is not equal to B*A, i.e. order matters!
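These two facts are easy to check numerically. Here is a small sketch in Python with NumPy (the course itself uses Matlab; numpy is the closest Python analog, and the particular matrices and vectors below are made-up examples):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
v1 = np.array([1.0, -1.0])
v2 = np.array([2.0, 5.0])

# Distributive law: (A + B)*(v1 + v2) = A*v1 + A*v2 + B*v1 + B*v2
lhs = (A + B) @ (v1 + v2)
rhs = A @ v1 + A @ v2 + B @ v1 + B @ v2
print(np.allclose(lhs, rhs))      # True

# But matrix multiplication is NOT commutative: A*B != B*A in general
print(np.allclose(A @ B, B @ A))  # False
```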
Many of the problems we will solve in 10.10 can be written in this simple form:
A*x = b
where A is a matrix full of coefficients that give the conservation laws for our system, x is a list of our unknowns, and b is a list of numbers that describes our specific problem. Chapter 7 lays the groundwork we will need when we develop general computer methods for solving these kinds of equations in Chapter 8.
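To make this concrete, here is a minimal sketch of solving A*x = b numerically in Python with NumPy (in Matlab the equivalent is x = A\b); the 2x2 system is a made-up example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])    # coefficient matrix (e.g. conservation laws)
b = np.array([3.0, 5.0])      # numbers describing our specific problem

x = np.linalg.solve(A, b)     # the unknowns
print(x)                      # -> [0.8 1.4]
print(np.allclose(A @ x, b))  # True: A*x reproduces b
```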
Section 7.2 presents several different ways of looking at the simple matrix-vector product; all of them are useful for various applications, so it is worth the effort it takes to get comfortable with each different view. Later, in section 7.2.2, the rules for matrix-matrix multiplication are presented; you need to know these cold. You probably have not seen matrix norms before; these are explained in 7.2.4. Note that matrices are primarily defined by what they “do to” vectors, e.g. the norm of matrix A is related to how different the matrix product A*x is from the original vector x.
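As a sketch of the norm idea: the (induced 2-)norm of A is the largest factor by which A can stretch any vector, so ||A*x|| <= ||A||*||x|| for every x. The matrix and vector below are made-up examples:

```python
import numpy as np

A = np.array([[1.0, 2.0], [0.0, 3.0]])
x = np.array([1.0, 1.0])

norm_A = np.linalg.norm(A, 2)  # induced 2-norm: largest stretch factor
# ||A*x|| can never exceed ||A|| * ||x||
print(np.linalg.norm(A @ x) <= norm_A * np.linalg.norm(x))  # True
```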
W = c1 V1 + c2 V2 + c3 V3 + … + cn Vn
then W is linearly dependent on the V’s. If there is no set of c’s that makes this equation hold, then W is linearly independent of the V’s. If there is only one vector V1, then W is linearly independent unless W and V1 are collinear.
If there are two vectors V1 and V2, W is linearly dependent if it lies in the plane defined by V1 and V2; if it points out of that plane it is linearly independent. In this case, the plane defined by V1 and V2 is a “subspace” of the full vector space, i.e. it defines a subset of the set of all possible vectors. All the vectors that are linearly dependent on V1 and V2 will lie in the plane; we say they all “lie in the subspace defined by V1 and V2”.
Since a plane is a two-dimensional object, it takes 2 vectors to define it; in linear algebra lingo we say the 2 vectors “span” the subspace. Note that in order for these two vectors to define a plane, they must not be collinear, i.e. neither one can be linearly dependent on the other. Generalizing to any number of dimensions: “n linearly independent vectors span an n-dimensional subspace (notated Rn)”. If each vector has m components, then we are working with an m-dimensional vector space (notated Rm); if n<m, Rn is a subset of Rm. A set of m linearly independent vectors spans the full vector space Rm; if you have more than m vectors, each with m components, at least one of them must be linearly dependent on the others.
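One way to test whether a vector lies in the subspace spanned by others is to stack the vectors as columns of a matrix and check the rank (numpy.linalg.matrix_rank plays the role of Matlab’s rank here); the vectors below are made-up examples:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
w_in_plane  = 2*v1 - 3*v2                # linearly dependent on v1 and v2
w_out_plane = np.array([0.0, 0.0, 1.0])  # points out of the v1-v2 plane

# w_in_plane adds nothing: still only 2 independent directions
print(np.linalg.matrix_rank(np.column_stack([v1, v2, w_in_plane])))   # 2
# w_out_plane is independent: the 3 vectors span all of R3
print(np.linalg.matrix_rank(np.column_stack([v1, v2, w_out_plane])))  # 3
```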
The number of linearly independent column vectors is called the rank of a matrix, section 7.3.4. It turns out (though I won’t prove it here) that this is always equal to the number of linearly independent row vectors as well. The rank of any matrix can be conveniently computed using the Matlab function “rank”. If some of the column vectors are linearly dependent on the others, the matrix is called “rank deficient”; in chemical engineering, problems with rank-deficient matrices are frequently associated with multiple or poorly determined solutions, or with errors in setting up the problem. (Unfortunately, hapless scientists and engineers have frequently used these crummy solutions without recognizing the problem, with disastrous results. Don’t let it happen to you!)
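A quick sketch of both claims (column rank equals row rank, and rank deficiency), again in Python/NumPy rather than Matlab; the 3x3 example below is made up, with its third column chosen to be the sum of the first two:

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 1.0, 3.0]])  # third column = first column + second column

print(np.linalg.matrix_rank(A))    # 2: rank deficient (rank < 3)
print(np.linalg.matrix_rank(A.T))  # 2: the row count of independent vectors matches
```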
If W is a vector such that A*W = 0, and

y = A*V1

then we can be sure that
A*(V1+W) = A*V1 + A*W = A*V1 + 0 = A*V1 = y
So if we define V2 = V1 + W, we know that both of these equations are true:
y=A*V2
y=A*V1
This sort of thing leads to non-unique solutions to some chemical engineering problems.
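The situation above is easy to reproduce numerically. The sketch below uses a made-up rank-deficient matrix A and a vector W with A*W = 0, and shows that V1 and V1 + W give the same product y:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, 2.0]])   # rank deficient: second row = 2 * first row
W = np.array([1.0, -1.0])    # A @ W = 0, so W is in the "null space" of A
v1 = np.array([3.0, 4.0])
v2 = v1 + W

print(np.allclose(A @ W, 0))        # True
print(np.allclose(A @ v1, A @ v2))  # True: both give the same y
```

So if A*x = b has a solution here, it has infinitely many: adding any multiple of W to a solution gives another one.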
Last modified: September 19, 2002