10.10 Introduction to Chemical Engineering

NMM Chapter 7: Linear Algebra

Overview

The material- and energy-balance equations are linear in the unknowns, i.e. they don’t involve any quadratic or transcendental terms. However, there may be many unknowns, so we will want to solve these linear systems of equations systematically, using the computer to help us. NMM Chapters 7 and 8 present the very convenient and general matrix-vector notation for systems of linear equations, along with a few simple theorems that tell us whether or not a given system of linear equations has a unique solution.

There is of course much more to linear algebra; to really learn it we strongly recommend the texts by MIT Prof. G. Strang, and the subject 18.06. In 10.10 we will try to rapidly survey a few of the most important and relevant results, which will help us to understand how the computer is solving the equations, why it doesn’t always work, and (if that happens) how to diagnose the problem.


Matrix-Vector Arithmetic

Section 7.1 reviews the rules for vector arithmetic; most of this should just be review for you. You may not have seen outer products (the end of Section 7.1.1), nor all the properties of vector norms presented in 7.1.2.

The vector-vector and matrix-vector inner products follow the simple distributive laws:

(V1 + V2)*(V3+V4) = V1*V3 + V1*V4 + V2*V3 + V2*V4

(A + B) * (V1 + V2) = A*V1 + A*V2 + B*V1 + B*V2

(The V’s are vectors; A and B are matrices.) However, matrix operations do not usually follow the commutative law (i.e. order matters!).
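The distributive laws above are easy to check numerically. Here is a minimal sketch in plain Python (in 10.10 you would use Matlab; the vectors here are made-up numbers just for illustration):

```python
# Verify (V1 + V2)*(V3 + V4) == V1*V3 + V1*V4 + V2*V3 + V2*V4
# using small hand-rolled vector operations.

def dot(u, v):
    """Inner product of two vectors."""
    return sum(ui * vi for ui, vi in zip(u, v))

def add(u, v):
    """Elementwise vector sum."""
    return [ui + vi for ui, vi in zip(u, v)]

V1, V2, V3, V4 = [1, 2], [3, -1], [0, 5], [2, 2]

lhs = dot(add(V1, V2), add(V3, V4))
rhs = dot(V1, V3) + dot(V1, V4) + dot(V2, V3) + dot(V2, V4)
print(lhs, rhs)  # 15 15 -- the two sides agree
```

The same check works for the matrix form (A + B)*(V1 + V2), since a matrix-vector product is just one inner product per row.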

Many of the problems we will solve in 10.10 can be written in this simple form:

A*x = b

where A is a matrix full of coefficients that express the conservation laws for our system, x is a vector of our unknowns, and b is a vector of numbers that describes our specific problem. Chapter 7 lays the groundwork we will need when we develop general computer methods for solving these kinds of equations in Chapter 8.
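As a preview of the methods Chapter 8 develops, here is a sketch of solving a small A*x = b by elimination, written in plain Python with made-up coefficients (in Matlab this is simply x = A\b):

```python
# Hypothetical 2x2 system: 2*x0 + x1 = 5  and  x0 + 3*x1 = 10.
# Solve A*x = b by simple Gaussian elimination and back-substitution.

A = [[2.0, 1.0],
     [1.0, 3.0]]
b = [5.0, 10.0]

# Eliminate x0 from the second equation.
m = A[1][0] / A[0][0]                                  # multiplier
A[1] = [a1 - m * a0 for a0, a1 in zip(A[0], A[1])]
b[1] = b[1] - m * b[0]

# Back-substitute.
x1 = b[1] / A[1][1]
x0 = (b[0] - A[0][1] * x1) / A[0][0]
print([x0, x1])  # [1.0, 3.0]
```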

Section 7.2 presents several different ways of looking at the simple matrix-vector product; all of them are useful (for various applications), and it is worth the effort it takes to get comfortable with each different view. In Section 7.2.2, the rules for matrix-matrix multiplication are presented; you need to know these cold. You probably have not seen matrix norms before; these are explained in 7.2.4. Note that matrices are primarily defined by what they “do to” vectors, e.g. the norm of matrix A is related to how different the matrix product A*x is from the original vector x.
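Two of the views of A*x from Section 7.2 can be sketched side by side. The “row view” treats each entry of y as an inner product; the “column view” treats y as a linear combination of the columns of A, weighted by the entries of x. A plain-Python sketch with illustrative numbers:

```python
# Two equivalent views of the matrix-vector product y = A*x.

A = [[1, 2],
     [3, 4],
     [5, 6]]
x = [10, 1]

# Row view: each entry of y is the inner product of a row of A with x.
y_row = [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Column view: y is a linear combination of the columns of A,
# weighted by the entries of x.
cols = list(zip(*A))                     # the columns of A
y_col = [x[0] * c0 + x[1] * c1 for c0, c1 in zip(cols[0], cols[1])]

print(y_row)           # [12, 34, 56]
print(y_row == y_col)  # True -- both views give the same vector
```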


Linear Dependency and Vector Subspaces

You’ll need to learn the Linear Algebra notation given in 7.3.1-7.3.4 in order to appreciate the concepts.  The key idea in Linear Algebra, described in Section 7.3, is that a vector W can either be linearly independent of a set of other vectors {V1,V2,V3…, Vn}, or it can be linearly dependent. If it is possible to write

W = c1 V1 + c2 V2 + c3 V3 + … + cn Vn

then W is linearly dependent on the V’s. If there is no set of c’s that makes this equation hold, then W is linearly independent of the V’s. If there is only one vector V1, then W is linearly independent unless W and V1 are collinear.
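For a concrete case, a dependence test in R2 amounts to solving a little 2x2 system for the c’s. A sketch in plain Python (made-up vectors; Cramer’s rule is used here only because the system is 2x2):

```python
# Is W linearly dependent on V1 and V2, i.e. does W = c1*V1 + c2*V2 have a
# solution?  In R^2 this is a 2x2 linear system for (c1, c2).

V1, V2 = [1, 2], [3, 1]
W = [5, 5]

det = V1[0] * V2[1] - V2[0] * V1[1]   # nonzero: V1 and V2 are independent
c1 = (W[0] * V2[1] - V2[0] * W[1]) / det
c2 = (V1[0] * W[1] - W[0] * V1[1]) / det

print(c1, c2)  # 2.0 1.0
# Check: c1*V1 + c2*V2 reproduces W, so W is linearly dependent on V1, V2.
print([c1 * v1 + c2 * v2 for v1, v2 in zip(V1, V2)])  # [5.0, 5.0]
```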

If there are two vectors V1 and V2, W is linearly dependent if it lies in the plane defined by V1 and V2; if it points out of that plane it is linearly independent. In this case, the plane defined by V1 and V2 is a “subspace” of the full vector space, i.e. it defines a subset of the set of all possible vectors. All the vectors that are linearly dependent on V1 and V2 will lie in the plane; we say they all “lie in the subspace defined by V1 and V2”.

Since a plane is a two-dimensional object, it takes 2 vectors to define it; in linear algebra lingo we say the 2 vectors “span” the subspace. Note that in order for these two vectors to define a plane, they must not be collinear, i.e. neither one can be linearly dependent on the other. Generalizing to any number of dimensions: “n linearly independent vectors span an n-dimensional subspace (notated Rn)”. If each vector has m components, then we are working with an m-dimensional vector space (notated Rm); if n<m, Rn is a subset of Rm. A set of m linearly independent vectors spans the full vector space Rm; if you have more than m vectors, each with m components, at least one of them must be linearly dependent on the others.


Subspaces Associated with a Matrix: 1. Column Space and Rank

You can treat each of the columns (or rows) of a matrix as a vector (Section 7.3.3), and then ask whether or not those vectors are linearly independent, and what subspaces they define. For a matrix with m rows and n columns, the n column vectors define a subspace of Rm. Every vector y defined by y=A*x is a linear combination of these column vectors, and so must lie in this subspace; if someone claims b=A*x for some vector b that does not lie in this subspace, they are lying (this will be crucial for us).
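A small geometric sketch of this idea: for a 3x2 matrix, the two columns span a plane in R3, and A*x = b can only hold if b lies in that plane. In plain Python (illustrative numbers; the cross product gives the plane’s normal, which only works in this 3-D special case):

```python
# The columns of a 3x2 matrix A span a plane in R^3.  A vector b is in the
# column space exactly when b is perpendicular to the plane's normal.

def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

A = [[1, 0],
     [0, 1],
     [1, 1]]
col1 = [row[0] for row in A]
col2 = [row[1] for row in A]
n = cross(col1, col2)          # normal to the column-space plane

b_good = [2, 3, 5]             # equals 2*col1 + 3*col2: in the column space
b_bad  = [2, 3, 6]             # not in the plane: A*x = b_bad has no solution

print(dot(b_good, n) == 0)     # True
print(dot(b_bad, n) == 0)      # False
```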

The number of linearly independent column vectors is called the rank of a matrix, section 7.3.4. It turns out (though I won’t prove it) that this is always equal to the number of linearly independent row vectors as well. The rank of any matrix can be conveniently computed using the Matlab function “rank”. If some of the column vectors are linearly dependent on the others, the matrix is called “rank deficient”; in chemical engineering, problems with rank-deficient matrices are frequently associated with multiple or poorly determined solutions, or with errors in setting up the problem. (Unfortunately, hapless scientists and engineers have frequently used these crummy solutions without recognizing the problem, with disastrous results. Don’t let it happen to you!)
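Matlab’s “rank” is the practical tool, but it may help to see the idea behind it. A sketch in plain Python: reduce the matrix by Gaussian elimination and count the pivots (a real implementation, like Matlab’s, uses a more robust method based on the singular value decomposition):

```python
# Toy rank computation: eliminate, then count pivot rows.

def rank(A, tol=1e-10):
    A = [row[:] for row in A]          # work on a copy
    m, n = len(A), len(A[0])
    r, col = 0, 0
    while r < m and col < n:
        # choose the largest available pivot in this column
        pivot = max(range(r, m), key=lambda i: abs(A[i][col]))
        if abs(A[pivot][col]) < tol:
            col += 1                    # no pivot in this column
            continue
        A[r], A[pivot] = A[pivot], A[r]
        for i in range(r + 1, m):
            f = A[i][col] / A[r][col]
            A[i] = [aij - f * arj for aij, arj in zip(A[i], A[r])]
        r += 1
        col += 1
    return r

full      = [[1, 0], [0, 1]]
deficient = [[1, 2], [2, 4]]            # second row = 2 * first row
print(rank(full), rank(deficient))      # 2 1 -- the second matrix is rank deficient
```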


Subspaces Associated with a Matrix: 2. Null Space

You can define a different subspace called the null space (NMM 7.3.3) by finding all the vectors x that satisfy A*x = 0. (Actually, this could be pretty difficult to do, but Matlab has a function “null” which does it for you.) Suppose this null space includes the vector W. Then if

y=A*V1

we can be sure that

A*(V1+W) = A*V1 + A*W = A*V1 + 0 = A*V1 = y

So if we define V2 = V1 + W, we know that both of these equations are true:

y=A*V2
y=A*V1

This sort of thing leads to non-unique solutions to some chemical engineering problems.
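The V1/V2 argument above is easy to demonstrate numerically. A plain-Python sketch (the matrix and vectors are made up for illustration; Matlab’s “null” would find W for you):

```python
# If W is in the null space of A (A*W = 0), then V1 and V1 + W are two
# different vectors that give the same product y = A*x.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 1, 0],
     [0, 0, 1]]

W = [1, -1, 0]                          # A*W = [0, 0]: W is in the null space
assert matvec(A, W) == [0, 0]

V1 = [2, 3, 4]
V2 = [v + w for v, w in zip(V1, W)]     # V2 = V1 + W

print(matvec(A, V1))  # [5, 4]
print(matvec(A, V2))  # [5, 4] -- same y from two different vectors
```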


Other Material in NMM Chapter 7

In 10.10 we will not use determinants (NMM section 7.3.5). The rest of the chapter is a bunch of linear algebra definitions that may be helpful, but are not essential. If you are pressed for time, go right on to Chapter 8.

Last modified: September 19, 2002