The method of least squares explained above makes at least four assumptions, the adherence to which
may be checked a posteriori. These assumptions concern the error
ei = (b + a xi) - yi, i.e., the
difference between the best-fit prediction and the observation. The assumptions are that these errors
(a) are mutually independent, (b) have zero mean, (c) have constant variance across all values of the
statistical variables, and (d) are normally distributed. Violation of these assumptions can be
identified in many cases simply by examining a plot of ei vs. xi. Note that the first of Eq. 4
guarantees that the mean value of ei is 0 to within the precision of the computation (to see this,
rewrite that equation as Σi (b + a xi - yi) = 0). However, this is not necessarily true of non-linear
regression analysis.
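The zero-mean property of the residuals, and the residuals-vs.-xi diagnostic, can be checked numerically. The following is a minimal sketch in Python with NumPy; the synthetic data and the use of numpy.polyfit for the straight-line fit are illustrative choices, not part of the text above.

```python
import numpy as np

# Synthetic data for illustration: y = 2 + 0.5 x plus normally distributed noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=x.size)

# Linear least-squares fit; polyfit returns coefficients highest degree first,
# so the slope a comes before the intercept b
a, b = np.polyfit(x, y, deg=1)

# Residuals as defined in the text: best-fit prediction minus observation
e = (b + a * x) - y

# The first normal equation forces the residuals to sum to zero (to machine
# precision), so their mean is ~0; their spread estimates the error variance
print("mean residual:", e.mean())
print("residual variance:", e.var())
```

Plotting e against x (e.g., with matplotlib) would then reveal trends or changing spread that signal violations of assumptions (a) and (c).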