Linear Algebra: Linear Independence
The idea of redundancy that we discussed in the introduction can now be phrased in a mathematically precise way: a list of vectors $v_1, v_2, \ldots, v_n$ is said to be linearly dependent if one of the vectors in the list can be written as a linear combination of the others.
A list of vectors which is not linearly dependent is said to be linearly independent. In other words, a list of vectors is linearly independent if no vector in the list can be written as a linear combination of the other vectors in the list.
A list of vectors $u, v, w$ satisfying $w = u + v$ is not linearly independent, since $w$ is a linear combination of $u$ and $v$.
A list of vectors $u, v, w$ is linearly independent if any linear combination of $u$ and $v$ is unequal to $w$, and similarly for the pair $u, w$ and the pair $v, w$.
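The definition can be tested computationally: a list of vectors is linearly independent exactly when the matrix whose rows are the vectors has rank equal to the number of vectors. The sketch below (the function name and example vectors are illustrative assumptions, not from the original text) implements this test with plain Gaussian elimination:

```python
def is_independent(vectors, tol=1e-9):
    """Return True if the given list of vectors (lists of numbers) is
    linearly independent, by checking that the matrix whose rows are
    the vectors has rank equal to the number of vectors."""
    m = [[float(x) for x in row] for row in vectors]  # work on a copy
    rows, cols = len(m), len(m[0])
    rank = 0
    pivot_col = 0
    for r in range(rows):
        # Find the next column containing a usable pivot at or below row r.
        while pivot_col < cols:
            best = max(range(r, rows), key=lambda i: abs(m[i][pivot_col]))
            if abs(m[best][pivot_col]) > tol:
                m[r], m[best] = m[best], m[r]  # move pivot row into place
                break
            pivot_col += 1
        else:
            break  # no pivots remain; the remaining rows reduced to zero
        # Eliminate the pivot column from the rows below.
        for i in range(r + 1, rows):
            f = m[i][pivot_col] / m[r][pivot_col]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        rank += 1
        pivot_col += 1
    return rank == rows

# The third vector is the sum of the first two, so the list is dependent:
print(is_independent([[1, 0], [0, 1], [1, 1]]))               # False
# The standard basis vectors of R^3 are independent:
print(is_independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))      # True
```

In exact arithmetic the tolerance would be unnecessary; with floating-point entries it guards against treating roundoff residue as a pivot.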
Explain geometrically why a list of three vectors in $\mathbb{R}^2$ is necessarily linearly dependent.
Solution. If any vector in the list is the zero vector, then the list is linearly dependent, since the zero vector can be written as the sum of zero times each of the other vectors. So we may assume that all three vectors are nonzero.
If the first two vectors point along the same line, then the list is linearly dependent, since each of those two vectors is a scalar multiple of the other. If they do not, then the first two vectors span the whole plane: any third vector can be reached by suitably scaling the first two and forming the parallelogram they determine. So the third vector is a linear combination of the first two, and the list is again linearly dependent.
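The geometric argument can be mirrored numerically: when the first two vectors are not parallel, Cramer's rule on a 2-by-2 system produces the weights expressing any third vector in terms of them. The function name and example vectors below are illustrative assumptions:

```python
def express_in_plane(u, v, w):
    """Given vectors u, v, w in R^2 with u and v not parallel, return
    weights (a, b) such that w = a*u + b*v, via Cramer's rule."""
    det = u[0] * v[1] - u[1] * v[0]
    if det == 0:
        raise ValueError("u and v are parallel")
    a = (w[0] * v[1] - w[1] * v[0]) / det
    b = (u[0] * w[1] - u[1] * w[0]) / det
    return a, b

# Any third vector in R^2 is a combination of two non-parallel vectors,
# which is why a list of three vectors in R^2 is always dependent:
u, v, w = [1, 0], [1, 1], [3, 5]
a, b = express_in_plane(u, v, w)
print(a, b)  # a*u + b*v equals w
```

Here the determinant test plays the role of the geometric "same direction" check: it is zero exactly when $u$ and $v$ are parallel.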
Linear dependence lemma
The definition of linear independence makes it seem as though there's quite a lot to check: if there is a vector in the list which can be written as a linear combination of some of the other ones, which one is it, and which other vectors are involved? In fact, the symmetry involved in linear relationships implies that we can put the vectors in any order we want and work through the list, checking whether each vector is in the span of the vectors earlier in the list:
Theorem (Linear dependence lemma)
A list of vectors is linearly independent if and only if there is no vector in the list which is in the span of the preceding vectors.
For example, to check that a list $v_1, v_2, v_3$ is linearly independent, it suffices to check that $v_1 \neq \mathbf{0}$, that $v_2$ is not a scalar multiple of $v_1$, and that $v_3$ is not in the span of $\{v_1, v_2\}$.
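The lemma's sequential check can be sketched in code: walk the list, and for each vector test whether appending it to the preceding vectors raises the rank (if it does not, the vector lies in their span). The helper names below are hypothetical, not from the original text:

```python
def rank(vectors, tol=1e-9):
    """Rank of a list of row vectors, computed by Gaussian elimination."""
    m = [[float(x) for x in row] for row in vectors]
    r = 0
    for col in range(len(m[0])):
        # Find a usable pivot in this column at or below row r.
        piv = next((i for i in range(r, len(m)) if abs(m[i][col]) > tol), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][col] / m[r][col]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent_by_lemma(vectors, tol=1e-9):
    """Check linear independence using the linear dependence lemma:
    no vector may lie in the span of the vectors preceding it."""
    for k, v in enumerate(vectors):
        preceding = vectors[:k]
        if not preceding:
            # The span of the empty list is {0}, so v must be nonzero.
            in_span = all(abs(x) <= tol for x in v)
        else:
            # v is in span(preceding) iff appending it leaves the rank unchanged.
            in_span = rank(preceding + [v], tol) == rank(preceding, tol)
        if in_span:
            return False
    return True

print(independent_by_lemma([[1, 0, 0], [1, 1, 0], [1, 1, 1]]))   # True
print(independent_by_lemma([[1, 0, 0], [2, 0, 0], [0, 1, 0]]))   # False
```

The loop stops at the first vector that fails the span test, exactly as the lemma suggests working through the list in order.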
Let's walk through a proof of this theorem.
Proof. If a list is linearly independent, then no vector in the list can be represented as a linear combination of others (by definition), so no vector can be in the span of the previous ones. This shows that linear independence implies the condition that no vector in the list is in the span of the preceding vectors.
For the other direction, suppose that the list $v_1, v_2, \ldots, v_n$ is linearly dependent. Then one of the vectors can be written as a linear combination of the others. For example, if $v_1$ can be written as a linear combination of the others, then

$v_1 = c_2 v_2 + c_3 v_3 + \cdots + c_n v_n$

for some weights $c_2, c_3, \ldots, c_n$. If all of the weights are zero, then $v_1$ is the zero vector and is therefore in the span of the empty list of vectors which precede it. If at least one weight is nonzero, then let's define $k$ so that $c_k$ is the last nonzero weight. Then we can solve for $v_k$:

$v_k = \frac{1}{c_k}\left(v_1 - c_2 v_2 - \cdots - c_{k-1} v_{k-1}\right),$

which is in the span of $v_1, v_2, \ldots, v_{k-1}$.
So the list does not satisfy the condition of having no vector in the span of the preceding ones. Similar reasoning would apply if we had chosen any vector other than $v_1$ as the one which can be written as a linear combination of the others. Therefore, we conclude that linear dependence does imply failure to satisfy the given condition.
From logic, we know that "A implies B" is equivalent to its contrapositive "not B implies not A". Applying this to the implication we just established, we conclude that if no vector in the list is in the span of the preceding vectors, then the list is linearly independent. This completes the proof.
Let's say that a linear combination of a list of vectors is trivial if all of the weights are zero.
Show that a list of vectors is linearly independent if and only if every nontrivial linear combination of the vectors is not equal to the zero vector.
Solution. Suppose that a list of vectors $v_1, v_2, \ldots, v_n$ is not linearly independent. Then one of the vectors, say the first one, is equal to some linear combination of the others:

$v_1 = c_2 v_2 + c_3 v_3 + \cdots + c_n v_n.$
Subtracting $v_1$ from both sides of this equation, we obtain a nontrivial linear combination of the $v_i$'s which is equal to the zero vector:

$-v_1 + c_2 v_2 + c_3 v_3 + \cdots + c_n v_n = \mathbf{0}.$

This combination is nontrivial because the weight on $v_1$ is $-1$.
Conversely, suppose that there is a nontrivial linear combination of the $v_i$'s which is equal to the zero vector:

$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = \mathbf{0}.$
At least one of the weights must be nonzero, so we can solve this equation for at least one of the vectors and thereby represent it as a linear combination of the others. Therefore, the list is not linearly independent.
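This last step, solving the zero combination for a vector whose weight is nonzero, can be made concrete. The sketch below (function name and example are illustrative assumptions) recovers $v_k$ from the weights of a nontrivial zero combination:

```python
def solve_for_vector(weights, vectors, k):
    """Given weights c_1, ..., c_n with c_1*v_1 + ... + c_n*v_n = 0 and
    c_k nonzero, solve for v_k:
        v_k = -(1/c_k) * (sum of c_i * v_i over all i != k)."""
    ck = weights[k]
    if ck == 0:
        raise ValueError("the weight at index k must be nonzero")
    dim = len(vectors[0])
    result = [0.0] * dim
    for i, (c, v) in enumerate(zip(weights, vectors)):
        if i == k:
            continue
        for j in range(dim):
            result[j] -= c * v[j] / ck
    return result

# Hypothetical example: v3 = v1 + v2, so 1*v1 + 1*v2 + (-1)*v3 = 0.
v1, v2, v3 = [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]
print(solve_for_vector([1, 1, -1], [v1, v2, v3], 2))  # recovers v3
```

This mirrors the algebra in the solution: dividing by the nonzero weight is exactly what makes the vector expressible in terms of the others.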