# Linear Algebra: Linear Independence

The idea of *redundancy* that we discussed in the introduction can now be phrased in a mathematically precise way: a list of vectors is **linearly dependent** if one of the vectors can be expressed as a linear combination of the others.

A list of vectors which is not linearly dependent is said to be **linearly independent**. In other words, a list of vectors is linearly independent if none of the vectors in the list can be written as a linear combination of the others.

**Example**

The list of vectors $\left(\begin{bmatrix} 1 \\ 2 \end{bmatrix}, \begin{bmatrix} 2 \\ 4 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix}\right)$ is not linearly independent, since $\begin{bmatrix} 2 \\ 4 \end{bmatrix} = 2\begin{bmatrix} 1 \\ 2 \end{bmatrix}$.

The list of vectors $\left(\begin{bmatrix} 1 \\ 2 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix}\right)$ is linearly independent, since neither vector is a scalar multiple of the other.
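Checks like these can also be carried out numerically: a list of vectors is linearly independent exactly when the matrix with those vectors as columns has rank equal to the number of vectors. Here is a small sketch using NumPy (the helper name `is_independent` is ours, not a standard API):

```python
import numpy as np

def is_independent(vectors):
    # Stack the vectors as columns; the list is linearly independent
    # exactly when the rank of this matrix equals the number of vectors.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

# A dependent pair (the second vector is twice the first) and an independent pair:
print(is_independent([np.array([1, 2]), np.array([2, 4])]))  # False
print(is_independent([np.array([1, 2]), np.array([0, 1])]))  # True
```

Note that rank computations on floating-point data involve a tolerance, so this test is exact for small integer examples but only approximate for noisy data.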

**Exercise**

Explain geometrically why a list of three vectors in $\mathbb{R}^2$ must be linearly dependent.

*Solution.* If any vector in the list is zero, then the list is linearly dependent, since the zero vector can be written as the sum of zero times each of the other vectors. So we may assume that the vectors are all nonzero.

If the first two vectors lie along the same line, then the list is linearly dependent, since one of them is a scalar multiple of the other. Otherwise, the first two vectors span all of $\mathbb{R}^2$, so the third vector is a linear combination of the first two, and the list is again linearly dependent.
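The geometric argument can be corroborated numerically: three vectors in $\mathbb{R}^2$, stacked as the columns of a $2 \times 3$ matrix, give a matrix of rank at most $2$, so the columns can never be independent. A quick sketch (assuming NumPy is available):

```python
import numpy as np

# Three vectors in R^2 form the columns of a 2x3 matrix, whose rank is
# at most 2; a list of 3 vectors needs rank 3 to be independent.
rng = np.random.default_rng(0)
for _ in range(1000):
    A = rng.standard_normal((2, 3))  # three random vectors in R^2 as columns
    assert np.linalg.matrix_rank(A) < 3  # never 3, so never independent
print("all 1000 random triples were linearly dependent")
```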

## Linear dependence lemma

The definition of linear independence makes it seem as though there's quite a lot to check: if there *is* a vector in the list which can be written as a linear combination of some of the other ones, which one is it, and which other vectors are involved? In fact, the symmetry involved in linear relationships implies that we can put the vectors in any order we want and work through the list, checking whether each vector is in the span of the vectors *earlier* in the list:

**Theorem** (Linear dependence lemma)

A list of vectors is linearly independent if and only if there is no vector in the list which is in the span of the *preceding* vectors.

For example, to check that a list $(\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3)$ is linearly independent, we check that $\mathbf{v}_1$ is not the zero vector (the span of the empty list is $\{\boldsymbol{0}\}$), that $\mathbf{v}_2$ is not in the span of $(\mathbf{v}_1)$, and that $\mathbf{v}_3$ is not in the span of $(\mathbf{v}_1, \mathbf{v}_2)$.
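The lemma translates directly into a sequential procedure. The sketch below (assuming NumPy; the helper names are ours) checks each vector against the span of the preceding ones by comparing matrix ranks:

```python
import numpy as np

def in_span(v, vectors):
    # v lies in the span of `vectors` iff appending v as a column
    # does not increase the rank. The span of the empty list is {0}.
    if len(vectors) == 0:
        return not np.any(v)
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(np.column_stack(vectors + [v])) == np.linalg.matrix_rank(A)

def independent_by_lemma(vectors):
    # Linear dependence lemma: the list is independent iff no vector
    # is in the span of the vectors earlier in the list.
    return not any(in_span(v, vectors[:k]) for k, v in enumerate(vectors))

print(independent_by_lemma([np.array([1, 0]), np.array([0, 1])]))                    # True
print(independent_by_lemma([np.array([1, 0]), np.array([0, 1]), np.array([2, 3])]))  # False
```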

Let's walk through a proof of this theorem.

*Proof.* If a list is linearly independent, then no vector in the list can be represented as a linear combination of others (by definition), so no vector can be in the span of the previous ones. This shows that linear independence implies the condition that no vector lies in the span of the preceding ones.

For the other direction, suppose that the list $(\mathbf{v}_1, \ldots, \mathbf{v}_n)$ is linearly dependent, and suppose that the vector which is a linear combination of the others is the last one, $\mathbf{v}_n$. Then

$$\mathbf{v}_n = c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_{n-1}\mathbf{v}_{n-1}$$

for some weights $c_1, \ldots, c_{n-1}$, which exhibits $\mathbf{v}_n$ as a vector which is in the span of $(\mathbf{v}_1, \ldots, \mathbf{v}_{n-1})$.

So the list does not satisfy the condition of having no vector in the span of the preceding ones. Similar reasoning would apply if we had chosen any vector other than $\mathbf{v}_n$: writing $\mathbf{v}_k$ as a linear combination of the others and solving for the vector with the largest index that carries a nonzero weight shows that that vector is in the span of the vectors preceding it.

From logic, we know that "A implies B" is equivalent to its contrapositive "(not B) implies (not A)". Since we have shown that linear dependence implies that some vector is in the span of the preceding ones, it follows that if no vector is in the span of the preceding ones, then the list is linearly independent. This completes the proof.

**Exercise**

Let's say that a linear combination of a list of vectors is **trivial** if all of the weights are zero.

Show that a list of vectors is linearly independent if and only if no nontrivial linear combination of the vectors is equal to the zero vector.

*Solution.* Suppose that a list of vectors $(\mathbf{v}_1, \ldots, \mathbf{v}_n)$ is linearly dependent, so that one of the vectors, say $\mathbf{v}_1$, is equal to a linear combination of the others: $\mathbf{v}_1 = c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n$.

Subtracting $\mathbf{v}_1$ from both sides gives $\boldsymbol{0} = -\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n$, a nontrivial linear combination (the weight on $\mathbf{v}_1$ is $-1$) which is equal to the zero vector. Therefore, if every nontrivial linear combination of the vectors is nonzero, the list must be linearly independent.

Conversely, suppose that there is a nontrivial linear combination of the vectors which is equal to the zero vector: $c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n = \boldsymbol{0}$.

At least one of the weights must be nonzero, so we can solve this equation for at least one of the vectors and thereby represent it as a linear combination of the others. Therefore, the list is linearly dependent.
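This characterization is also how an explicit dependence is found in practice: a nontrivial combination equal to the zero vector is exactly a null vector of the matrix whose columns are the listed vectors. A sketch using the singular value decomposition (assuming NumPy; `zero_combination` is our own name, not a standard API):

```python
import numpy as np

def zero_combination(vectors, tol=1e-10):
    # Return weights (c1, ..., cn), not all zero, with c1*v1 + ... + cn*vn = 0,
    # or None if the list is linearly independent.
    A = np.column_stack(vectors)
    _, s, vt = np.linalg.svd(A)
    # Pad the singular values: with more columns than rows, the extra
    # directions are "implicit" zero singular values.
    s = np.concatenate([s, np.zeros(A.shape[1] - len(s))])
    if s[-1] < tol:
        return vt[-1]  # right-singular vector for a zero singular value: a null vector
    return None

w = zero_combination([np.array([1.0, 2.0]), np.array([2.0, 4.0])])
# w is a nonzero weight vector with w[0]*[1,2] + w[1]*[2,4] = [0,0]
```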