# Vector spaces: span, linear independence and basis

References: edX online course MIT 8.05.1x Week 3.

Sheldon Axler (2015), Linear Algebra Done Right, 3rd edition, Springer. Chapter 2.

Here, we investigate the idea of the span of a list of vectors and see how this leads to the idea of linear independence of a set of vectors. I’ll summarize the main definitions and results here for future use; a more complete explanation, together with some examples, is given in Axler’s book, Chapter 2.

## Span of a list of vectors

A list of vectors is a finite, ordered collection of vectors from a vector space; unlike a set, a list may contain repeated entries. The set of all linear combinations of the vectors ${\left(v_{1},\ldots,v_{m}\right)}$ in a list is called the span of that list. Since a general linear combination has the form

$\displaystyle v=\sum_{i=1}^{m}a_{i}v_{i} \ \ \ \ \ (1)$

where ${a_{i}\in\mathbb{F}}$ (recall that the field ${\mathbb{F}}$ is always taken to be either the real numbers ${\mathbb{R}}$ or the complex numbers ${\mathbb{C}}$), the span of a list itself forms a vector space which is a subspace of the original vector space. One result we can show is
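As a quick concrete check of (1), a linear combination can be computed componentwise. A minimal Python sketch (the vectors and scalars here are illustrative choices, not taken from the text):

```python
def linear_combination(scalars, vectors):
    """Return sum of a_i * v_i, with vectors given as tuples of equal length."""
    dim = len(vectors[0])
    return tuple(sum(a * v[k] for a, v in zip(scalars, vectors))
                 for k in range(dim))

# Illustrative vectors in R^3 and scalars in R
v1 = (1.0, 0.0, 2.0)
v2 = (0.0, 1.0, -1.0)
v = linear_combination([3.0, 2.0], [v1, v2])
print(v)  # (3.0, 2.0, 4.0)
```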

Theorem 1 The span of a list of vectors in a vector space ${V}$ is the smallest subspace of ${V}$ containing all the vectors in the list.

Proof: Let the list be ${L\equiv\left(v_{1},\ldots,v_{m}\right)}$. Then ${S\equiv\mbox{span}\left(v_{1},\ldots,v_{m}\right)}$ is a subspace: it contains the zero vector (set all the ${a_{i}}$s to zero in (1)), and since it contains all linear combinations of the list, it is closed under addition and scalar multiplication.

The span ${S}$ contains every ${v_{j}\in L}$ (just set ${a_{i}=\delta_{ij}}$ in (1)). Now if we look at a subspace of ${V}$ that contains all the ${v_{i}}$s, it must also contain every vector in the span ${S}$, since a subspace must be closed under addition and scalar multiplication. Thus ${S}$ is the smallest subspace of ${V}$ that contains all the vectors in ${L}$. $\Box$

If ${S\equiv\mbox{span}\left(v_{1},\ldots,v_{m}\right)=V}$, that is, the span of a list is the same as the original vector space, then we say that ${\left(v_{1},\ldots,v_{m}\right)}$ spans ${V}$. This leads to the definition that a vector space is called finite-dimensional if it is spanned by some list of vectors. (Remember that all lists are finite in length!) A vector space that is not finite-dimensional is called (not surprisingly) infinite-dimensional.
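For concrete vectors in ${\mathbb{R}^{n}}$, whether a list spans the whole space can be checked by row reduction: the list spans exactly when the number of pivots equals ${n}$. A sketch using exact rational arithmetic (the example lists are illustrative choices):

```python
from fractions import Fraction

def rank(rows):
    """Gaussian elimination in exact rational arithmetic; returns the pivot count."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = 0
    for col in range(len(m[0])):
        # Find a row at or below the current pivot row with a nonzero entry here
        r = next((i for i in range(pivots, len(m)) if m[i][col] != 0), None)
        if r is None:
            continue
        m[pivots], m[r] = m[r], m[pivots]
        for i in range(len(m)):
            if i != pivots and m[i][col] != 0:
                f = m[i][col] / m[pivots][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[pivots])]
        pivots += 1
        if pivots == len(m):
            break
    return pivots

# A list spans R^n exactly when its rank equals n
print(rank([(1, 0, 0), (0, 1, 0), (1, 1, 1)]))  # 3: this list spans R^3
print(rank([(1, 0, 0), (2, 0, 0), (0, 1, 0)]))  # 2: this list does not span R^3
```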

## Linear independence

Suppose ${\left(v_{1},\ldots,v_{m}\right)}$ is a list of vectors in ${V}$ and ${v}$ is a vector such that ${v\in\mbox{span}\left(v_{1},\ldots,v_{m}\right)}$. This means that ${v}$ is a linear combination of ${\left(v_{1},\ldots,v_{m}\right)}$, so that (1) holds. However, using only the definitions above, there is no guarantee that there is only one choice of scalars ${a_{i}}$ that satisfies (1). We might also have, for example,

$\displaystyle v=\sum_{i=1}^{m}c_{i}v_{i} \ \ \ \ \ (2)$

where ${c_{i}\ne a_{i}}$. This means that we can write the zero vector as

$\displaystyle 0=\sum_{i=1}^{m}\left(a_{i}-c_{i}\right)v_{i} \ \ \ \ \ (3)$

Now, if the only way we can satisfy this equation is to require that ${a_{i}=c_{i}}$ for all ${i}$ (that is, if the only linear combination of the ${v_{i}}$ that gives the zero vector is the one with all coefficients zero), then we say that the list ${\left(v_{1},\ldots,v_{m}\right)}$ is linearly independent. (For completeness, the empty list, containing no vectors, is also declared to be linearly independent.) By reversing the above argument, we see that if the list ${\left(v_{1},\ldots,v_{m}\right)}$ is linearly independent, then there is only one set of scalars ${a_{i}}$ that satisfies (1). In other words, any vector ${v\in\mbox{span}\left(v_{1},\ldots,v_{m}\right)}$ has only one representation as a linear combination of the vectors in the list.
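A list that fails this test can be certified concretely by exhibiting a nontrivial combination that gives the zero vector, as in (3) with the coefficients ${a_{i}-c_{i}}$ not all zero. A small sketch in ${\mathbb{R}^{2}}$ (the vectors and coefficients are illustrative choices):

```python
# Coefficients (1, 1, -1), not all zero, combine these three vectors to the
# zero vector, so the list (v1, v2, v3) is linearly dependent.
v1, v2, v3 = (1, 0), (0, 1), (1, 1)
coeffs = (1, 1, -1)
combo = tuple(sum(a * v[k] for a, v in zip(coeffs, (v1, v2, v3)))
              for k in range(2))
print(combo)  # (0, 0)
```

This is no accident of the particular vectors chosen: three vectors in a 2-dimensional space can never be linearly independent, as the theorem later in this post shows.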

A list that is not linearly independent is, again not surprisingly, defined to be linearly dependent. This leads to the linear dependence lemma:

Lemma 2 Suppose ${\left(v_{1},\ldots,v_{m}\right)}$ is a linearly dependent list in ${V}$. Then there exists some ${j\in\left\{ 1,2,\ldots,m\right\} }$ such that

(a) ${v_{j}\in\mbox{span}\left(v_{1},\ldots,v_{j-1}\right)}$ (where, for ${j=1}$, the span of the empty list is taken to be ${\left\{ 0\right\} }$);

(b) if ${v_{j}}$ is removed from the list ${\left(v_{1},\ldots,v_{m}\right)}$, the span of the remaining list, containing ${m-1}$ vectors, equals the span of the original list.

Proof: Because ${\left(v_{1},\ldots,v_{m}\right)}$ is linearly dependent, we can write

$\displaystyle \sum_{i=1}^{m}a_{i}v_{i}=0 \ \ \ \ \ (4)$

where not all of the ${a_{i}}$s are zero. Suppose ${j}$ is the largest index where ${a_{j}\ne0.}$ Then we can divide through by ${a_{j}}$ to get

$\displaystyle v_{j}=-\frac{1}{a_{j}}\sum_{i=1}^{j-1}a_{i}v_{i} \ \ \ \ \ (5)$

Thus ${v_{j}}$ is a linear combination of the vectors that precede it in the list, which proves part (a). Part (b) follows from the fact that we can represent any vector ${u\in\mbox{span}\left(v_{1},\ldots,v_{m}\right)}$ as

$\displaystyle u=\sum_{i=1}^{m}a_{i}v_{i} \ \ \ \ \ (6)$

We can replace ${v_{j}}$ in this sum by (5), so ${u}$ can be written as a linear combination of all the vectors in the list ${\left(v_{1},\ldots,v_{m}\right)}$ except for ${v_{j}}$. Thus (b) is true. $\Box$
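Equation (5) can be checked numerically on a small example (the vectors and coefficients below are illustrative choices). Take ${v_{1},v_{2}}$ to be the standard basis of ${\mathbb{R}^{2}}$ and ${v_{3}=2v_{1}+3v_{2}}$, so that the dependence relation (4) holds with coefficients ${\left(2,3,-1\right)}$ and the largest nonzero coefficient has ${j=3}$:

```python
v1, v2 = (1, 0), (0, 1)
a = (2, 3, -1)   # coefficients in (4): 2*v1 + 3*v2 - v3 = 0, so j = 3
v3 = (2, 3)      # chosen so that the relation above holds
# Equation (5): v_j = -(1/a_j) * sum of a_i * v_i over i < j
recon = tuple(-(1 / a[2]) * (a[0] * x + a[1] * y) for x, y in zip(v1, v2))
print(recon)  # (2.0, 3.0), recovering v3
```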

We can use this lemma to prove the main result about linearly independent lists:

Theorem 3 In a finite-dimensional vector space ${V}$, the length of every linearly independent list is less than or equal to the length of every list that spans ${V}$.

Proof: Suppose the list ${A\equiv\left(u_{1},\ldots,u_{m}\right)}$ is linearly independent in ${V}$, and suppose another list ${B\equiv\left(w_{1},\ldots,w_{n}\right)}$ spans ${V}$. We want to prove that ${m\le n}$.

Since ${B}$ already spans ${V}$, if we add any other vector from ${V}$ to the list ${B}$, we will get a linearly dependent list, since this newly added vector can, by the definition of a span, be expressed as a linear combination of the vectors in ${B}$. In particular, if we add ${u_{1}}$ from the list ${A}$ to the front of ${B}$, then the list ${\left(u_{1},w_{1},\ldots,w_{n}\right)}$ is linearly dependent. By the linear dependence lemma above, some vector in this list lies in the span of its predecessors; that vector cannot be ${u_{1}}$ (as a member of a linearly independent list, ${u_{1}\ne0}$), so it must be one of the ${w_{i}}$s, and we can remove it so that the remaining list still spans ${V}$ and contains ${n}$ vectors. For the sake of argument, let’s say we remove ${w_{n}}$ (we can always order the ${w_{i}}$s in the list so that the element we remove is at the end). Then we’re left with the revised list ${B_{1}=\left(u_{1},w_{1},\ldots,w_{n-1}\right)}$.

We can repeat this process ${m}$ times, each time adding the next element ${u_{k}}$ from list ${A}$ (placed just after the previously added ${u}$s) and removing one of the remaining ${w_{i}}$s. At each step the linear dependence lemma supplies a vector that lies in the span of its predecessors, and that vector can never be one of the ${u_{k}}$s, since no vector in the linearly independent list ${A}$ lies in the span of the earlier ${u}$s. Thus a ${w_{i}}$ can be removed each time we add a ${u_{k}}$, so there must be at least as many ${w_{i}}$s as ${u_{i}}$s. In other words, ${m\le n}$, which is what we wanted to prove. $\Box$

This theorem can be used to show easily that any list of more than ${n}$ vectors in ${n}$-dimensional space cannot be linearly independent, since we know that we can span ${n}$-dimensional space with ${n}$ vectors (for example, the 3 coordinate axes in 3-d space). Conversely, since we can find a list of ${n}$ vectors in ${n}$-dimensional space that is linearly independent, any list of fewer than ${n}$ vectors cannot span ${n}$-dimensional space.

## Basis of a finite-dimensional vector space

A basis of a finite-dimensional vector space is defined to be a list that is both linearly independent and spans the space. The dimension of the vector space is defined to be the length of a basis list. For example, in 3-d space, the list ${\left\{ \left(1,0,0\right),\left(0,1,0\right),\left(0,0,1\right)\right\} }$ is a basis, and since the length is 3, the dimension of the vector space is also 3. Any proper subset (that is, a subset with fewer than 3 members) of this basis is also linearly independent, but it does not span the space so is not a basis. For example, the list ${\left\{ \left(1,0,0\right),\left(0,1,0\right)\right\} }$ is linearly independent, but spans only the ${xy}$ plane.
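The uniqueness of representation in a basis is easy to see for the standard basis of ${\mathbb{R}^{3}}$: the unique scalars representing a vector are just its components. A minimal sketch (the vector ${v}$ is an illustrative choice):

```python
# Standard basis of R^3
e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)
v = (5, -2, 7)
# Reconstruct v as v[0]*e1 + v[1]*e2 + v[2]*e3, componentwise
recon = tuple(v[0] * a + v[1] * b + v[2] * c for a, b, c in zip(e1, e2, e3))
print(recon == v)  # True
```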

A couple of examples of linear independence and dependence are worked out in a separate post.