Bases

Definition 1. A (linear) basis (or a coordinate system) in a vector space $\mathcal{V}$ is a set $\mathcal{X}$ of linearly independent vectors such that every vector in $\mathcal{V}$ is a linear combination of elements of $\mathcal{X}$. A vector space $\mathcal{V}$ is finite-dimensional if it has a finite basis.

Except for the occasional consideration of examples we shall restrict our attention, throughout this book, to finite-dimensional vector spaces.

For examples of bases we turn again to the spaces $\mathcal{P}$ and $\mathbb{C}^n$. In $\mathcal{P}$, the set $\{x_0, x_1, x_2, \dots\}$, where $x_n(t) = t^n$, $n = 0, 1, 2, \dots$, is a basis; every polynomial is, by definition, a linear combination of a finite number of the $x_n$. Moreover $\mathcal{P}$ has no finite basis, for, given any finite set of polynomials, we can find a polynomial of higher degree than any of them; this latter polynomial is obviously not a linear combination of the former ones.

An example of a basis in $\mathbb{C}^n$ is the set of vectors $x_i$, $i = 1, \dots, n$, defined by the condition that the $j$-th coordinate of $x_i$ is $\delta_{ij}$. (Here we use for the first time the popular Kronecker $\delta$; it is defined by $\delta_{ij} = 1$ if $i = j$ and $\delta_{ij} = 0$ if $i \neq j$.) Thus we assert that in $\mathbb{C}^3$ the vectors $x_1 = (1, 0, 0)$, $x_2 = (0, 1, 0)$, and $x_3 = (0, 0, 1)$ form a basis. It is easy to see that they are linearly independent; the formula $x = (\xi_1, \xi_2, \xi_3) = \xi_1 x_1 + \xi_2 x_2 + \xi_3 x_3$ proves that every $x$ in $\mathbb{C}^3$ is a linear combination of them.
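The Kronecker $\delta$ and the basis it defines are easy to check numerically. The following is a minimal sketch; the dimension $n = 3$ and the sample vector are illustrative assumptions, not part of the text:

```python
# Kronecker delta: delta(i, j) = 1 if i == j, else 0.
def delta(i, j):
    return 1 if i == j else 0

n = 3  # illustrative dimension (an assumption, not fixed by the text)

# The j-th coordinate of the basis vector x_i is delta(i, j).
basis = [tuple(delta(i, j) for j in range(n)) for i in range(n)]
print(basis)  # [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

# Any x = (xi_1, ..., xi_n) equals the combination sum_i xi_i x_i.
x = (4, -7, 2)  # illustrative sample vector
combo = tuple(sum(x[i] * basis[i][j] for i in range(n)) for j in range(n))
assert combo == x
```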

In a general finite-dimensional vector space $\mathcal{V}$, with basis $\{x_1, \dots, x_n\}$, we know that every $x$ can be written in the form $x = \sum_i \xi_i x_i$; we assert that the $\xi$'s are uniquely determined by $x$. The proof of this assertion is an argument often used in the theory of linear dependence. If we had $x = \sum_i \eta_i x_i$, then we should have, by subtraction, $\sum_i (\xi_i - \eta_i) x_i = 0$. Since the $x_i$ are linearly independent, this implies that $\xi_i - \eta_i = 0$ for $i = 1, \dots, n$; in other words, the $\xi$'s are the same as the $\eta$'s. (Observe that writing $\{x_1, \dots, x_n\}$ for a basis with $n$ elements is not the proper thing to do in case $n = 0$. We shall, nevertheless, frequently use this notation. Whenever that is done, it is, in principle, necessary to adjoin a separate discussion designed to cover the vector space $\{0\}$. In fact, however, everything about that space is so trivial that the details are not worth writing down, and we shall omit them.)
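The uniqueness of coordinates can be illustrated by actually computing them in a concrete basis. A sketch using exact rational arithmetic and Cramer's rule; the basis $\{(1,1,0),\,(0,1,1),\,(1,0,1)\}$ of $\mathbb{C}^3$ and the vector $(2, 3, 5)$ are hypothetical examples chosen for illustration:

```python
from fractions import Fraction

# A hypothetical (non-standard) basis of C^3; det below confirms independence.
b1, b2, b3 = (1, 1, 0), (0, 1, 1), (1, 0, 1)

def det3(c1, c2, c3):
    # Determinant of the 3x3 matrix whose columns are c1, c2, c3.
    (a, d, g), (b, e, h), (c, f, i) = c1, c2, c3
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def coordinates(x):
    # Cramer's rule: the unique xi_i with x = xi_1 b1 + xi_2 b2 + xi_3 b3.
    d = det3(b1, b2, b3)
    return (Fraction(det3(x, b2, b3), d),
            Fraction(det3(b1, x, b3), d),
            Fraction(det3(b1, b2, x), d))

xi = coordinates((2, 3, 5))
# Re-expanding the combination recovers x; no other coefficient
# triple passes this check -- the coordinates are unique.
assert all(xi[0]*b1[j] + xi[1]*b2[j] + xi[2]*b3[j] == (2, 3, 5)[j]
           for j in range(3))
print(xi)  # the unique coordinates, namely (0, 3, 2)
```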

Theorem 1. If $\mathcal{V}$ is a finite-dimensional vector space and if $\{y_1, \dots, y_m\}$ is any set of linearly independent vectors in $\mathcal{V}$, then, unless the $y$'s already form a basis, we can find vectors $y_{m+1}, \dots, y_{m+p}$ so that the totality of the $y$'s, that is, $\{y_1, \dots, y_m, y_{m+1}, \dots, y_{m+p}\}$, is a basis. In other words, every linearly independent set can be extended to a basis.

Proof. Since $\mathcal{V}$ is finite-dimensional, it has a finite basis, say $\{x_1, \dots, x_n\}$. We consider the set $\mathcal{S}$ of vectors $y_1, \dots, y_m, x_1, \dots, x_n$, in this order, and we apply to this set the theorem of Section: Linear combinations several times in succession. In the first place, the set $\mathcal{S}$ is linearly dependent, since the $y$'s are (as are all vectors) linear combinations of the $x$'s. Hence some vector of $\mathcal{S}$ is a linear combination of the preceding ones; let $z$ be the first such vector. Then $z$ is different from any $y_i$, $i = 1, \dots, m$ (since the $y$'s are linearly independent), so that $z$ is equal to some $x$, say $z = x_i$. We consider the new set $\mathcal{S}'$ of vectors $y_1, \dots, y_m, x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_n$. We observe that every vector in $\mathcal{V}$ is a linear combination of vectors in $\mathcal{S}'$, since by means of $y_1, \dots, y_m, x_1, \dots, x_{i-1}$ we may express $x_i$, and then by means of $x_1, \dots, x_n$ we may express any vector. (The $x$'s form a basis.) If $\mathcal{S}'$ is linearly independent, we are done. If it is not, we apply the theorem of Section: Linear combinations again and again the same way till we reach a linearly independent set containing $y_1, \dots, y_m$, in terms of which we may express every vector in $\mathcal{V}$. This last set is a basis containing the $y$'s. ◻
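The proof above is effectively an algorithm: scan $y_1, \dots, y_m, x_1, \dots, x_n$ in order and discard each vector that is a combination of its predecessors. A sketch under the assumption that all vectors have exact rational coordinates; the sample inputs are illustrative, not taken from the text:

```python
from fractions import Fraction

def rank(vectors):
    # Exact Gaussian elimination over the rationals.
    m = [[Fraction(a) for a in v] for v in vectors]
    r = 0
    for col in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f*b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def extend_to_basis(ys, xs):
    # Scan y_1..y_m, x_1..x_n in order, keeping each vector that is not
    # a linear combination of those already kept (the proof's step).
    kept = []
    for v in list(ys) + list(xs):
        if rank(kept + [v]) > rank(kept):
            kept.append(v)
    return kept

ys = [(1, 1, 0, 0), (0, 0, 1, 1)]          # independent vectors to extend
xs = [(1, 0, 0, 0), (0, 1, 0, 0),
      (0, 0, 1, 0), (0, 0, 0, 1)]          # a known basis
basis = extend_to_basis(ys, xs)
print(basis)  # a basis of C^4 whose first two vectors are the y's
```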

EXERCISES

Exercise 1. 

  1. Prove that the four vectors $x = (1, 0, 0)$, $y = (0, 1, 0)$, $z = (0, 0, 1)$, and $u = (1, 1, 1)$ in $\mathbb{C}^3$ form a linearly dependent set, but any three of them are linearly independent. (To test the linear dependence of the vectors $x$, $y$, and $u$ in $\mathbb{C}^3$, proceed as follows. Assume that $\alpha$, $\beta$, and $\gamma$ can be found so that $\alpha x + \beta y + \gamma u = 0$. This means that $\alpha + \gamma = 0$, $\beta + \gamma = 0$, and $\gamma = 0$. The vectors $x$, $y$, and $u$ are linearly dependent if and only if these equations have a solution other than $\alpha = \beta = \gamma = 0$.)
  2. If the vectors $x$, $y$, $z$, and $u$ in $\mathcal{P}$ are defined by $x(t) = 1$, $y(t) = t$, $z(t) = t^2$, and $u(t) = 1 + t + t^2$, prove that $x$, $y$, $z$, and $u$ are linearly dependent, but any three of them are linearly independent.
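The hint in part (a) amounts to solving a homogeneous linear system; equivalently, three vectors in $\mathbb{C}^3$ are linearly independent exactly when the $3 \times 3$ determinant they form is nonzero. A sketch, assuming the intended four vectors are the three standard unit vectors together with $(1, 1, 1)$:

```python
from itertools import combinations

# Assumed reading of part (a): the three unit vectors and (1, 1, 1).
x, y, z, u = (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1)

def det3(a, b, c):
    # Determinant of the 3x3 matrix with rows a, b, c.
    return (a[0]*(b[1]*c[2] - b[2]*c[1])
          - a[1]*(b[0]*c[2] - b[2]*c[0])
          + a[2]*(b[0]*c[1] - b[1]*c[0]))

# Every one of the four triples has nonzero determinant ...
dets = [det3(*t) for t in combinations([x, y, z, u], 3)]
print(dets)  # [1, 1, -1, 1]
assert all(d != 0 for d in dets)

# ... yet the four together are dependent: x + y + z - u = 0.
assert tuple(x[j] + y[j] + z[j] - u[j] for j in range(3)) == (0, 0, 0)
```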

Exercise 2. Prove that if $\mathbb{R}$ is considered as a rational vector space (see Section: Examples, (8)), then a necessary and sufficient condition that the vectors $1$ and $\xi$ in $\mathbb{R}$ be linearly independent is that the real number $\xi$ be irrational.

Exercise 3. Is it true that if $x$, $y$, and $z$ are linearly independent vectors, then so also are $x + y$, $y + z$, and $z + x$?

Exercise 4. 

  1. Under what conditions on the scalar $\xi$ are the vectors $(1 + \xi, 1 - \xi)$ and $(1 - \xi, 1 + \xi)$ in $\mathbb{C}^2$ linearly dependent?
  2. Under what conditions on the scalar $\xi$ are the vectors $(\xi, 1, 0)$, $(1, \xi, 1)$, and $(0, 1, \xi)$ in $\mathbb{R}^3$ linearly dependent?
  3. What is the answer to (b) for $\mathbb{Q}^3$ (in place of $\mathbb{R}^3$)?

Exercise 5. 

  1. The vectors $(\xi_1, \xi_2)$ and $(\eta_1, \eta_2)$ in $\mathbb{C}^2$ are linearly dependent if and only if $\xi_1 \eta_2 = \xi_2 \eta_1$.
  2. Find a similar necessary and sufficient condition for the linear dependence of two vectors in $\mathbb{C}^3$. Do the same for three vectors in $\mathbb{C}^3$.
  3. Is there a set of three linearly independent vectors in $\mathbb{C}^2$?

Exercise 6. 

  1. Under what conditions on the scalars $\xi$ and $\eta$ are the vectors $(1, \xi)$ and $(1, \eta)$ in $\mathbb{C}^2$ linearly dependent?
  2. Under what conditions on the scalars $\xi$, $\eta$, and $\zeta$ are the vectors $(1, \xi, \xi^2)$, $(1, \eta, \eta^2)$, and $(1, \zeta, \zeta^2)$ in $\mathbb{C}^3$ linearly dependent?
  3. Guess and prove a generalization of (a) and (b) to $\mathbb{C}^n$.

Exercise 7. 

  1. Find two bases in $\mathbb{C}^4$ such that the only vectors common to both are $(0, 0, 1, 1)$ and $(1, 1, 0, 0)$.
  2. Find two bases in $\mathbb{C}^4$ that have no vectors in common so that one of them contains the vectors $(1, 0, 0, 0)$ and $(1, 1, 0, 0)$ and the other one contains the vectors $(1, 1, 1, 0)$ and $(1, 1, 1, 1)$.

Exercise 8. 

  1. Under what conditions on the scalar $\xi$ do the vectors $(1 + \xi, 1 - \xi)$ and $(1 - \xi, 1 + \xi)$ form a basis of $\mathbb{C}^2$?
  2. Under what conditions on the scalar $\xi$ do the vectors $(\xi, 1, 0)$, $(1, \xi, 1)$, and $(0, 1, \xi)$ form a basis of $\mathbb{C}^3$?

Exercise 9. Consider the set of all those vectors in $\mathbb{C}^3$ each of whose coordinates is either $0$ or $1$; how many different bases does this set contain?

Exercise 10. If $\mathcal{X}$ is the set consisting of the six vectors $(1, 1, 0, 0)$, $(1, 0, 1, 0)$, $(1, 0, 0, 1)$, $(0, 1, 1, 0)$, $(0, 1, 0, 1)$, and $(0, 0, 1, 1)$ in $\mathbb{C}^4$, find two different maximal linearly independent subsets of $\mathcal{X}$. (A maximal linearly independent subset of $\mathcal{X}$ is a linearly independent subset $\mathcal{Y}$ of $\mathcal{X}$ that becomes linearly dependent every time that a vector of $\mathcal{X}$ that is not already in $\mathcal{Y}$ is adjoined to $\mathcal{Y}$.)
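A maximal linearly independent subset can be produced greedily: keep each vector that enlarges the span of those already kept, and note that different scan orders can produce different maximal subsets. A sketch in exact rational arithmetic, assuming the six vectors in question are the $0$-$1$ vectors of $\mathbb{C}^4$ with exactly two coordinates equal to $1$:

```python
from fractions import Fraction

# Assumed reading: the six 0-1 vectors of C^4 with exactly two 1s.
X = [(1, 1, 0, 0), (1, 0, 1, 0), (1, 0, 0, 1),
     (0, 1, 1, 0), (0, 1, 0, 1), (0, 0, 1, 1)]

def rank(vs):
    # Exact Gaussian elimination over the rationals.
    m = [[Fraction(a) for a in v] for v in vs]
    r = 0
    for col in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f*b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def maximal_independent(order):
    # Greedily keep each vector that enlarges the span; the result is
    # maximal: adjoining any omitted vector of X makes it dependent.
    kept = []
    for v in order:
        if rank(kept + [v]) > rank(kept):
            kept.append(v)
    return kept

A = maximal_independent(X)        # scan in the given order
B = maximal_independent(X[::-1])  # scan in the reverse order
print(A, B, sep="\n")
assert A != B and rank(A) == rank(B) == 4
assert all(rank(S + [v]) == rank(S) for S in (A, B) for v in X)
```

Since both subsets have rank $4$, each already spans $\mathbb{C}^4$, so adjoining any further vector of $X$ necessarily creates a dependence.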

Exercise 11. Prove that every vector space has a basis. (The proof of this fact is out of reach for those not acquainted with some transfinite trickery, such as well-ordering or Zorn’s lemma.)