Determinants

It is, of course, possible to generalize the considerations of the preceding section to multilinear forms and multiple tensor products. Instead of entering into that part of multilinear algebra, we proceed in a different direction; we go directly after determinants.

Suppose that $A$ is a linear transformation on an $n$-dimensional vector space $V$ and let $w$ be an alternating $n$-linear form on $V$. If we write $w_A$ for the function defined by $w_A(x_1, \dots, x_n) = w(Ax_1, \dots, Ax_n)$, then $w_A$ is an alternating $n$-linear form on $V$, and, in fact, the mapping $w \mapsto w_A$ is a linear transformation on the space of such forms. Since (see Section: Alternating forms of maximal degree) that space is one-dimensional, it follows that $w \mapsto w_A$ is equal to multiplication by an appropriate scalar. In other words, there exists a scalar $\delta$ such that $w_A = \delta w$ for every alternating $n$-linear form $w$. By this somewhat roundabout procedure (from $A$ to $w_A$ to $\delta$) we have associated a uniquely determined scalar $\delta$ with every linear transformation $A$ on $V$; we call $\delta$ the determinant of $A$, and we write $\delta = \det A$. Observe that $\det$ is neither a scalar nor a transformation, but a function that associates a scalar with each linear transformation.
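
For example, suppose that $n = 2$, that $\{x_1, x_2\}$ is a basis, and that $w$ is a non-zero alternating bilinear form (so that $w(x_1, x_1) = w(x_2, x_2) = 0$ and $w(x_2, x_1) = -w(x_1, x_2)$). If $Ax_1 = \alpha_{11}x_1 + \alpha_{21}x_2$ and $Ax_2 = \alpha_{12}x_1 + \alpha_{22}x_2$, then expansion by bilinearity gives

$$w(Ax_1, Ax_2) = (\alpha_{11}\alpha_{22} - \alpha_{21}\alpha_{12})\,w(x_1, x_2),$$

so that $\det A = \alpha_{11}\alpha_{22} - \alpha_{21}\alpha_{12}$, the familiar expression for a two-by-two determinant.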

Our immediate purpose is to study the function $\det$. We begin by finding the determinants of the simplest linear transformations, that is, the multiplications by scalars. If $Ax = \lambda x$ for every $x$ in $V$, then $w(Ax_1, \dots, Ax_n) = \lambda^n w(x_1, \dots, x_n)$ for every alternating $n$-linear form $w$; it follows that $\det A = \lambda^n$. We note, in particular, that $\det 0 = 0$ and $\det 1 = 1$.
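
Readers who wish to experiment can test such identities numerically. Here is a minimal check of $\det(\lambda \cdot 1) = \lambda^n$, assuming Python with NumPy (neither of which, of course, is part of the text's argument).

```python
import numpy as np

# multiplication by the scalar lam on a 4-dimensional space
n, lam = 4, 2.5
assert np.isclose(np.linalg.det(lam * np.eye(n)), lam ** n)
```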

Next we ask about the multiplicative properties of $\det$. Suppose that $A$ and $B$ are linear transformations on $V$, and write $C = AB$. If $w$ is an alternating $n$-linear form, then $w_C(x_1, \dots, x_n) = w(ABx_1, \dots, ABx_n) = w_A(Bx_1, \dots, Bx_n)$, so that $w_C = (w_A)_B$. Since $w_A = (\det A)\,w$ and $(w_A)_B = (\det A)\,w_B = (\det A)(\det B)\,w$, it follows that $\det(AB) = (\det A)(\det B)$. (The values of $\det$ are scalars, and therefore commute with each other.)
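
The multiplicative law is easy to check numerically; the following sketch (again assuming NumPy) verifies it for a pair of random matrices.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# det(AB) = (det A)(det B)
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))
```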

A linear transformation $A$ is called singular if $\det A = 0$ and non-singular otherwise. Our next result is that $A$ is invertible if and only if it is non-singular. Indeed, if $A$ is invertible, then $1 = \det 1 = \det(AA^{-1}) = (\det A)(\det A^{-1})$, and therefore $\det A \ne 0$. Suppose, on the other hand, that $\det A \ne 0$. If $\{x_1, \dots, x_n\}$ is a basis in $V$, and if $w$ is a non-zero alternating $n$-linear form on $V$, then $w(x_1, \dots, x_n) \ne 0$ by Section: Alternating forms, Theorem 3, and therefore $w(Ax_1, \dots, Ax_n) = (\det A)\,w(x_1, \dots, x_n) \ne 0$. This implies, by Section: Alternating forms, Theorem 2, that the set $\{Ax_1, \dots, Ax_n\}$ is linearly independent (and therefore a basis); from this, in turn, we infer that $A$ is invertible.
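
A concrete instance of singularity: if one column of a matrix is a multiple of another, the corresponding transformation maps the space onto a proper subspace, and its determinant vanishes. A quick NumPy check:

```python
import numpy as np

# the second column is twice the first, so A is singular
A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 1.0],
              [3.0, 6.0, 5.0]])
assert np.isclose(np.linalg.det(A), 0.0)
```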

In the classical literature the determinant is defined as a function of matrices (not linear transformations); we are now in a position to make contact with that approach. We shall derive an expression for $\det A$ in terms of the elements $\alpha_{ij}$ of the matrix corresponding to $A$ in some coordinate system $\{x_1, \dots, x_n\}$. Let $w$ be a non-zero alternating $n$-linear form; we know that

$$(\det A)\,w(x_1, \dots, x_n) = w(Ax_1, \dots, Ax_n). \tag{1}$$

If we replace each $Ax_j$ in the right side of (1) by $\sum_i \alpha_{ij} x_i$ and expand the result by multilinearity, we obtain a long linear combination of terms such as $w(y_1, \dots, y_n)$, where each $y_j$ is one of the $x_i$'s. (Compare this part of the argument with the proof of Section: Alternating forms, Theorem 3.) If, in such a term, two of the $y$'s coincide, then, since $w$ is alternating, that term must vanish. If, on the other hand, all the $y$'s are distinct, then $y_j = x_{\pi(j)}$ for some permutation $\pi$, and, moreover, every permutation $\pi$ can occur in this way. The coefficient of the term $w(x_{\pi(1)}, \dots, x_{\pi(n)})$ is the product $\alpha_{\pi(1)1} \cdots \alpha_{\pi(n)n}$. Since $w$ (Section: Alternating forms, Theorem 1) is skew symmetric, so that $w(x_{\pi(1)}, \dots, x_{\pi(n)}) = (\operatorname{sgn} \pi)\,w(x_1, \dots, x_n)$, it follows that

$$\det A = \sum_{\pi} (\operatorname{sgn} \pi)\,\alpha_{\pi(1)1} \cdots \alpha_{\pi(n)n}, \tag{2}$$

where the summation is extended over all permutations $\pi$ in $S_n$. (Recall that $w(x_1, \dots, x_n) \ne 0$, by Section: Alternating forms, Theorem 3, so that division by $w(x_1, \dots, x_n)$ is legitimate.)
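
Equation (2) translates directly into a (hopelessly inefficient, but instructive) program. The sketch below, in Python with NumPy, sums over all $n!$ permutations and compares the result with a library routine; the helper name det_by_permutations is ours, introduced only for this illustration.

```python
import itertools
import math
import numpy as np

def det_by_permutations(a):
    """Equation (2): sum over all permutations pi of
    sgn(pi) * a[pi(1),1] * ... * a[pi(n),n]."""
    n = len(a)
    total = 0.0
    for pi in itertools.permutations(range(n)):
        # sgn(pi) = (-1)^(number of inversions of pi)
        inversions = sum(pi[i] > pi[j]
                         for i in range(n) for j in range(i + 1, n))
        term = math.prod(a[pi[j], j] for j in range(n))
        total += (-1) ** inversions * term
    return total

a = np.arange(1.0, 10.0).reshape(3, 3) + np.eye(3)
assert np.isclose(det_by_permutations(a), np.linalg.det(a))
```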

From this classical equation (2) we could derive many special properties of determinants by straightforward computation. Here is one example. If $\pi$ and $\sigma$ are permutations (in $S_n$), then (since $\pi\sigma$ is also a permutation) it follows that the products $\alpha_{\pi(1)1} \cdots \alpha_{\pi(n)n}$ and $\alpha_{\pi(\sigma(1))\sigma(1)} \cdots \alpha_{\pi(\sigma(n))\sigma(n)}$ differ in the order of their factors only. If, for each $\pi$, we take $\sigma = \pi^{-1}$, and then alter each summand in (2) accordingly, we obtain

$$\det A = \sum_{\pi} (\operatorname{sgn} \pi)\,\alpha_{1\pi(1)} \cdots \alpha_{n\pi(n)}.$$

(Note that $\operatorname{sgn} \pi = \operatorname{sgn} \pi^{-1}$ and that the sum over all $\pi$ is the same as the sum over all $\pi^{-1}$.) Since this last sum is just like the sum in (2), except that $\alpha_{ij}$ appears in place of $\alpha_{ji}$, it follows from an application of (2), with $\alpha_{ji}$ in place of $\alpha_{ij}$, that the determinant of a matrix is equal to the determinant of its transpose.
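
In computational terms: transposing a matrix merely permutes the factors within each summand of (2) and changes no summand's value, so the determinant is unchanged. A one-line NumPy check:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal((5, 5))
# the determinant of the transpose equals the determinant
assert np.isclose(np.linalg.det(a.T), np.linalg.det(a))
```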

Here is another useful fact about determinants. If $M$ is a subspace invariant under $A$, if $B$ is the transformation $A$ considered on $M$ only, and if $C$ is the quotient transformation $A/M$, then $\det A = (\det B)(\det C)$. This multiplicative relation holds if, in particular, $A$ is the direct sum of two transformations $B$ and $C$. The proof can be based directly on the definition of determinants, or, alternatively, on the expansion (2) obtained in the preceding paragraph.
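
In matrix language: in a basis whose first vectors span the invariant subspace $M$, the matrix of $A$ is block-triangular, with the matrices of $B$ and $C$ on the diagonal. The following NumPy sketch illustrates the relation $\det A = (\det B)(\det C)$ for such a matrix (the block sizes and entries are arbitrary choices made for the example).

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((2, 2))  # A restricted to the invariant subspace M
C = rng.standard_normal((3, 3))  # the quotient transformation A/M
X = rng.standard_normal((2, 3))  # off-diagonal block; it does not affect det

# block-triangular matrix of A in a basis adapted to M
A = np.block([[B, X], [np.zeros((3, 2)), C]])
assert np.isclose(np.linalg.det(A),
                  np.linalg.det(B) * np.linalg.det(C))
```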

If, for a fixed linear transformation $A$, we write $f(\lambda) = \det(A - \lambda)$ (where $\lambda$ stands for the transformation $\lambda \cdot 1$), then $f$ is a function of the scalar $\lambda$; we assert that it is, in fact, a polynomial of degree $n$ in $\lambda$, and that the coefficient of $\lambda^n$ is $(-1)^n$. For the proof we may use the notation of (1). It is easy to see that $w((A - \lambda)x_1, \dots, (A - \lambda)x_n)$ is a sum of terms such as $(-\lambda)^k\,w(y_1, \dots, y_n)$, where $y_j = x_j$ for exactly $k$ values of $j$ and $y_j = Ax_j$ for the remaining $n - k$ values of $j$ ($k = 0, 1, \dots, n$); the single term with $k = n$ contributes the leading coefficient $(-1)^n$. The polynomial $f(\lambda) = \det(A - \lambda)$ is called the characteristic polynomial of $A$; the equation $f(\lambda) = 0$, that is, $\det(A - \lambda) = 0$, is the characteristic equation of $A$. The roots of the characteristic equation of $A$ (that is, the scalars $\lambda$ such that $\det(A - \lambda) = 0$) are called the characteristic roots of $A$.
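
NumPy can produce characteristic polynomials directly; note that np.poly returns the coefficients of the monic polynomial $\det(\lambda \cdot 1 - A)$, which differs from the text's $\det(A - \lambda)$ by the factor $(-1)^n$.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

coeffs = np.poly(A)       # [1, -5, 6], i.e. lambda^2 - 5*lambda + 6
roots = np.roots(coeffs)  # the characteristic roots of A
assert np.allclose(sorted(roots), [2.0, 3.0])
```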

EXERCISES

Exercise 1. Use determinants to get a new proof of the fact that if $A$ and $B$ are linear transformations on a finite-dimensional vector space, and if $AB = 1$, then both $A$ and $B$ are invertible.

Exercise 2. If $A$ and $B$ are linear transformations such that $AB = 0$, $A \ne 0$, $B \ne 0$, then $\det A = \det B = 0$.

Exercise 3. Suppose that $(\alpha_{ij})$ is a non-singular $n$-by-$n$ matrix, and suppose that $A_1, \dots, A_n$ are linear transformations (on the same vector space). Prove that if the linear transformations $B_i = \sum_j \alpha_{ij} A_j$, $i = 1, \dots, n$, commute with each other, then the same is true of $A_1, \dots, A_n$.

Exercise 4. If $\{x_1, \dots, x_n\}$ and $\{y_1, \dots, y_n\}$ are bases in the same vector space, and if $A$ is a linear transformation such that $Ax_i = y_i$, $i = 1, \dots, n$, then $\det A \ne 0$.

Exercise 5. Suppose that $\{x_1, \dots, x_n\}$ is a basis in a finite-dimensional vector space $V$. If $y_1, \dots, y_n$ are vectors in $V$, write $w(y_1, \dots, y_n)$ for the determinant of the linear transformation $A$ such that $Ax_i = y_i$, $i = 1, \dots, n$. Prove that $w$ is an alternating $n$-linear form.

Exercise 6. If, in accordance with Section: Determinants, (2), the determinant of a matrix $(\alpha_{ij})$ (not a linear transformation) is defined to be $\sum_{\pi} (\operatorname{sgn} \pi)\,\alpha_{\pi(1)1} \cdots \alpha_{\pi(n)n}$, then, for each linear transformation $A$, the determinants of all the matrices $[A; X]$ are equal to each other. (Here $X$ is an arbitrary basis.)

Exercise 7. If $(\alpha_{ij})$ is an $n$-by-$n$ matrix such that $\alpha_{ij} = 0$ for more than $n^2 - n$ pairs of values of $i$ and $j$, then $\det(\alpha_{ij}) = 0$.

Exercise 8. If $A$ and $B$ are linear transformations on vector spaces of dimensions $n$ and $m$, respectively, then $\det(A \otimes B) = (\det A)^m (\det B)^n$.

Exercise 9. If $A$, $B$, $C$, and $D$ are $n$-by-$n$ matrices such that $C$ and $D$ commute and $D$ is invertible, then (cf. Section: Matrices of transformations, Ex. 19)

$$\det\begin{pmatrix} A & B \\ C & D \end{pmatrix} = \det(AD - BC).$$

(Hint: multiply $\begin{pmatrix} A & B \\ C & D \end{pmatrix}$ on the right by $\begin{pmatrix} D & 0 \\ -C & 1 \end{pmatrix}$.) What if $D$ is not invertible? What if $C$ and $D$ do not commute?
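
A numerical spot-check of the asserted identity (diagonal matrices are used for $C$ and $D$ so that they commute; the specific entries are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
C = np.diag([1.0, -2.0, 0.5])  # diagonal matrices commute...
D = np.diag([2.0, 3.0, -1.0])  # ...and this one is invertible

M = np.block([[A, B], [C, D]])
assert np.isclose(np.linalg.det(M), np.linalg.det(A @ D - B @ C))
```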

Exercise 10. Do $AB$ and $BA$ always have the same characteristic polynomial?

Exercise 11. 

  1. If $A$ and $B$ are similar, then $\det A = \det B$.
  2. If $A$ and $B$ are similar, then $A$ and $B$ have the same characteristic polynomial.
  3. If $A$ and $B$ have the same characteristic polynomial, then $\det A = \det B$.
  4. Is the converse of any of these assertions true?

Exercise 12. Determine the characteristic polynomial of the matrix

$$\begin{pmatrix}
0 & 0 & \cdots & 0 & -\alpha_0 \\
1 & 0 & \cdots & 0 & -\alpha_1 \\
0 & 1 & \cdots & 0 & -\alpha_2 \\
\vdots & \vdots & & \vdots & \vdots \\
0 & 0 & \cdots & 1 & -\alpha_{n-1}
\end{pmatrix}$$

(or, rather, of the linear transformation defined by the matrix) and conclude that every polynomial of degree $n$ with leading coefficient $(-1)^n$ is the characteristic polynomial of some linear transformation.
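
The matrix above is the classical companion matrix. As a check, the sketch below builds the companion matrix of $p(t) = t^3 - 2t^2 - 5t + 6$ (an arbitrary example polynomial) and confirms, via NumPy, that its characteristic polynomial recovers the coefficients of $p$ (in the monic normalization used by np.poly).

```python
import numpy as np

# p(t) = t^3 + a2*t^2 + a1*t + a0 = t^3 - 2t^2 - 5t + 6
a0, a1, a2 = 6.0, -5.0, -2.0
C = np.array([[0.0, 0.0, -a0],
              [1.0, 0.0, -a1],
              [0.0, 1.0, -a2]])
assert np.allclose(np.poly(C), [1.0, a2, a1, a0])
```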

Exercise 13. Suppose that $A$ and $B$ are linear transformations on the same finite-dimensional vector space.

  1. Prove that if $A$ is a projection, then $AB$ and $BA$ have the same characteristic polynomial. (Hint: choose a basis that makes the matrix of $A$ as simple as possible and then compute directly with matrices.)
  2. Prove that, in all cases, $AB$ and $BA$ have the same characteristic polynomial. (Hint: find an invertible $P$ such that $PA$ is a projection and apply (a) to $PA$ and $BP^{-1}$.)