Matrices of transformations

There is now a certain amount of routine work to be done, most of which we shall leave to the imagination. The problem is this: in a fixed coordinate system $\mathfrak{X} = \{x_1, \dots, x_n\}$, knowing the matrices of $A$ and $B$, how can we find the matrices of $A + B$, of $\alpha A$, of $AB$, of $0$, of $1$, etc.?

Write $[A] = (\alpha_{ij})$, $[B] = (\beta_{ij})$, $[A + B] = (\gamma_{ij})$, $[\alpha A] = (\rho_{ij})$; we assert that $\gamma_{ij} = \alpha_{ij} + \beta_{ij}$ and $\rho_{ij} = \alpha\alpha_{ij}$; also, if $A = 0$ and $B = 1$, then $\alpha_{ij} = 0$ and $\beta_{ij} = \delta_{ij}$ (where $\delta_{ij}$ is the Kronecker delta, equal to $1$ when $i = j$ and to $0$ otherwise).
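The entrywise rules can be illustrated in a few lines; this is a minimal sketch, with sample matrices and a sample scalar of our own choosing (not taken from the text).

```python
# Entrywise rules for the matrix of a sum and of a scalar multiple.
# The matrices and the scalar below are sample data, not from the text.
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
alpha = 3

# [A + B] = (alpha_ij + beta_ij); [alpha * A] = (alpha * alpha_ij)
A_plus_B = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]
alpha_A = [[alpha * A[i][j] for j in range(2)] for i in range(2)]

print(A_plus_B)  # [[6, 8], [10, 12]]
print(alpha_A)   # [[3, 6], [9, 12]]
```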

A more complicated rule is the following: if $[A] = (\alpha_{ij})$, $[B] = (\beta_{ij})$, and $[AB] = (\gamma_{ij})$, then
$$\gamma_{ij} = \sum_k \alpha_{ik}\beta_{kj}.$$
To prove this we use the definition of the matrix associated with a transformation, and juggle, thus:
$$(AB)x_j = A(Bx_j) = A\Bigl(\sum_k \beta_{kj}x_k\Bigr) = \sum_k \beta_{kj}(Ax_k) = \sum_k \beta_{kj}\sum_i \alpha_{ik}x_i = \sum_i \Bigl(\sum_k \alpha_{ik}\beta_{kj}\Bigr)x_i.$$
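The product rule can be checked on a small example. The following sketch, with sample $2 \times 2$ matrices of our own choosing, verifies that applying $B$ and then $A$ to a coordinate vector agrees with applying the single matrix $\bigl(\sum_k \alpha_{ik}\beta_{kj}\bigr)$.

```python
# A small check that the matrix of a composite transformation is the
# product of the matrices: gamma_ij = sum_k alpha_ik * beta_kj.
# The matrices and vector are sample data of ours.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def apply(A, xi):
    # eta_i = sum_j alpha_ij * xi_j : coordinates of Ax from those of x
    n = len(A)
    return [sum(A[i][j] * xi[j] for j in range(n)) for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 6]]
x = [7, -2]

# Applying B, then A, equals applying the product matrix [A][B].
assert apply(matmul(A, B), x) == apply(A, apply(B, x))
```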

The relation between transformations and matrices is exactly the same as the relation between vectors and their coordinates, and the analogue of the isomorphism theorem of Section: Isomorphism is true in the best possible sense. We shall make these statements precise.

With the aid of a fixed basis $\mathfrak{X} = \{x_1, \dots, x_n\}$, we have made correspond a matrix $[A] = (\alpha_{ij})$ to every linear transformation $A$; the correspondence is described by the relations $Ax_j = \sum_i \alpha_{ij} x_i$. We assert now that this correspondence is one-to-one (that is, that the matrices of two different transformations are different), and that every array $(\alpha_{ij})$ of $n^2$ scalars is the matrix of some transformation. To prove this, we observe in the first place that knowledge of the matrix of $A$ completely determines $A$ (that is, that $Ax$ is thereby uniquely defined for every $x$), as follows: if $x = \sum_j \xi_j x_j$, then
$$Ax = \sum_j \xi_j Ax_j = \sum_j \xi_j \sum_i \alpha_{ij} x_i = \sum_i \Bigl(\sum_j \alpha_{ij} \xi_j\Bigr) x_i.$$
(In other words, if $y = Ax = \sum_i \eta_i x_i$, then $\eta_i = \sum_j \alpha_{ij} \xi_j$. Compare this with the comments in Section: Matrices on the perversity of indices.) In the second place, there is no law against reading the relation $Ax_j = \sum_i \alpha_{ij} x_i$ backwards. If, in other words, $(\alpha_{ij})$ is any array of $n^2$ scalars, we may use this relation to define a linear transformation $A$; it is clear that the matrix of $A$ will be exactly $(\alpha_{ij})$. (Once more, however, we emphasize the fundamental fact that this one-to-one correspondence between transformations and matrices was set up by means of a particular coordinate system, and that, as we pass from one coordinate system to another, the same linear transformation may correspond to several matrices, and one matrix may be the correspondent of many linear transformations.) The following statement sums up the essential part of the preceding discussion.
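Both directions of the correspondence can be sketched in a few lines; the $2 \times 2$ array below is sample data of ours. An array defines a transformation through $\eta_i = \sum_j \alpha_{ij}\xi_j$, and reading off the images of the basis vectors recovers the same array.

```python
# The correspondence read both ways (sample 2x2 array of ours): an array
# (alpha_ij) defines a transformation via eta_i = sum_j alpha_ij * xi_j,
# and the images of the basis vectors recover the array.

alpha = [[2, -1], [0, 3]]
n = 2

def T(xi):
    # coordinates of Tx: eta_i = sum_j alpha_ij * xi_j
    return [sum(alpha[i][j] * xi[j] for j in range(n)) for i in range(n)]

basis = [[1, 0], [0, 1]]
# Column j of the matrix of T is the coordinate vector of T(x_j).
recovered = [[T(basis[j])[i] for j in range(n)] for i in range(n)]
assert recovered == alpha
```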

Theorem 1. Among the set of all matrices $[A] = (\alpha_{ij})$, $[B] = (\beta_{ij})$, etc. (not considered in relation to linear transformations), we define sum, scalar multiplication, product, $0$, and $1$, by
$$[A] + [B] = (\alpha_{ij} + \beta_{ij}), \quad \alpha[A] = (\alpha\alpha_{ij}), \quad [A][B] = \Bigl(\sum_k \alpha_{ik}\beta_{kj}\Bigr), \quad 0 = (0), \quad 1 = (\delta_{ij}).$$
Then the correspondence (established by means of an arbitrary coordinate system $\mathfrak{X} = \{x_1, \dots, x_n\}$ of the $n$-dimensional vector space $\mathcal{V}$), between all linear transformations $A$ on $\mathcal{V}$ and all matrices $(\alpha_{ij})$, described by $Ax_j = \sum_i \alpha_{ij} x_i$, is an isomorphism; in other words, it is a one-to-one correspondence that preserves sum, scalar multiplication, product, $0$, and $1$.

We have carefully avoided discussing the matrix of $A^{-1}$. It is possible to give an expression for $[A^{-1}]$ in terms of the elements of $[A]$, but the expression is not simple and, fortunately, not useful for us.
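In the two-by-two case the expression happens to be manageable: the adjugate (cofactor) formula gives the inverse directly. The sketch below, with a sample matrix of our own choosing, checks that the resulting matrix really is inverse to the original.

```python
# For 2x2 matrices the classical adjugate formula for the inverse is still
# simple (the general cofactor expression is unwieldy, as the text notes).
# The matrix below is sample data of ours; Fractions keep arithmetic exact.
from fractions import Fraction

A = [[Fraction(4), Fraction(7)], [Fraction(2), Fraction(6)]]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
A_inv = [[ A[1][1] / det, -A[0][1] / det],
         [-A[1][0] / det,  A[0][0] / det]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# [A][A^{-1}] = 1, the identity matrix
assert matmul(A, A_inv) == [[1, 0], [0, 1]]
```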

EXERCISES

Exercise 1. Let be the linear transformation on defined by , and let be the basis of defined by , . Find the matrix of with respect to this basis.

Exercise 2. Find the matrix of the operation of conjugation on , considered as a real vector space, with respect to the basis (where ).

Exercise 3. 

  1. Let $\sigma$ be a permutation of the integers $\{1, \dots, n\}$; if $x = (\xi_1, \dots, \xi_n)$ is a vector in $\mathbb{C}^n$, write $Ax = (\xi_{\sigma(1)}, \dots, \xi_{\sigma(n)})$. Find the matrix of $A$ with respect to the standard basis of $\mathbb{C}^n$.
  2. Find all matrices that commute with the matrix of $A$.
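For a concrete feel of part 1, here is a sketch with a sample permutation $\sigma$ of our own choosing (not one prescribed by the exercise): the matrix of the coordinate-permuting transformation is a permutation matrix, with a single $1$ in each row and each column.

```python
# Matrix of the coordinate-permuting transformation with respect to the
# standard basis, for a sample permutation sigma of our own choosing.

sigma = [2, 0, 1]   # 0-based: sigma(1) = 3, sigma(2) = 1, sigma(3) = 2

def A(xi):
    return [xi[sigma[i]] for i in range(len(xi))]

n = 3
basis = [[1 if i == j else 0 for i in range(n)] for j in range(n)]
# Column j of the matrix is the coordinate vector of A applied to x_j.
matrix = [[A(basis[j])[i] for j in range(n)] for i in range(n)]
print(matrix)  # [[0, 0, 1], [1, 0, 0], [0, 1, 0]]
```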

Exercise 4. Consider the vector space consisting of all real two-by-two matrices, and let $A$ be the linear transformation on this space that sends each matrix $X$ onto $PX$, where $P$ is a fixed two-by-two matrix. Find the matrix of $A$ with respect to the basis consisting of the four matrix units (the matrices with one entry equal to $1$ and all other entries equal to $0$).

Exercise 5. Consider the vector space consisting of all linear transformations on a vector space $\mathcal{V}$, and let $\mathbf{A}$ be the (left) multiplication transformation that sends each transformation $X$ on $\mathcal{V}$ onto $BX$, where $B$ is some prescribed transformation on $\mathcal{V}$. Under what conditions on $B$ is $\mathbf{A}$ invertible?

Exercise 6. Prove that if , , and are the complex matrices respectively (where ), then

Exercise 7. 

  1. Prove that if $A$, $B$, and $C$ are linear transformations on a two-dimensional vector space, then $(AB - BA)^2$ commutes with $C$.
  2. Is the conclusion of (a) true for higher-dimensional spaces?
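A numeric sanity check of the two-dimensional fact is easy to run: with sample $2 \times 2$ matrices of our own choosing, the square of the commutator $AB - BA$ turns out to be a scalar matrix (its trace is zero, so by Cayley-Hamilton its square is a multiple of the identity), and a scalar matrix commutes with everything.

```python
# Numeric sanity check: in 2 dimensions, (AB - BA)^2 commutes with any C.
# The matrices below are arbitrary samples of ours.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def sub(A, B):
    return [[A[i][j] - B[i][j] for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
C = [[5, -1], [2, 7]]

K = sub(matmul(A, B), matmul(B, A))   # commutator AB - BA (trace zero)
K2 = matmul(K, K)                     # its square: a scalar matrix in 2D
assert matmul(K2, C) == matmul(C, K2)
```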

Exercise 8. Let be the linear transformation on defined by . Prove that if a linear transformation commutes with , then there exists a polynomial such that .

Exercise 9. For which of the following polynomials $p$ and matrices $A$ is it true that $p(A) = 0$?

  1. , .
  2. , .
  3. , .
  4. , .

Exercise 10. Prove that if and are the complex matrices respectively (where ), and if , then .

Exercise 11. If $A$ and $B$ are linear transformations on a vector space, and if $AB = 1$, does it follow that $BA = 1$?

Exercise 12. What happens to the matrix of a linear transformation on a finite-dimensional vector space when the elements of the basis with respect to which the matrix is computed are permuted among themselves?
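One way to experiment with this question (a sketch with sample data of our own, not a complete answer): compute the matrix with respect to a permuted listing of the same basis and compare entries. Writing $y_j = x_{\sigma(j)}$ and expanding $Ay_j$ in terms of the $y$'s gives $\beta_{ij} = \alpha_{\sigma(i)\sigma(j)}$.

```python
# Experiment: matrix of a transformation with respect to the standard
# basis of a 3-dimensional space, then with respect to the same basis
# listed in permuted order. Sample data of ours.

alpha = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]   # matrix w.r.t. x_1, x_2, x_3
sigma = [1, 2, 0]                           # new basis: y_j = x_sigma(j)

# A y_j = A x_sigma(j) = sum_i alpha[i][sigma[j]] x_i; rewriting each x_i
# in terms of the y's gives beta[i][j] = alpha[sigma[i]][sigma[j]]:
beta = [[alpha[sigma[i]][sigma[j]] for j in range(3)] for i in range(3)]
print(beta)  # rows and columns of alpha permuted simultaneously
```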

Exercise 13. 

  1. Suppose that $\mathcal{V}$ is a finite-dimensional vector space with basis $\{x_1, \dots, x_n\}$. Suppose that $\alpha_1, \dots, \alpha_n$ are pairwise distinct scalars. If $A$ is a linear transformation such that $Ax_j = \alpha_j x_j$, $j = 1, \dots, n$, and if $B$ is a linear transformation that commutes with $A$, then there exist scalars $\beta_1, \dots, \beta_n$ such that $Bx_j = \beta_j x_j$, $j = 1, \dots, n$.
  2. Prove that if $A$ is a linear transformation on a finite-dimensional vector space $\mathcal{V}$ and if $A$ commutes with every linear transformation on $\mathcal{V}$, then $A$ is a scalar (that is, there exists a scalar $\alpha$ such that $Ax = \alpha x$ for all $x$ in $\mathcal{V}$).

Exercise 14. If $\{x_1, \dots, x_k\}$ and $\{y_1, \dots, y_k\}$ are linearly independent sets of vectors in a finite-dimensional vector space $\mathcal{V}$, then there exists an invertible linear transformation $A$ on $\mathcal{V}$ such that $Ax_j = y_j$, $j = 1, \dots, k$.

Exercise 15. If a matrix is such that , , then there exist matrices and such that . (Hint: try .)

Exercise 16. Decide which of the following matrices are invertible and find the inverses of the ones that are.

  1. .
  2. .
  3. .
  4. .
  5. .
  6. .
  7. .

Exercise 17. For which values of $\alpha$ are the following matrices invertible? Find the inverses whenever possible.

  1. .
  2. .
  3. .
  4. .

Exercise 18. For which values of $\alpha$ are the following matrices invertible? Find the inverses whenever possible.

  1. .
  2. .
  3. .
  4. .

Exercise 19. 

  1. It is easy to extend matrix theory to linear transformations between different vector spaces. Suppose that $\mathcal{U}$ and $\mathcal{V}$ are vector spaces over the same field, let $\{x_1, \dots, x_n\}$ and $\{y_1, \dots, y_m\}$ be bases of $\mathcal{U}$ and $\mathcal{V}$ respectively, and let $A$ be a linear transformation from $\mathcal{U}$ to $\mathcal{V}$. The matrix of $A$ is, by definition, the rectangular, $m$ by $n$, array of scalars $(\alpha_{ij})$ defined by $Ax_j = \sum_i \alpha_{ij} y_i$. Define addition and multiplication of rectangular matrices so as to generalize as many as possible of the results of Section: Matrices of transformations. (Note that the product of an $m$ by $n$ matrix and a $p$ by $q$ matrix, in that order, will be defined only if $n = p$.)
  2. Suppose that $A$ and $B$ are multipliable matrices. Partition $A$ into four rectangular blocks (top left, top right, bottom left, bottom right) and then partition $B$ similarly so that the number of columns in the top left part of $A$ is the same as the number of rows in the top left part of $B$. If, in an obvious shorthand, these partitions are indicated by
$$A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix}, \qquad B = \begin{pmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{pmatrix},$$
then
$$AB = \begin{pmatrix} A_{11}B_{11} + A_{12}B_{21} & A_{11}B_{12} + A_{12}B_{22} \\ A_{21}B_{11} + A_{22}B_{21} & A_{21}B_{12} + A_{22}B_{22} \end{pmatrix}.$$
  3. Use subspaces and complements to express the result of (b) in terms of linear transformations (instead of matrices).
  4. Generalize both (b) and (c) to larger numbers of pieces (instead of four).
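The block rule of part (b) can be checked numerically. The sketch below (random $4 \times 4$ integer matrices, a setup of our own choosing) partitions both factors into $2 \times 2$ blocks and compares each block of $AB$ with the corresponding sum of block products.

```python
# Numeric check of block multiplication: each 2x2 block of AB equals
# A_{r1} B_{1c} + A_{r2} B_{2c}. Random sample matrices of ours.
import random

def matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

def block(M, r, c):
    # 2x2 block of a 4x4 matrix, at block-row r and block-column c
    return [[M[i][j] for j in range(2 * c, 2 * c + 2)]
            for i in range(2 * r, 2 * r + 2)]

random.seed(0)
A = [[random.randint(-5, 5) for _ in range(4)] for _ in range(4)]
B = [[random.randint(-5, 5) for _ in range(4)] for _ in range(4)]

for r in (0, 1):
    for c in (0, 1):
        lhs = block(matmul(A, B), r, c)
        rhs = add(matmul(block(A, r, 0), block(B, 0, c)),
                  matmul(block(A, r, 1), block(B, 1, c)))
        assert lhs == rhs
```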