Tensor products of transformations

Let us now tie up linear transformations with the theory of tensor products. Let $U$ and $V$ be finite-dimensional vector spaces (over the same field), and let $A$ and $B$ be any two linear transformations on $U$ and $V$ respectively. We define a linear transformation $Q$ on the space $W$ of all bilinear forms on $U \oplus V$ by writing
$$(Qw)(x, y) = w(Ax, By).$$
The tensor product $A \otimes B$ of the transformations $A$ and $B$ is, by definition, the dual of the transformation $Q$, so that
$$[(A \otimes B)z](w) = z(Qw)$$
whenever $z$ is in $U \otimes V$ and $w$ is in $W$. If we apply $A \otimes B$ to an element of the form $x \otimes y$ (recall that this means that $(x \otimes y)(w) = w(x, y)$ for all $w$ in $W$), we obtain
$$[(A \otimes B)(x \otimes y)](w) = (x \otimes y)(Qw) = (Qw)(x, y) = w(Ax, By) = (Ax \otimes By)(w).$$
We infer that
$$(1) \qquad (A \otimes B)(x \otimes y) = Ax \otimes By.$$
Since there are quite a few elements in $U \otimes V$ of the form $x \otimes y$, enough at any rate to form a basis (see Section: Product bases), this relation characterizes $A \otimes B$.
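Relation (1) can be illustrated numerically. Anticipating the Kronecker products of matrices discussed later in this section, `np.kron` produces both the matrix of $A \otimes B$ and the coordinate vector of $x \otimes y$ when product bases are listed in lexicographic order; a sketch with randomly chosen $A$, $B$, $x$, $y$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2
A = rng.standard_normal((n, n))   # a transformation on U
B = rng.standard_normal((m, m))   # a transformation on V
x = rng.standard_normal(n)        # a vector in U
y = rng.standard_normal(m)        # a vector in V

# (A (x) B)(x (x) y) = Ax (x) By  -- the characterizing relation (1)
lhs = np.kron(A, B) @ np.kron(x, y)
rhs = np.kron(A @ x, B @ y)
assert np.allclose(lhs, rhs)
```

Since elements of the form $x \otimes y$ span $U \otimes V$, checking the relation on such elements checks it everywhere.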

The formal rules for operating with tensor products go as follows:
$$(2) \qquad A \otimes B = 0 \text{ if and only if } A = 0 \text{ or } B = 0,$$
$$(3) \qquad (A_1 + A_2) \otimes B = (A_1 \otimes B) + (A_2 \otimes B),$$
$$(4) \qquad A \otimes (B_1 + B_2) = (A \otimes B_1) + (A \otimes B_2),$$
$$(5) \qquad (\alpha A) \otimes (\beta B) = \alpha\beta\,(A \otimes B),$$
$$(6) \qquad 1 \otimes 1 = 1,$$
$$(7) \qquad (A \otimes B)^{-1} = A^{-1} \otimes B^{-1},$$
$$(8) \qquad (A_1 \otimes B_1)(A_2 \otimes B_2) = A_1A_2 \otimes B_1B_2.$$
The proofs of all these relations, except perhaps the last two, are straightforward.
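The straightforward rules, the distributive laws and the rule for scalars, can be spot-checked numerically on matrices via `np.kron` (a numerical illustration, not a proof):

```python
import numpy as np

rng = np.random.default_rng(3)
A1, A2 = rng.standard_normal((2, 3, 3))  # two transformations on U
B = rng.standard_normal((2, 2))          # a transformation on V
a, b = 2.0, -1.5                         # scalars

# tensoring is additive in each factor
assert np.allclose(np.kron(A1 + A2, B), np.kron(A1, B) + np.kron(A2, B))
# scalars pull out multiplicatively
assert np.allclose(np.kron(a * A1, b * B), a * b * np.kron(A1, B))
```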

Formula (7), like all formulas involving inverses, has to be read with caution. It is intended to mean that if both $A$ and $B$ are invertible, then so is $A \otimes B$, and the equation holds, and, conversely, that if $A \otimes B$ is invertible, then so also are $A$ and $B$. We shall prove (7) and (8) in reverse order.

Formula (8) follows from the characterization (1) of tensor products and the following computation:
$$(A_1 \otimes B_1)(A_2 \otimes B_2)(x \otimes y) = (A_1 \otimes B_1)(A_2x \otimes B_2y) = A_1A_2x \otimes B_1B_2y = (A_1A_2 \otimes B_1B_2)(x \otimes y).$$
As an immediate consequence of (8) we obtain
$$(9) \qquad A \otimes B = (A \otimes 1)(1 \otimes B) = (1 \otimes B)(A \otimes 1).$$
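Both the multiplicative rule (8) and the resulting factorization of $A \otimes B$ into commuting factors $A \otimes 1$ and $1 \otimes B$ can be checked on matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
A1, A2 = rng.standard_normal((2, 3, 3))  # transformations on U
B1, B2 = rng.standard_normal((2, 2, 2))  # transformations on V

# (8): (A1 (x) B1)(A2 (x) B2) = A1 A2 (x) B1 B2
assert np.allclose(np.kron(A1, B1) @ np.kron(A2, B2),
                   np.kron(A1 @ A2, B1 @ B2))

# consequence: A (x) B = (A (x) 1)(1 (x) B) = (1 (x) B)(A (x) 1)
I_U, I_V = np.eye(3), np.eye(2)
left = np.kron(A1, I_V) @ np.kron(I_U, B1)
right = np.kron(I_U, B1) @ np.kron(A1, I_V)
assert np.allclose(left, np.kron(A1, B1))
assert np.allclose(right, np.kron(A1, B1))
```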

To prove (7), suppose that $A$ and $B$ are invertible, and form $A^{-1} \otimes B^{-1}$. Since, by (8), the product of these two transformations, in either order, is $1 \otimes 1 = 1$, it follows that $A \otimes B$ is invertible and that (7) holds. Conversely, suppose that $A \otimes B$ is invertible. Remembering that we defined tensor products for finite-dimensional spaces only, we may invoke Section: Inverses, Theorem 2; it is sufficient to prove that $Ax = 0$ implies that $x = 0$ and $By = 0$ implies that $y = 0$. We use (1):
$$Ax \otimes By = (A \otimes B)(x \otimes y).$$
If either factor on the left is zero, then $(A \otimes B)(x \otimes y) = 0$, whence $x \otimes y = 0$, so that either $x = 0$ or $y = 0$. Since (by (2)) $B = 0$ is impossible, we may find a vector $y_0$ so that $By_0 \neq 0$. Applying the above argument to this $y_0$, with any $x$ for which $Ax = 0$, we conclude that $x = 0$, so that $A$ is invertible. The same argument with the roles of $A$ and $B$ interchanged proves that $B$ is invertible.
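The content of (7) can be illustrated numerically with `np.linalg.inv` (a sketch; the matrices are shifted toward the identity so that, for this seed, both factors are comfortably invertible):

```python
import numpy as np

rng = np.random.default_rng(2)
# make A and B invertible by shifting random matrices toward a multiple of 1
A = rng.standard_normal((3, 3)) + 3 * np.eye(3)
B = rng.standard_normal((2, 2)) + 3 * np.eye(2)

# (7): the inverse of A (x) B is A^{-1} (x) B^{-1}
inv_of_kron = np.linalg.inv(np.kron(A, B))
kron_of_inv = np.kron(np.linalg.inv(A), np.linalg.inv(B))
assert np.allclose(inv_of_kron, kron_of_inv)
```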

An interesting (and complicated) side of the theory of tensor products of transformations is the theory of Kronecker products of matrices. Let $\{x_1, \dots, x_n\}$ and $\{y_1, \dots, y_m\}$ be bases in $U$ and $V$, and let $[A] = (\alpha_{ij})$ and $[B] = (\beta_{pq})$ be the matrices of $A$ and $B$ in those bases. What is the matrix of $A \otimes B$ in the coordinate system $\{x_i \otimes y_p\}$?

To answer the question, we must recall the discussion in Section: Matrices concerning the arrangement of a basis in a linear order. Since, unfortunately, it is impossible to write down a matrix without being committed to an order of the rows and the columns, we shall be frank about it, and arrange the $nm$ vectors $x_i \otimes y_p$ in the so-called lexicographical order, as follows:
$$x_1 \otimes y_1,\ \dots,\ x_1 \otimes y_m,\ x_2 \otimes y_1,\ \dots,\ x_2 \otimes y_m,\ \dots,\ x_n \otimes y_1,\ \dots,\ x_n \otimes y_m.$$
We proceed also to carry out the following computation:
$$(A \otimes B)(x_i \otimes y_p) = Ax_i \otimes By_p = \Big(\sum_j \alpha_{ji} x_j\Big) \otimes \Big(\sum_q \beta_{qp} y_q\Big) = \sum_j \sum_q \alpha_{ji}\beta_{qp}\,(x_j \otimes y_q).$$
This process indicates exactly how far we can get without ordering the basis elements; if, for example, we agree to index the elements of a matrix not with a pair of integers but with a pair of pairs, say $(j, q)$ and $(i, p)$, then we know now that the element in the $(j, q)$ row and the $(i, p)$ column is $\alpha_{ji}\beta_{qp}$. If we use the lexicographical ordering, the matrix of $A \otimes B$ has the form
$$\begin{bmatrix} \alpha_{11}[B] & \alpha_{12}[B] & \cdots & \alpha_{1n}[B] \\ \alpha_{21}[B] & \alpha_{22}[B] & \cdots & \alpha_{2n}[B] \\ \vdots & \vdots & & \vdots \\ \alpha_{n1}[B] & \alpha_{n2}[B] & \cdots & \alpha_{nn}[B] \end{bmatrix}.$$
In a condensed notation whose meaning is clear we may write this matrix as
$$[A \otimes B] = (\alpha_{ij}[B]).$$

This matrix is known as the Kronecker product of $[A]$ and $[B]$, in that order. The rule for forming it is easy to describe in words: replace each element $\alpha_{ij}$ of the $n$-by-$n$ matrix $[A]$ by the $m$-by-$m$ matrix $\alpha_{ij}[B]$. If in this rule we interchange the roles of $[A]$ and $[B]$ (and consequently interchange $n$ and $m$) we obtain the definition of the Kronecker product of $[B]$ and $[A]$.
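The rule just described is exactly what `np.kron` computes; a small sketch with concrete 2-by-2 matrices makes the block structure visible:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 5],
              [6, 7]])

# Kronecker product of [A] and [B]: replace each entry a_ij by the block a_ij * B
K = np.kron(A, B)
expected = np.block([[1 * B, 2 * B],
                     [3 * B, 4 * B]])
assert np.array_equal(K, expected)

# interchanging the roles gives the Kronecker product of [B] and [A],
# which is in general a different matrix
assert not np.array_equal(np.kron(A, B), np.kron(B, A))
```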

EXERCISES

Exercise 1. We know that the tensor product of $\mathcal{P}_n$ and $\mathcal{P}_m$ may be identified with the space of polynomials in two variables $s$ and $t$ (see Section: Product bases, Ex. 2). Prove that if $D$ and $E$ are differentiation on $\mathcal{P}_n$ and $\mathcal{P}_m$ respectively, and if $F = D \otimes E$, then $F$ is mixed partial differentiation, that is, if $z$ is in $\mathcal{P}_n \otimes \mathcal{P}_m$, then $(Fz)(s, t) = \partial^2 z(s, t)/\partial s\,\partial t$.
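A finite-dimensional sketch of this exercise, assuming polynomials of degree less than $n$ (respectively $m$) are identified with their coefficient vectors in the bases $\{1, s, s^2, \dots\}$ and $\{1, t, \dots\}$, with the product basis taken in lexicographic order:

```python
import numpy as np

n, m = 3, 2  # degree < n in s, degree < m in t

# differentiation matrices: D sends s^k to k s^(k-1), likewise E in t
D = np.diag(np.arange(1, n), k=1).astype(float)
E = np.diag(np.arange(1, m), k=1).astype(float)

# z(s, t) = s^2 t, stored as the coefficient matrix C[i, p] of s^i t^p,
# flattened row-major (the lexicographic order of the product basis)
C = np.zeros((n, m))
C[2, 1] = 1.0

w = np.kron(D, E) @ C.flatten()                    # (D (x) E) applied to z
assert np.allclose(w.reshape(n, m), D @ C @ E.T)   # same as acting factorwise

# the mixed partial of s^2 t is 2s: coefficient 2 on s^1 t^0
expected = np.zeros((n, m))
expected[1, 0] = 2.0
assert np.allclose(w.reshape(n, m), expected)
```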

Exercise 2. With the lexicographic ordering of the product basis it turned out that the matrix of $A \otimes B$ is the Kronecker product of the matrices of $A$ and $B$. Is there an arrangement of the basis vectors $x_i \otimes y_p$ such that the matrix of $A \otimes B$, referred to the coordinate system so arranged, is the Kronecker product of the matrices of $B$ and $A$ (in that order)?

Exercise 3. If $A$ and $B$ are linear transformations, then