Let us now pick up the loose threads; having introduced the new concept of linear transformation, we must now find out what it has to do with the old concepts of bases, linear functionals, etc.
One of the most important tools in the study of linear transformations on finite-dimensional vector spaces is the concept of a matrix. Since this concept usually has no decent analogue in infinite-dimensional spaces, and since it is possible in most considerations to do without it, we shall try not to use it in proving theorems. It is, however, important to know what a matrix is; we enter now into the detailed discussion.
Definition 1. Let $\{x_1, \dots, x_n\}$ be a basis of an $n$-dimensional vector space $V$, and let $A$ be a linear transformation on $V$. Since every vector is a linear combination of the $x_i$, we have in particular

$$Ax_j = \sum_i \alpha_{ij} x_i, \qquad j = 1, \dots, n.$$

The set $(\alpha_{ij})$ of $n^2$ scalars, indexed by the double subscript $i, j$, is the matrix of $A$ in the coordinate system $\{x_1, \dots, x_n\}$.
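The definition can be made concrete with a short computational sketch. The snippet below (the helper names `solve2` and `matrix_of` are illustrative, not from the text) builds the matrix of a linear transformation on a two-dimensional space relative to a chosen basis, by expressing each $Ax_j$ as a combination of the basis vectors; the resulting coefficients form the $j$-th column.

```python
# Sketch: the matrix (alpha_ij) of a linear transformation A relative to
# a basis {x_1, x_2} of R^2.  Writing A x_j = sum_i alpha_ij x_i, the
# coefficients alpha_1j, alpha_2j form the j-th column of the matrix.
# Helper names here are illustrative, not from the text.

def solve2(b1, b2, v):
    """Solve c1*b1 + c2*b2 = v for (c1, c2) by Cramer's rule (2x2)."""
    det = b1[0] * b2[1] - b2[0] * b1[1]
    c1 = (v[0] * b2[1] - b2[0] * v[1]) / det
    c2 = (b1[0] * v[1] - v[0] * b1[1]) / det
    return c1, c2

def matrix_of(A, basis):
    """Matrix of A relative to `basis`, as a list of rows."""
    b1, b2 = basis
    cols = [solve2(b1, b2, A(bj)) for bj in basis]   # column j from A x_j
    return [[cols[j][i] for j in range(2)] for i in range(2)]  # transpose

# Example: A doubles the first coordinate and negates the second.
A = lambda v: (2 * v[0], -v[1])

# In the standard basis the matrix of this A is diagonal.
print(matrix_of(A, [(1, 0), (0, 1)]))   # [[2.0, 0.0], [0.0, -1.0]]
```

Changing the basis changes the array of scalars, which is the point of tying the matrix to a coordinate system rather than to the transformation alone.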
This definition does not define "matrix"; it defines "the matrix associated under certain conditions with a linear transformation." It is often useful to consider a matrix as something existing in its own right as a square array of scalars; in general, however, a matrix in this book will be tied up with a linear transformation and a basis.
We comment on notation. It is customary to use the same symbol, say $A$, for a linear transformation and for its matrix in a fixed coordinate system. Although this practice blurs the distinction between a transformation and an array of scalars, it is convenient, and it is harmless so long as the basis is held fixed and kept firmly in mind.
We call attention also to a peculiarity of the indexing of the elements $\alpha_{ij}$ of a matrix: the first index labels the row and the second the column, so that the coefficients $\alpha_{1j}, \dots, \alpha_{nj}$ expressing $Ax_j$ form the $j$-th column of the matrix. Observe, moreover, that the definition tacitly presupposes an ordering of the basis vectors $x_1, \dots, x_n$; strictly speaking, a basis is merely a set of vectors, and a set carries no ordering, whereas the rows and columns of a matrix must be written down in a definite order.
Everything we shall say about matrices can, accordingly, be interpreted from two different points of view; either in strict accordance with the letter of our definition, or else following a modified definition which makes correspond a matrix (with ordered rows and columns) not merely to a linear transformation and a basis, but also to an ordering of the basis.
One more word to those in the know. It is a perversity not of the author, but of nature, that makes us write

$$Ax_j = \sum_i \alpha_{ij} x_i \tag{1}$$

instead of the more usual equation

$$Ax_i = \sum_j \alpha_{ij} x_j.$$
The reason is that we want the formulas for matrix multiplication and for the application of matrices to numerical vectors (that is, vectors given by their coordinates in the chosen basis) to come out in the customary form. If $x = \sum_j \xi_j x_j$, then linearity gives $Ax = \sum_j \xi_j Ax_j = \sum_i \bigl(\sum_j \alpha_{ij} \xi_j\bigr) x_i$, so that the coordinates $\eta_i$ of $Ax$ are obtained by the usual row-by-column rule

$$\eta_i = \sum_j \alpha_{ij} \xi_j. \tag{2}$$
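The row-by-column rule for applying a matrix to a numerical vector can be checked in a few lines. In this sketch (the name `apply_matrix` is my own, not the text's), the coordinates of $Ax$ are computed from the matrix and the coordinates of $x$, and agree with applying the transformation directly.

```python
# Sketch (illustrative names): if x = sum_j xi_j x_j, then the coordinates
# eta of Ax relative to the same basis satisfy eta_i = sum_j alpha_ij xi_j,
# the usual row-by-column rule.

def apply_matrix(alpha, xi):
    """eta_i = sum_j alpha[i][j] * xi[j]."""
    return [sum(alpha[i][j] * xi[j] for j in range(len(xi)))
            for i in range(len(alpha))]

# Matrix of the map (u, v) -> (2u, -v) in the standard basis of R^2:
alpha = [[2, 0], [0, -1]]

# Coordinates (3, 5) map to (6, -5), the same vector obtained by applying
# the transformation itself to (3, 5).
print(apply_matrix(alpha, [3, 5]))   # [6, -5]
```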
For an example we consider the differentiation transformation $D$ on the space of polynomials of degree $\leq n - 1$, with the basis defined by $x_j(t) = t^{j-1}$, $j = 1, \dots, n$. Since $Dx_j = (j-1)x_{j-1}$ (and $Dx_1 = 0$), the matrix of $D$ has $\alpha_{ij} = j - 1$ when $i = j - 1$ and $\alpha_{ij} = 0$ otherwise.
The unpleasant phenomenon of indices turning around is seen by comparing (1) and (2).
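The differentiation example can be worked out concretely. The sketch below (helper names are mine, not the text's) builds the matrix of differentiation on polynomials of degree less than 4 relative to the basis $x_j(t) = t^{j-1}$, and applies it to a coefficient vector; the result is the coefficient vector of the derivative.

```python
# Sketch: the matrix of differentiation D on polynomials of degree < 4,
# relative to the basis x_j(t) = t**(j-1).  Since D x_j = (j-1) x_{j-1},
# column j of the matrix has its single nonzero entry j-1 in row j-1.
# Helper names are illustrative, not from the text.

n = 4  # dimension: basis 1, t, t^2, t^3

# Zero-indexed: d/dt t^k = k * t^(k-1), so D[k-1][k] = k.
D = [[0] * n for _ in range(n)]
for k in range(1, n):
    D[k - 1][k] = k

def apply_matrix(m, coeffs):
    """Row-by-column rule: eta_i = sum_j m[i][j] * coeffs[j]."""
    return [sum(m[i][j] * coeffs[j] for j in range(n)) for i in range(n)]

# p(t) = 1 + 2t + 3t^2 + 4t^3  has derivative  p'(t) = 2 + 6t + 12t^2.
print(apply_matrix(D, [1, 2, 3, 4]))   # [2, 6, 12, 0]
```

Reading off the columns of `D` shows the index reversal in action: the coefficients describing the image of the $j$-th basis vector run down the $j$-th column, not along a row.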