matrix
A matrix is simply a mapping $a\colon A\times B\to C$ of the product of two sets into some third set. As a rule, though, the word matrix and the notation associated with it are used only in connection with linear mappings. In such cases $C$ is the ring or field of scalars.
Matrix of a linear mapping
Definition: Let $V$ and $W$ be finite-dimensional vector spaces over the same field $k$, with bases $(e_i)_{i\in I}$ and $(f_j)_{j\in J}$ respectively, and let $T\colon V\to W$ be a linear mapping. For each $i\in I$ let $(a_{ij})_{j\in J}$ be the unique family of scalars (elements of $k$) such that

$$T(e_i)=\sum_{j\in J}a_{ij}f_j\;.$$

Then the family $(a_{ij})_{(i,j)\in I\times J}$ (or equivalently the mapping $I\times J\to k$) is called the matrix of $T$ with respect to the given bases $(e_i)$ and $(f_j)$. The scalars $a_{ij}$ are called the components of the matrix. If $I$ and $J$ have $m$ and $n$ elements respectively, the matrix is said to be of size $m$-by-$n$, or simply an $m\times n$ matrix.
The matrix describes the function $T$ completely; for any element $x=\sum_{i\in I}x_ie_i$ of $V$, we have

$$T(x)=\sum_{j\in J}\Bigl(\sum_{i\in I}x_ia_{ij}\Bigr)f_j$$

as is readily verified.
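The component formula above can be checked numerically. Here is a minimal sketch in plain Python, following the convention used in this article that the rows of the matrix are indexed by the basis of the domain; all names and values are illustrative.

```python
# Illustrates how the matrix (a_ij) determines T completely:
# with T(e_i) = sum_j a_ij f_j, the image of x = sum_i x_i e_i
# has components y_j = sum_i x_i a_ij.

def apply_matrix(a, x):
    """Components of T(x), given the matrix a (rows indexed by the
    domain basis) and the components x of an element of the domain."""
    n_rows, n_cols = len(a), len(a[0])
    assert len(x) == n_rows
    return [sum(x[i] * a[i][j] for i in range(n_rows)) for j in range(n_cols)]

# A 3-by-2 matrix: T maps a 3-dimensional space to a 2-dimensional one.
a = [[1, 2],
     [0, 1],
     [4, 0]]
x = [1, 1, 2]
print(apply_matrix(a, x))  # [1*1 + 1*0 + 2*4, 1*2 + 1*1 + 2*0] = [9, 3]
```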
Any two linear mappings $V\to W$ have a sum, defined pointwise; it is easy to verify that the matrix of the sum is the sum, componentwise, of the two given matrices.
The formalism of matrices extends somewhat to linear mappings between modules, i.e. to the case where the scalars form a ring $k$, not necessarily commutative, rather than just a field.
Rows and columns; product of two matrices
Suppose we are given three modules $U$, $V$, $W$, with bases $(d_h)_{h\in H}$, $(e_i)_{i\in I}$, $(f_j)_{j\in J}$ respectively, and two linear mappings $S\colon U\to V$ and $T\colon V\to W$. $S$ and $T$ have some matrices $B=(b_{hi})$ and $A=(a_{ij})$ with respect to those bases. The product matrix $BA$ is defined as the matrix $(c_{hj})$ of the function

$$T\circ S\colon U\to W$$

with respect to the bases $(d_h)$ and $(f_j)$. Straight from the definitions of a linear mapping and a basis, one verifies that

$$c_{hj}=\sum_{i\in I}b_{hi}a_{ij}\qquad(1)$$

for all $h\in H$ and $j\in J$.
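Formula (1) can be verified on a small example: applying the two mappings in succession to a row vector gives the same components as applying the product matrix directly. A minimal Python sketch, with illustrative names and values:

```python
# Checks formula (1): with S: U -> V and T: V -> W having matrices
# b and a (rows indexed by the domain basis, as in this article),
# the composite T∘S has matrix c with c[h][j] = sum_i b[h][i]*a[i][j].

def mat_mul(b, a):
    """Product matrix, computed 'rows by columns' as in (1)."""
    rows, mid, cols = len(b), len(a), len(a[0])
    assert all(len(row) == mid for row in b)
    return [[sum(b[h][i] * a[i][j] for i in range(mid)) for j in range(cols)]
            for h in range(rows)]

def apply_matrix(a, x):
    """Components of the image of the row vector x."""
    return [sum(x[i] * a[i][j] for i in range(len(a))) for j in range(len(a[0]))]

b = [[1, 0, 2],
     [3, 1, 0]]      # 2-by-3: S from a 2-dimensional space to a 3-dimensional one
a = [[1, 1],
     [0, 2],
     [5, 0]]         # 3-by-2: T from the 3-dimensional space to a 2-dimensional one

x = [1, 2]
# Applying S then T agrees with applying the product matrix directly.
print(apply_matrix(a, apply_matrix(b, x)) == apply_matrix(mat_mul(b, a), x))  # True
```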
To illustrate the notation of matrices in terms of rows and columns, suppose the spaces $U$, $V$, $W$ have dimensions 2, 3, and 2 respectively, and bases

$$(d_1,d_2)\qquad(e_1,e_2,e_3)\qquad(f_1,f_2)\;.$$

We write

$$BA=\begin{pmatrix}b_{11}&b_{12}&b_{13}\\ b_{21}&b_{22}&b_{23}\end{pmatrix}\begin{pmatrix}a_{11}&a_{12}\\ a_{21}&a_{22}\\ a_{31}&a_{32}\end{pmatrix}=\begin{pmatrix}c_{11}&c_{12}\\ c_{21}&c_{22}\end{pmatrix}\;.$$

(Notice that we have taken a liberty with the notation, by writing e.g. $b_{12}$ instead of $b_{d_1e_2}$.) The equation (1) shows that the multiplication of two matrices proceeds “rows by columns”. Also, in an expression such as $b_{12}$, the first index refers to the row, and the second to the column, in which that component appears.
Similar notation can describe the calculation of $T(x)$ whenever $T$ is a linear mapping. For example, if $T\colon V\to W$ is linear, and $x=x_1e_1+x_2e_2+x_3e_3$ and $T(x)=y_1f_1+y_2f_2$, we write

$$\begin{pmatrix}y_1&y_2\end{pmatrix}=\begin{pmatrix}x_1&x_2&x_3\end{pmatrix}\begin{pmatrix}a_{11}&a_{12}\\ a_{21}&a_{22}\\ a_{31}&a_{32}\end{pmatrix}\;.$$
When, as above, a “row vector” denotes an element of a space, a “column vector” denotes an element of the dual space. If, say, $S\colon W^{*}\to V^{*}$ is the transpose of $T\colon V\to W$, then, with respect to the bases dual to $(e_1,e_2,e_3)$ and $(f_1,f_2)$, an equation $\mu=S(\lambda)$ may be written

$$\begin{pmatrix}\mu_1\\ \mu_2\\ \mu_3\end{pmatrix}=\begin{pmatrix}a_{11}&a_{12}\\ a_{21}&a_{22}\\ a_{31}&a_{32}\end{pmatrix}\begin{pmatrix}\lambda_1\\ \lambda_2\end{pmatrix}\;.$$
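This duality — the same matrix multiplying row vectors (elements) on the right and column vectors (functionals) on the left — can be sketched in Python; names and values are illustrative.

```python
# The same matrix a serves both T (row vectors, multiplied on the right)
# and its transpose (column vectors, multiplied on the left).

def apply_matrix(a, x):        # row vector x  ->  row vector x*a
    return [sum(x[i] * a[i][j] for i in range(len(a))) for j in range(len(a[0]))]

def transpose_apply(a, lam):   # column vector lam  ->  column vector a*lam
    return [sum(a[i][j] * lam[j] for j in range(len(a[0]))) for i in range(len(a))]

a = [[1, 2],
     [0, 1],
     [4, 0]]      # 3-by-2 matrix of some T
x = [1, 1, 2]     # an element of the 3-dimensional domain
lam = [1, 3]      # a functional on the 2-dimensional codomain

dot = lambda u, v: sum(p * q for p, q in zip(u, v))
# lam applied to T(x) equals (transpose of T)(lam) applied to x:
print(dot(apply_matrix(a, x), lam) == dot(x, transpose_apply(a, lam)))  # True
```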
One more illustration: Given a bilinear form $\langle\cdot,\cdot\rangle\colon V\times W\to k$, we can denote $\langle x,y\rangle$ by

$$\begin{pmatrix}x_1&x_2&x_3\end{pmatrix}\begin{pmatrix}a_{11}&a_{12}\\ a_{21}&a_{22}\\ a_{31}&a_{32}\end{pmatrix}\begin{pmatrix}y_1\\ y_2\end{pmatrix}$$

where $a_{ij}=\langle e_i,f_j\rangle$.
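The display above is just a recipe for computing a single scalar. A minimal Python sketch, with illustrative values:

```python
# The bilinear form <x, y> as the 1-by-1 product (row x) * a * (column y),
# i.e. the double sum over the components of the matrix.

def bilinear(a, x, y):
    return sum(x[i] * a[i][j] * y[j]
               for i in range(len(a)) for j in range(len(a[0])))

a = [[1, 0],
     [2, 1],
     [0, 3]]   # a_ij = <e_i, f_j> for some bilinear form (illustrative)
x = [1, 1, 1]
y = [2, 1]
print(bilinear(a, x, y))  # 2 + 0 + 4 + 1 + 0 + 3 = 10
```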
Square matrix
A matrix $(a_{ij})_{(i,j)\in I\times J}$ is called square if $I=J$, or if some bijection $I\to J$ is implicit in the context. (It is not enough for $I$ and $J$ to be equipotent.) Square matrices naturally arise in connection with a linear mapping of a space into itself (called an endomorphism), and in the related case of a change of basis (from one basis of some space to another basis of the same space). When $I$ is finite of cardinality $n$ (and thus, so is $J$), then $n$ is often called the order of the matrix. Unfortunately, equally often “order” means the order (http://planetmath.org/OrderGroup) of the matrix as an element of the group (http://planetmath.org/GeneralLinearGroup) $GL_n(k)$.
Miscellaneous usages of “matrix”
The word matrix has come into use in some areas where linear mappings are not at issue. An example would be a combinatorial statement, such as Hall’s marriage theorem, phrased in terms of “0-1 matrices” instead of subsets of a product $A\times B$ of two finite sets.
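The correspondence between 0-1 matrices and subsets of a product of two finite sets is straightforward; a short Python sketch (illustrative sets and values):

```python
# A "0-1 matrix" is just another notation for a subset of a product
# A x B: the entry m[i][j] is 1 exactly when (A[i], B[j]) is in the subset.

A = ["a", "b"]
B = ["x", "y", "z"]
subset = {("a", "x"), ("a", "z"), ("b", "y")}

m = [[1 if (p, q) in subset else 0 for q in B] for p in A]
print(m)  # [[1, 0, 1], [0, 1, 0]]

# Recover the subset from the matrix:
back = {(A[i], B[j]) for i in range(len(A)) for j in range(len(B)) if m[i][j]}
print(back == subset)  # True
```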
Remark
Matrices are heavily used in the physical sciences, engineering, statistics, and computer programming. But for purely mathematical purposes, they are less important than one might expect, and indeed are frequently irrelevant in linear algebra. Linear mappings, determinants, traces, transposes, and a number of other simple notions can and should be defined without matrices, simply because they have a meaning independent of any basis or bases. Many little theorems in linear algebra can be proved in a simpler and more enlightening way without matrices than with them. One more illustration: The derivative (at a point) of a mapping from one surface to another is a linear mapping; it is not a matrix of partial derivatives, because the matrix depends on a choice of basis but the derivative does not.