We construct new representations from old in various ways, using linear algebra. We continue to work over $\mathbb{C}$ throughout, though many of these constructions work over any field.
Recall that we earlier defined, for any $G$-representations $V$ and $W$, a $G$-representation on the space $\operatorname{Hom}(V, W)$ of linear maps from $V$ to $W$. It has character $\overline{\chi_V}\chi_W$.
If $V$ is any vector space, then the dual space of $V$ is
$$V^* = \operatorname{Hom}(V, \mathbb{C}).$$
We have $\dim V^* = \dim V$. To see this, given a basis $e_1, \dots, e_n$ of $V$, we have the dual basis $e_1^*, \dots, e_n^*$ of $V^*$ given by
$$e_i^*(e_j) = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{if } i \neq j. \end{cases}$$
There is a bilinear map $V^* \times V \to \mathbb{C}$, $(f, v) \mapsto f(v)$. The choice of basis identifies $V$ with $\mathbb{C}^n$ (as column vectors) and then the dual basis realizes $V^*$ as $1 \times n$ matrices (row vectors), with the above pairing being the usual matrix/dot product.
If $V$ has a $G$-representation $\rho$, then we take $\mathbb{C}$ to have the trivial representation and get an action $\rho^*$ of $G$ on $V^* = \operatorname{Hom}(V, \mathbb{C})$ defined by $(\rho^*(g)f)(v) = f(\rho(g)^{-1}v)$. From the formula for the character of $\operatorname{Hom}(V, W)$, we see
$$\chi_{V^*} = \overline{\chi_V}.$$
If the matrix of $\rho(g)$ with respect to some basis is $A$, then the matrix of $\rho^*(g)$ with respect to the dual basis is $(A^T)^{-1}$, the inverse of the transpose of $A$.
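These facts can be checked numerically. The following numpy sketch (with an arbitrary sample matrix standing in for $\rho(g)$) verifies that the inverse transpose has the right character and that the pairing $f(v)$ is preserved by the dual action:

```python
import numpy as np

# Hypothetical sample matrix standing in for rho(g) in some basis.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

# Matrix of the dual representation rho*(g) in the dual basis:
# the inverse of the transpose of A.
D = np.linalg.inv(A.T)

# Character of the dual: tr(D) = tr(A^{-1}) = chi_V(g^{-1}) = conj(chi_V(g))
# for a unitarisable representation.
assert np.isclose(np.trace(D), np.trace(np.linalg.inv(A)))

# The pairing f(v) is G-invariant: (rho*(g) f)(rho(g) v) = f(v).
f = np.array([1.0, -1.0])   # a row vector, i.e. an element of V*
v = np.array([2.0, 5.0])    # a vector in V
assert np.isclose((f @ np.linalg.inv(A)) @ (A @ v), f @ v)
```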
Let $V, W$ be two vector spaces. Then the tensor product
$$V \otimes W$$
is the $\mathbb{C}$-vector space generated by the symbols $v \otimes w$ for $v \in V$ and $w \in W$, with the “bilinear” relations
$$(v_1 + v_2) \otimes w = v_1 \otimes w + v_2 \otimes w, \qquad v \otimes (w_1 + w_2) = v \otimes w_1 + v \otimes w_2,$$
$$(\lambda v) \otimes w = \lambda (v \otimes w) = v \otimes (\lambda w)$$
for all $v, v_1, v_2 \in V$, $w, w_1, w_2 \in W$ and $\lambda \in \mathbb{C}$.
Rigorously, we are taking the (infinite-dimensional) vector space with a basis element for every $v \in V$ and $w \in W$ and then forming its quotient by the (infinite-dimensional) subspace generated by vectors of the form $(v_1 + v_2) \otimes w - v_1 \otimes w - v_2 \otimes w$ and by similar expressions corresponding to the other relations; this quotient is then finite-dimensional, as the proposition below shows.
Let $e_1, \dots, e_m$ be a basis of $V$ and $f_1, \dots, f_n$ be a basis of $W$. Then
$$\{ e_i \otimes f_j : 1 \le i \le m,\ 1 \le j \le n \}$$
is a basis of $V \otimes W$.
Omitted.∎
In particular, the dimension of the tensor product is the product of the dimensions of the vector spaces:
$$\dim(V \otimes W) = \dim V \cdot \dim W.$$
Contrast the direct sum $V \oplus W$, which has dimension the sum of the dimensions of the vector spaces.
It is not true that every vector in $V \otimes W$ is of the form $v \otimes w$. For example, if $V$ is two-dimensional with basis $e_1, e_2$ then $e_1 \otimes e_2 + e_2 \otimes e_1 \in V \otimes V$ cannot be written in this form.
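One way to see this concretely: identifying $V \otimes V$ with $2 \times 2$ matrices (via $e_i \otimes e_j \leftrightarrow E_{ij}$), a pure tensor $v \otimes w$ corresponds to the outer product $vw^T$, which has rank at most one. A short numpy check:

```python
import numpy as np

# Identify V ⊗ V (dim V = 2) with 2x2 matrices: e_i ⊗ e_j <-> E_ij.
# A pure tensor v ⊗ w corresponds to the outer product v w^T, rank <= 1.
T = np.array([[0, 1],
              [1, 0]])   # the element e_1 ⊗ e_2 + e_2 ⊗ e_1

# T has rank 2, so it is not an outer product, i.e. not a pure tensor.
assert np.linalg.matrix_rank(T) == 2

# Whereas any outer product has rank at most 1:
v, w = np.array([1, 2]), np.array([3, -1])
assert np.linalg.matrix_rank(np.outer(v, w)) <= 1
```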
The next remark is non-examinable.
Tensor products can be difficult to get used to. Perhaps the most important principle for understanding them is the following:
A linear map $V \otimes W \to U$ is the same as a bilinear map $V \times W \to U$.
The dictionary is as follows: given a linear map $T : V \otimes W \to U$, we define a bilinear map $B : V \times W \to U$ by sending $(v, w)$ to $T(v \otimes w)$. The bilinearity is then a consequence of the relations that hold in the tensor product. Conversely, given a bilinear map $B : V \times W \to U$, define a linear map $T : V \otimes W \to U$ by sending $v \otimes w$ to $B(v, w)$. We have to check that this is well-defined, i.e. that the bilinear relations are respected, and this is equivalent to the bilinearity of $B$.
This all seems very abstract, but it is useful: defining bilinear maps is ‘easy’! In fact, tensor products ‘in real life’ often arise in situations where you have a number that depends on the choice of two vectors (in $V$ and $W$, say); this dependency can then be expressed as a linear map $V \otimes W \to \mathbb{C}$.
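The dictionary can be made concrete in coordinates. Below is a small numpy sketch (with a hypothetical matrix $M$ defining the bilinear form): a bilinear map $B(v, w) = v^T M w$ corresponds to the linear functional on $V \otimes W$ whose coefficients in the basis $e_i \otimes f_j$ are the entries $M_{ij}$.

```python
import numpy as np

# A bilinear map B : V x W -> C given by a hypothetical sample matrix M.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = lambda v, w: v @ M @ w

# The corresponding linear map T : V ⊗ W -> C, acting on coordinates in the
# basis e_i ⊗ f_j; in that basis, v ⊗ w has coordinate array outer(v, w).
T = lambda t: np.sum(M * t)

v, w = np.array([1.0, -1.0]), np.array([2.0, 5.0])
assert np.isclose(T(np.outer(v, w)), B(v, w))
```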
All that said, for this course it will not be important to have such a deep theoretical understanding of tensor products as long as you are able to do calculations with them and are willing to take Proposition 2.39 on trust.
Naturally, we can consider tensor products $V_1 \otimes \cdots \otimes V_k$ with more factors, with linearity in each slot.
Now, if $V$ and $W$ are both representations of $G$ then $V \otimes W$ becomes a representation via
$$g \cdot (v \otimes w) = (g \cdot v) \otimes (g \cdot w).$$
We also write $\rho_V \otimes \rho_W$ for this representation. If we have bases $e_1, \dots, e_m$ of $V$ and $f_1, \dots, f_n$ of $W$, with respect to which the matrices of $g$ acting on $V$ and $W$ are $A = (a_{ij})$ and $B$, and we order the resulting basis of $V \otimes W$ as
$$e_1 \otimes f_1, \dots, e_1 \otimes f_n,\ e_2 \otimes f_1, \dots, e_2 \otimes f_n,\ \dots,\ e_m \otimes f_1, \dots, e_m \otimes f_n,$$
then the matrix of $g$ on $V \otimes W$ is the Kronecker product $A \otimes B$, where this is the block matrix
$$\begin{pmatrix} a_{11}B & \cdots & a_{1m}B \\ \vdots & & \vdots \\ a_{m1}B & \cdots & a_{mm}B \end{pmatrix}.$$
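The Kronecker product is implemented in numpy as `np.kron`, so the block structure (and the character formula of the next proposition, since $\operatorname{tr}(A \otimes B) = \operatorname{tr}(A)\operatorname{tr}(B)$) can be checked directly on sample matrices:

```python
import numpy as np

# Hypothetical matrices of g on V and W in chosen bases.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# The matrix of g on V ⊗ W is the Kronecker (block) product.
K = np.kron(A, B)
assert K.shape == (4, 4)

# Its top-left block is a_11 * B:
assert np.allclose(K[:2, :2], A[0, 0] * B)

# And tr(A ⊗ B) = tr(A) tr(B) — the character of the tensor product.
assert np.isclose(np.trace(K), np.trace(A) * np.trace(B))
```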
If $V$ and $W$ have characters $\chi_V$ and $\chi_W$, then
$$\chi_{V \otimes W} = \chi_V \chi_W.$$
Let $g \in G$ and choose bases $e_1, \dots, e_m$ of $V$ and $f_1, \dots, f_n$ of $W$ such that $\rho_V(g)e_i = \lambda_i e_i$ and $\rho_W(g)f_j = \mu_j f_j$. Then
$$g \cdot (e_i \otimes f_j) = \lambda_i \mu_j\, e_i \otimes f_j,$$
so with respect to the basis $(e_i \otimes f_j)$ of $V \otimes W$, $g$ acts diagonally with entries $\lambda_i \mu_j$. So
$$\chi_{V \otimes W}(g) = \sum_{i,j} \lambda_i \mu_j = \Big( \sum_i \lambda_i \Big)\Big( \sum_j \mu_j \Big) = \chi_V(g)\,\chi_W(g). \qquad \blacksquare$$
The tensor product generalises the ‘twisting’ construction from earlier. If $V$ is any vector space then $\mathbb{C} \otimes V$ is isomorphic to $V$ via the map $\lambda \otimes v \mapsto \lambda v$. If $W$ is a $1$-dimensional representation and $V$ is any representation of $G$, then $W \otimes V$ is a representation acting on $\mathbb{C} \otimes V \cong V$. We have
$$g \cdot v = \chi_W(g)\,\rho_V(g)v$$
via the above isomorphism, so that
$$\chi_{W \otimes V} = \chi_W \chi_V.$$
Note that if $V$ is irreducible, then so is $W \otimes V$. Furthermore, $W \otimes V$ might or might not be isomorphic to $V$.
Let $V$ and $W$ be two finite-dimensional representations of a group $G$. Then
$$\operatorname{Hom}(V, W) \cong V^* \otimes W$$
as $G$-modules.
The best way to prove this is to show that the map $V^* \otimes W \to \operatorname{Hom}(V, W)$, $f \otimes w \mapsto \alpha_{f,w}$, where $\alpha_{f,w}(v) = f(v)\,w$, is a $G$-isomorphism. This is straightforward but a bit technical.
Another proof, which works in our situation, is simply to observe that both sides have character $\overline{\chi_V}\chi_W$. ∎
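In coordinates, $g$ acts on $\operatorname{Hom}(V, W)$ by $\varphi \mapsto B \varphi A^{-1}$, where $A, B$ are the matrices of $g$ on $V$ and $W$; flattening $\varphi$ to a vector, this operator is $B \otimes (A^{-1})^T$, which is exactly the matrix of $g$ on $W \otimes V^*$. A numpy sketch with hypothetical sample matrices (taking $A$ unitary so that $\operatorname{tr}(A^{-1}) = \overline{\operatorname{tr}(A)}$):

```python
import numpy as np

theta = 2 * np.pi / 5
A = np.array([[np.exp(1j * theta), 0],
              [0, np.exp(-1j * theta)]])        # unitary matrix of g on V
B = np.array([[1.0, 1.0],
              [0.0, 2.0]]).astype(complex)      # sample matrix of g on W

# The operator phi |-> B phi A^{-1} on row-major-flattened matrices:
op = np.kron(B, np.linalg.inv(A).T)

# Check it really implements phi |-> B phi A^{-1} on a sample phi:
phi = np.arange(4, dtype=complex).reshape(2, 2)
assert np.allclose((op @ phi.reshape(-1)).reshape(2, 2),
                   B @ phi @ np.linalg.inv(A))

# Character: tr(op) = tr(B) tr(A^{-1}) = chi_W(g) * conj(chi_V(g)).
assert np.isclose(np.trace(op), np.trace(B) * np.conj(np.trace(A)))
```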
The symmetric square $S^2V$ of $V$ is the vector space spanned by symbols $vw$ (for $v, w \in V$) subject to the bilinear relations above and, additionally,
$$vw = wv$$
for all $v, w \in V$.
Formally, $S^2V$ is the quotient of $V \otimes V$ by the subspace spanned by all elements of the form $v \otimes w - w \otimes v$, and then $vw$ is the image of $v \otimes w$ in $S^2V$.
Given a basis $e_1, \dots, e_n$ of $V$, the $e_i e_j$ with $i \le j$ are a basis of $S^2V$. ∎
Hence
$$\dim S^2V = \frac{n(n+1)}{2}, \quad \text{where } n = \dim V.$$
The alternating square $\Lambda^2 V$ of $V$ is spanned by elements of the form $v \wedge w$ subject to the bilinear relations above and, additionally,
$$v \wedge w = -\,w \wedge v$$
for all $v, w \in V$.
Formally, it is the quotient of $V \otimes V$ by the subspace spanned by all elements of the form
$$v \otimes w + w \otimes v,$$
and then $v \wedge w$ is the image of $v \otimes w$ in $\Lambda^2 V$.
If $V$ is a representation of $G$, then $S^2V$ and $\Lambda^2V$ become representations of $G$, with the actions defined as for the tensor product.
We can define linear maps $V \otimes V \to S^2V$ and $V \otimes V \to \Lambda^2V$ sending $v \otimes w \mapsto vw$ and $v \otimes w \mapsto v \wedge w$. Any $v \otimes w \in V \otimes V$ can be written
$$v \otimes w = \tfrac{1}{2}(v \otimes w + w \otimes v) + \tfrac{1}{2}(v \otimes w - w \otimes v),$$
and this shows that
$$V \otimes V = S^2V \oplus \Lambda^2V.$$
In fact this decomposition holds as $G$-representations.
The space $V \otimes V$ has an involution (a map whose square is the identity) $\tau : v \otimes w \mapsto w \otimes v$.
As $\tau^2 = \operatorname{id}$, its eigenvalues are $\pm 1$. The decomposition above is the eigenspace decomposition for $\tau$: $S^2V$ is the $+1$-eigenspace, $\Lambda^2V$ the $-1$-eigenspace.
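For $\dim V = 2$ the involution $\tau$ is a $4 \times 4$ permutation-like matrix, and the eigenspace dimensions $3$ and $1$ (matching $\dim S^2V$ and $\dim \Lambda^2V$) can be read off numerically:

```python
import numpy as np

# The involution tau : v ⊗ w |-> w ⊗ v on V ⊗ V (dim V = 2), as a matrix
# in the basis e1⊗e1, e1⊗e2, e2⊗e1, e2⊗e2.
tau = np.array([[1, 0, 0, 0],
                [0, 0, 1, 0],
                [0, 1, 0, 0],
                [0, 0, 0, 1]], dtype=float)

assert np.allclose(tau @ tau, np.eye(4))               # tau^2 = id
eigvals = np.linalg.eigvalsh(tau)                      # tau is symmetric
assert np.allclose(sorted(eigvals), [-1, 1, 1, 1])     # dim Λ²V = 1, dim S²V = 3
```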
If $V$ has character $\chi$, then
$$\chi_{S^2V}(g) = \frac{1}{2}\big(\chi(g)^2 + \chi(g^2)\big)$$
and
$$\chi_{\Lambda^2V}(g) = \frac{1}{2}\big(\chi(g)^2 - \chi(g^2)\big).$$
If $\rho(g)$ has eigenvalues $\lambda_1, \dots, \lambda_n$, then diagonalise it as usual to get an eigenvector basis $e_1, \dots, e_n$ with $\rho(g)e_i = \lambda_i e_i$. Using the basis $\{e_i \wedge e_j : i < j\}$ of $\Lambda^2V$ you find that
$$\chi_{\Lambda^2V}(g) = \sum_{i < j} \lambda_i \lambda_j.$$
This is
$$\frac{1}{2}\Big( \Big(\sum_i \lambda_i\Big)^2 - \sum_i \lambda_i^2 \Big) = \frac{1}{2}\big(\chi(g)^2 - \chi(g^2)\big),$$
as required.
The proof for the symmetric square is similar; alternatively, use the decomposition of $V \otimes V$. ∎
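Both character formulas can be sanity-checked numerically: project $A \otimes A$ onto the $\pm 1$-eigenspaces of the swap $\tau$ and take traces. A numpy sketch with an arbitrary sample matrix:

```python
import numpy as np

# Hypothetical sample matrix of rho(g), dim V = 2.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

# Swap operator tau on V ⊗ V and projectors onto its eigenspaces.
tau = np.array([[1, 0, 0, 0],
                [0, 0, 1, 0],
                [0, 1, 0, 0],
                [0, 0, 0, 1]], dtype=float)
P_sym = (np.eye(4) + tau) / 2    # projector onto S²V
P_alt = (np.eye(4) - tau) / 2    # projector onto Λ²V

g_on_VV = np.kron(A, A)
chi, chi_sq = np.trace(A), np.trace(A @ A)   # chi(g) and chi(g^2)

assert np.isclose(np.trace(P_sym @ g_on_VV), (chi**2 + chi_sq) / 2)
assert np.isclose(np.trace(P_alt @ g_on_VV), (chi**2 - chi_sq) / 2)
```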
One can also define spaces $S^nV$ and $\Lambda^nV$ for any $n \ge 1$, the latter vanishing if $n > \dim V$. The former is spanned by expressions $v_1 v_2 \cdots v_n$ where ‘the order doesn’t matter’, while the latter is spanned by expressions $v_1 \wedge v_2 \wedge \cdots \wedge v_n$ where ‘switching two vectors introduces a minus sign’.
A particular special case is $n = \dim V$. In this case, $\Lambda^nV$ is exactly one-dimensional (it is easy to see that it is spanned by $e_1 \wedge \cdots \wedge e_n$ for any basis $e_1, \dots, e_n$ of $V$, showing that the dimension is at most one, and there is a nonzero linear map to $\mathbb{C}$ given by
$$v_1 \wedge \cdots \wedge v_n \mapsto \det(v_1 \mid \cdots \mid v_n),$$
the determinant of the matrix whose columns are the coordinates of the $v_i$, which shows that the dimension is at least one).
If $\alpha : V \to V$ is any linear map, then we get a linear map $\Lambda^n\alpha : \Lambda^nV \to \Lambda^nV$ by setting $\Lambda^n\alpha(v_1 \wedge \cdots \wedge v_n) = \alpha(v_1) \wedge \cdots \wedge \alpha(v_n)$. Since it is a map from a one-dimensional vector space to itself, $\Lambda^n\alpha$ is just multiplication by some scalar. This scalar is exactly the determinant of $\alpha$!
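Expanding $\alpha(e_1) \wedge \cdots \wedge \alpha(e_n)$ and moving each term back to $e_1 \wedge \cdots \wedge e_n$ produces exactly the Leibniz formula $\sum_\sigma \operatorname{sgn}(\sigma) \prod_i a_{\sigma(i)i}$ for the scalar. A numpy check that this sum agrees with the determinant, for a hypothetical $3 \times 3$ sample matrix:

```python
import numpy as np
from itertools import permutations

# Sample 3x3 matrix standing in for a linear map alpha on a 3-dim space.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

def sign(p):
    # parity of a permutation, computed by counting inversions
    n = len(p)
    return (-1) ** sum(p[i] > p[j] for i in range(n) for j in range(i + 1, n))

# Leibniz formula: the scalar by which Λ³alpha acts on e1∧e2∧e3.
scalar = sum(sign(p) * np.prod([M[p[i], i] for i in range(3)])
             for p in permutations(range(3)))

assert np.isclose(scalar, np.linalg.det(M))
```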
Suppose that $\rho$ is a representation of $G$ on a space $V$ with $\dim V = 2$, with $e_1, e_2$ being a basis of $V$. Let $g \in G$, and let
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$
be the matrix of $\rho(g)$ in this basis. We compute the matrices of $\Lambda^2\rho(g)$ and $S^2\rho(g)$.
The space $\Lambda^2V$ is one-dimensional, with basis vector $e_1 \wedge e_2$. Then
$$g \cdot (e_1 \wedge e_2) = (a e_1 + c e_2) \wedge (b e_1 + d e_2) = ad\, e_1 \wedge e_2 + cb\, e_2 \wedge e_1 = (ad - bc)\, e_1 \wedge e_2,$$
using that $e_i \wedge e_i = 0$ and $e_2 \wedge e_1 = -e_1 \wedge e_2$. We see that
$$\chi_{\Lambda^2V}(g) = ad - bc = \det A.$$
The space $S^2V$ is three-dimensional with basis $e_1^2, e_1e_2, e_2^2$, and
$$g \cdot e_1^2 = a^2 e_1^2 + 2ac\, e_1e_2 + c^2 e_2^2,$$
$$g \cdot e_1e_2 = ab\, e_1^2 + (ad + bc)\, e_1e_2 + cd\, e_2^2,$$
$$g \cdot e_2^2 = b^2 e_1^2 + 2bd\, e_1e_2 + d^2 e_2^2,$$
whence the matrix of $g$ (in this basis) is
$$\begin{pmatrix} a^2 & ab & b^2 \\ 2ac & ad+bc & 2bd \\ c^2 & cd & d^2 \end{pmatrix}.$$
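As a final sanity check, the traces of these two matrices should reproduce the character formulas above. A numpy sketch with arbitrary sample entries $a, b, c, d$:

```python
import numpy as np

# Sample entries for the matrix of rho(g) on a 2-dim V.
a, b, c, d = 1.0, 2.0, 3.0, 4.0
A = np.array([[a, b],
              [c, d]])

# Matrix of g on S²V in the basis e1², e1e2, e2², as computed above.
S2 = np.array([[a*a,   a*b,       b*b],
               [2*a*c, a*d + b*c, 2*b*d],
               [c*c,   c*d,       d*d]])

# tr(S²) = (chi(g)^2 + chi(g^2)) / 2
assert np.isclose(np.trace(S2), (np.trace(A)**2 + np.trace(A @ A)) / 2)

# Λ²V is the 1x1 matrix (det A), and det A = (chi(g)^2 - chi(g^2)) / 2.
assert np.isclose(np.linalg.det(A), (np.trace(A)**2 - np.trace(A @ A)) / 2)
```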