Thursday, December 3, 2015

Matrices, Products, Lie brackets as tensors

One good thing about tensors is that they give us a way to unify matrices and vectors. What's a matrix? It's a linear map that takes in a vector in \(V\) and spits out a vector in \(W\). An element of \(V^\vee\) takes in a vector and spits out a number, and a vector \(\textbf{w}\) in \(W\) can be viewed as taking a number \(c\) and spitting out the vector \(c\textbf{w}\). So elements of \(V^\vee \otimes W\) take in vectors of \(V\) and spit out vectors in \(W\). If we look at a general element, \(u_i^j \textbf{e}^i \otimes \textbf{f}_j\), we can view the coefficients \(u_i^j\) as the entries of a matrix with respect to the given bases. Writing \(n\) for the dimension of \(V\) and \(m\) for the dimension of \(W\), there are \(mn\) basis elements for \(V^\vee \otimes W\), and there are \(mn\) entries in a matrix from \(V\) to \(W\). So \(V^\vee\otimes W\) is equivalent to the space of linear transformations from \(V\) to \(W\).
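If it helps to see this concretely, here's a minimal NumPy sketch (the particular numbers are made up): a simple tensor \(\varphi \otimes \textbf{w}\), with \(\varphi\) in \(V^\vee\) and \(\textbf{w}\) in \(W\), acts on a vector \(\textbf{v}\) by \(\varphi(\textbf{v})\,\textbf{w}\), and its coefficient array is just the outer product of the coordinates of \(\varphi\) and \(\textbf{w}\). A general element of \(V^\vee \otimes W\) is a sum of pieces like this.

```python
import numpy as np

# A simple tensor phi ⊗ w in V* ⊗ W, with V = R^3 and W = R^2 (made-up numbers).
phi = np.array([1.0, -2.0, 0.5])   # a covector phi in V*: phi(v) is the pairing phi . v
w = np.array([4.0, 1.0])           # a vector w in W

# Its coefficient array u[i, j] = phi_i * w_j has one entry per basis tensor e^i ⊗ f_j,
# so it's an n-by-m array.  (The V index comes first here, which is the transpose
# of the usual m-by-n matrix layout.)
u = np.outer(phi, w)

v = np.array([2.0, 3.0, -1.0])     # a vector in V

# (phi ⊗ w)(v) is the number phi(v) times the vector w ...
direct = phi.dot(v) * w
# ... which is the same as contracting v against the coefficient array.
via_coefficients = np.einsum('i,ij->j', v, u)
assert np.allclose(direct, via_coefficients)
```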
If we have a matrix \(R = R_i^j \textbf{e}^i\otimes \textbf{f}_j\), then the behavior on a basis vector \(\textbf{e}_k\) is to compute \(R_i^j\textbf{e}^i(\textbf{e}_k) \textbf{f}_j\). Remember that \(\textbf{e}^i(\textbf{e}_k)\) is 1 if \(i = k\) and 0 otherwise, so the previous expression becomes \(R_k^j \textbf{f}_j\). Hence for a general vector \(\textbf{v} = v^k \textbf{e}_k\), we get
$$R(\textbf{v}) = v^k R_k^j \textbf{f}_j$$ which you can check matches the usual notion of applying matrices to vectors. Note: \(v^k\) and \(R_k^j\) are both scalars, so it doesn't matter what order we write them in. We only need to worry about order for \(\otimes\).
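Here's a quick numerical sanity check of that formula (NumPy; the arrays are random, and the layout is an assumption I'm making: \(R_k^j\) is stored with the lower, input index first). With that layout, the contraction \(v^k R_k^j\) is just the ordinary matrix-vector product with the vector written on the left.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2                        # dim V = 3, dim W = 2, chosen arbitrarily
R = rng.standard_normal((n, m))    # R[k, j] holds the coefficient R_k^j (lower index first)
v = rng.standard_normal(n)         # the coordinates v^k of a vector in V

# R(v)^j = v^k R_k^j : match the k indices and sum over them.
Rv = np.einsum('k,kj->j', v, R)

# With this layout that's the usual matrix-vector product, with the (row) vector
# on the left -- equivalently R.T @ v in the column-vector convention.
assert np.allclose(Rv, v @ R)
assert np.allclose(Rv, R.T @ v)
```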
Now we can talk about change of basis stuff, which makes all of this index notation useful.
Suppose that we want to change our basis of \(V\) from \(\textbf{e}_i\) to some new basis, \(\hat{\textbf{e}}_i\). In particular, suppose that we have a change-of-basis matrix \(R\) whose entries express the vectors \(\textbf{e}_i\) in terms of the vectors \(\hat{\textbf{e}}_i\):
$$\textbf{e}_i = R_i^j\hat{\textbf{e}}_j.$$ The old dual basis is then expressed in terms of the corresponding new dual basis \(\hat{\textbf{e}}^i\) via \(R^{-1}\):
$$\textbf{e}^j = (R^{-1})_i^j \hat{\textbf{e}}^i.$$ Now suppose we have a linear transformation \(S\) from \(W\) to \(V\), written in terms of the \(\textbf{e}_i\) basis. So we get that
$$S(\textbf{f}_k) = S_k^i \textbf{e}_i.$$ To change that to the new basis, we rewrite each \(\textbf{e}_i\) in terms of the \(\hat{\textbf{e}}_j\) using \(R\) to get
$$S(\textbf{f}_k) = S_k^i R_i^j\hat{\textbf{e}}_j = (RS)_k^j\hat{\textbf{e}}_j.$$ You can check that this is the usual way of doing change-of-basis stuff. Similarly, if we have a linear transformation \(T\) from \(V\) to \(W\), we go the other way around:
$$T(\textbf{e}_i) = T_i^k \textbf{f}_k$$ becomes
$$T(R_i^j\hat{\textbf{e}}_j) = T_i^k \textbf{f}_k.$$ Rewriting everything gives us
$$T(\hat{\textbf{e}}_j) = T_i^k (R^{-1})_j^i \textbf{f}_k = (TR^{-1})_j^k \textbf{f}_k.$$ Finally, if we have a transformation \(P\) from \(V\) to \(V\) and we change the basis on both ends, we get
$$P(\textbf{e}_i) = P_i^k \textbf{e}_k$$ becoming
$$P(\hat{\textbf{e}}_j) = R_l^kP^l_i(R^{-1})^i_j \hat{\textbf{e}}_k = (RPR^{-1})_j^k \hat{\textbf{e}}_k.$$ So we see that by matching up repeated indices and summing over them, we can apply linear transformations and change bases.
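Each of these is a plain index contraction, so it's easy to check numerically. Here's a sketch (NumPy; the matrices are random, I take \(\dim V = \dim W = 3\) just to keep everything square, and as before the lower index is stored as the first axis of each array, which makes the array products come out transposed relative to the usual column-vector convention).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3                                   # take dim V = dim W = 3 so all the arrays are square
R = rng.standard_normal((n, n))         # change-of-basis coefficients R_i^j, stored as R[i, j]
Rinv = np.linalg.inv(R)
S = rng.standard_normal((n, n))         # S_k^i : a map W -> V in the old bases
T = rng.standard_normal((n, n))         # T_i^k : a map V -> W in the old bases
P = rng.standard_normal((n, n))         # P_i^k : a map V -> V in the old bases

# (RS)_k^j = S_k^i R_i^j : change the basis of the target V.
S_new = np.einsum('ki,ij->kj', S, R)
assert np.allclose(S_new, S @ R)        # with lower-index-first storage, "RS" is the array S @ R

# (T R^-1)_j^k = (R^-1)_j^i T_i^k : change the basis of the source V.
T_new = np.einsum('ik,ji->jk', T, Rinv)
assert np.allclose(T_new, Rinv @ T)

# (R P R^-1)_j^k = R_l^k P_i^l (R^-1)_j^i : change the basis on both ends (conjugation).
P_new = np.einsum('lk,il,ji->jk', R, P, Rinv)
assert np.allclose(P_new, Rinv @ P @ R)
```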
Other familiar objects that can be written as tensors:
The dot product works well in the standard basis but can get a little wonky in other bases. We noted that there's even a group of matrices that preserve the dot product, \(O_n\). Let's get rid of that specific basis dependence by writing the dot product in index notation, so that we can see how it transforms. The dot product takes in two vectors and spits out a number, so it lives in \(V^\vee \otimes V^\vee\) and is therefore a linear combination of tensors of the form \(\textbf{e}^i \otimes \textbf{e}^j\), with coefficients of the form \(u_{ij}\). We say that it's a symmetric tensor, in that for any pair of values \(i\) and \(j\), \(u_{ij} = u_{ji}\), regardless of what basis we're in.
If we change our basis using the matrix \(R\), we change the dual basis by \(R^{-1}\), so we end up with $$\hat{u}_{kl} = u_{ij}(R^{-1})_k^i(R^{-1})_l^j.$$
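As a sketch of what that rule does (NumPy again, same storage convention, random \(R\)): start from the standard dot product, whose coefficients are \(u_{ij} = 1\) when \(i = j\) and \(0\) otherwise, and transform. The result is still symmetric, but in a generic new basis it's no longer the identity matrix, which is exactly the wonkiness mentioned above.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
R = rng.standard_normal((n, n))     # change-of-basis coefficients, lower index stored first
Rinv = np.linalg.inv(R)

u = np.eye(n)                       # the standard dot product: u_ij = 1 if i == j, else 0

# u_hat_kl = u_ij (R^-1)_k^i (R^-1)_l^j
u_hat = np.einsum('ij,ki,lj->kl', u, Rinv, Rinv)
assert np.allclose(u_hat, Rinv @ u @ Rinv.T)

assert np.allclose(u_hat, u_hat.T)  # still symmetric in any basis, as claimed
print(np.round(u_hat, 3))           # but generally not the identity in the new basis
```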
Another familiar object: the cross product. It takes two vectors and spits out a vector, so it lives in \(V^\vee \otimes V^\vee \otimes V\), where \(V = \mathbb{R}^3\). The cross product is a linear combination of tensors of the form \(\textbf{e}^i \otimes \textbf{e}^j \otimes \textbf{e}_k\) and thus has coefficients of the form \(u_{ij}^k\). It's antisymmetric in \(i\) and \(j\), in that \(u_{ij}^k = -u_{ji}^k\). When we change our basis it becomes $$\hat{u}_{pq}^r = u_{ij}^k (R^{-1})_p^i (R^{-1})_q^j R_k^r.$$ A generalization of the cross product is the Lie bracket of a Lie algebra. Here \(V\) is the vector space of the Lie algebra, and again the bracket takes in two vectors and spits out a vector, so its coefficients have the same structure \(u_{ij}^k\) and transform the same way. The antisymmetry rule is again \(u_{ij}^k = -u_{ji}^k\). The Jacobi identity is \(u_{ij}^h u_{kl}^j = u_{jl}^h u_{ik}^j + u_{kj}^h u_{il}^j\).
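Since the coefficients of the cross product in the standard (right-handed, orthonormal) basis are just the Levi-Civita symbol, \(u_{ij}^k = +1\) for cyclic \((i,j,k)\), \(-1\) for anti-cyclic, and \(0\) otherwise, we can check all of these claims numerically. A NumPy sketch, with the same conventions as the earlier blocks:

```python
import numpy as np

# Coefficients of the cross product in the standard basis: the Levi-Civita symbol.
u = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    u[i, j, k] = 1.0                 # +1 on cyclic permutations (0-indexed here)
    u[j, i, k] = -1.0                # -1 on anti-cyclic permutations

# Contracting two vectors against u reproduces the usual cross product.
a, b = np.array([1.0, 2.0, 3.0]), np.array([-1.0, 0.5, 4.0])
assert np.allclose(np.einsum('i,j,ijk->k', a, b, u), np.cross(a, b))

# Antisymmetry in the two lower indices: u_ij^k = -u_ji^k.
assert np.allclose(u, -np.swapaxes(u, 0, 1))

# Change-of-basis rule: u_hat_pq^r = u_ij^k (R^-1)_p^i (R^-1)_q^j R_k^r.
rng = np.random.default_rng(3)
R = rng.standard_normal((3, 3))
Rinv = np.linalg.inv(R)
u_hat = np.einsum('ijk,pi,qj,kr->pqr', u, Rinv, Rinv, R)

# Jacobi identity in the index form above: u_ij^h u_kl^j = u_jl^h u_ik^j + u_kj^h u_il^j,
# for every choice of the free indices i, k, l, h.
def jacobi_gap(c):
    lhs = np.einsum('ijh,klj->iklh', c, c)
    rhs = np.einsum('jlh,ikj->iklh', c, c) + np.einsum('kjh,ilj->iklh', c, c)
    return lhs - rhs

assert np.allclose(jacobi_gap(u), 0)      # the cross product satisfies it ...
assert np.allclose(jacobi_gap(u_hat), 0)  # ... and so do the transformed coefficients
```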
Okay, that's kind of an ugly mess, but the important bit is that we can see how the coefficients of the bracket transform when we change basis.
Since we're here, we can note that the cross product also obeys the Jacobi identity, so \(\mathbb{R}^3\) is also a Lie algebra with this bracket. Which one? \(so_3(\mathbb{R})\)!
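To see the match concretely, here's one more sketch (NumPy; the matrices \(L_1, L_2, L_3\) below are the standard basis of antisymmetric \(3\times 3\) matrices, with entries \((L_i)_{jk} = -u_{ij}^k\)): each \(L_i\) acts on vectors as "cross with \(\textbf{e}_i\)", and the matrix commutators \([L_i, L_j] = L_iL_j - L_jL_i\) have exactly the cross product's structure constants.

```python
import numpy as np

# The cross product's coefficients again (see the previous block).
u = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    u[i, j, k], u[j, i, k] = 1.0, -1.0

# The standard basis L_1, L_2, L_3 of so_3(R): antisymmetric matrices with
# entries (L_i)_{jk} = -u_ij^k.  Each one acts on vectors by L_i v = e_i x v.
L = -u                                       # L[i] is the i-th basis matrix

v = np.array([2.0, -1.0, 0.5])
for i in range(3):
    assert np.allclose(L[i] @ v, np.cross(np.eye(3)[i], v))

# The matrix commutators have exactly the cross product's structure constants:
# [L_i, L_j] = L_i L_j - L_j L_i = u_ij^k L_k.
for i in range(3):
    for j in range(3):
        commutator = L[i] @ L[j] - L[j] @ L[i]
        assert np.allclose(commutator, np.einsum('k,kab->ab', u[i, j], L))
```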
