Let's look at a big class of groups called "matrix groups". In other words, groups that can be realized as sets of matrices.
Let's fix a positive integer \(n\) and consider \(n \times n\) matrices with entries in either \(\mathbb{R}\) or \(\mathbb{C}\); when I don't want to pick one in particular, I'll just write \(\mathbb{k}\). Recall what we need for a group:
1): Composition
2): Identity
3): Inverses
4): Associativity
Here composition means matrix multiplication. If we have two \(n \times n\) matrices, we can multiply them to get another \(n \times n\) matrix. Here's where the order of operations comes in. When we apply a matrix to a vector by multiplication, we multiply with the matrix on the left: \(T \textbf{v}\). So if we first apply \(T\) and then apply \(S\), we get \(S(T \textbf{v}) = (ST) \textbf{v}\). Hence the ordering as described last time: things applied later go on the left. It's kind of weird looking, but we're stuck with it.
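To see the ordering in action, here's a small numerical sketch. The use of numpy and these particular matrices are my own illustration, not something from the discussion above:

```python
import numpy as np

T = np.array([[0, -1],
              [1,  0]])   # rotate the plane a quarter-turn counterclockwise
S = np.array([[1, 1],
              [0, 1]])    # shear along the x-axis

v = np.array([1, 0])

# "First T, then S" is S(Tv), which matches the single matrix ST:
print(S @ (T @ v))        # [1 1]
print((S @ T) @ v)        # [1 1] -- same thing

# Reversing the order gives a different answer, so in general ST != TS:
print((T @ S) @ v)        # [0 1]
```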
What about identity? Well, we have an object nicely called the "identity matrix", which has 1s down the main diagonal and 0s everywhere else. Call this object \(I\). It satisfies \(IT = TI = T\) for any \(n \times n\) matrix \(T\), so we have an identity.
Now we need inverses. Not all matrices have inverses. Fortunately there are several ways to determine whether a matrix has an inverse. We're not going to pick one in particular; we're just going to restrict ourselves to invertible matrices for the moment. The set of invertible \(n \times n\) matrices over \(\mathbb{k}\) is often written as \(GL_n(\mathbb{k})\), where \(GL\) stands for "general linear", since matrices are linear transformations. If we've decided on what \(\mathbb{k}\) stands for, we'll often just write \(GL_n\).
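One of those standard tests is that a matrix is invertible exactly when its determinant is nonzero. As a rough sketch of checking membership in \(GL_n\) numerically (numpy, the helper's name, and the tolerance are all my own choices; in exact arithmetic "nonzero determinant" is the real condition, and the tolerance just stands in for it in floating point):

```python
import numpy as np

def in_GL(M, tol=1e-12):
    """Rough numerical test that M is square and invertible (hypothetical helper)."""
    M = np.asarray(M, dtype=float)
    return M.ndim == 2 and M.shape[0] == M.shape[1] and abs(np.linalg.det(M)) > tol

print(in_GL(np.eye(3)))            # True: the identity is certainly in GL_3
print(in_GL([[1, 2], [3, 4]]))     # True: det = -2
print(in_GL([[1, 2], [2, 4]]))     # False: the rows are dependent, det = 0
```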
Finally, matrix multiplication always obeys associativity, so we don't need to worry about that one.
\(GL_n\) is a group: the product of two invertible matrices is invertible (with \((ST)^{-1} = T^{-1} S^{-1}\)), and the inverse of an invertible matrix is itself invertible, so the set is closed under both operations. But there are plenty of other groups that can be realized as matrices.
Consider the cube again. Let's place it so that its center is at the origin in \(\mathbb{R}^3\) and its edges are parallel to the coordinate axes. Then the symmetries of the cube can be written as linear transformations on all of \(\mathbb{R}^3\), since we can match basis vectors to the sides of the cube, and moving the sides moves the basis vectors. Hence we have another group of matrices. Note that all of these matrices are necessarily invertible, and hence live in \(GL_3\); we say that they form a subgroup, since they're a subset of \(GL_3\) and they form a group in their own right.
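To make this concrete, here's a sketch (numpy assumed; the particular rotation is my own example) checking that a quarter-turn about the \(z\)-axis really is a symmetry of the cube with vertices at \((\pm 1, \pm 1, \pm 1)\):

```python
import numpy as np
from itertools import product

# A quarter-turn about the z-axis: e1 -> e2, e2 -> -e1, e3 fixed.
R = np.array([[0, -1, 0],
              [1,  0, 0],
              [0,  0, 1]])

# The cube's eight vertices are all sign choices of (1, 1, 1).
vertices = {tuple(v) for v in product([-1, 1], repeat=3)}
image = {tuple(R @ np.array(v)) for v in vertices}

print(image == vertices)      # True: R permutes the vertices, so it's a symmetry
print(np.linalg.det(R) != 0)  # True: R is invertible, so it lives in GL_3
```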
Now consider a sphere centered at the origin. Just like the cube, we can rotate it around various axes. In fact, we can rotate it around any axis that passes through the origin and it will still occupy the same space. So we can again realize the symmetries of a sphere as a group of matrices. If we include reflections, we get another group of matrices, denoted \(O_3(\mathbb{R})\) (since the sphere is real and any matrix that takes the sphere to itself has real entries). The \(O\) indicates "orthogonal", because any linear transformation that preserves the sphere also preserves right angles between things (in fact, it preserves all angles). But \(O_3(\mathbb{R})\) preserves more than just angles; since it preserves a sphere, and a sphere is defined in terms of distances, this group preserves distances as well. In general, the group \(O_n(\mathbb{R})\) is the group of \(n \times n\) matrices that preserve distances in \(n\)-dimensional space.
Preserving distances means preserving the dot product, since the dot product can be recovered from lengths: \(2\,\textbf{u} \cdot \textbf{v} = \|\textbf{u} + \textbf{v}\|^2 - \|\textbf{u}\|^2 - \|\textbf{v}\|^2\). In other words, if \(T\) is such a distance-preserving matrix, then for vectors \(\textbf{u}\) and \(\textbf{v}\), we get that
$$T\textbf{u} \cdot T\textbf{v} = \textbf{u} \cdot \textbf{v}$$ If we view our vectors as column matrices \([u]\) and \([v]\), we can write the dot product as
$$\textbf{u}\cdot \textbf{v} = [u]^t [v]$$ where \(^t\) indicates matrix transposition. So our rule about preserving dot products can be written with only matrices:
$$(T[u])^t T[v] = [u]^t T^t T [v] = [u]^t [v]$$ Since this has to be true for all \(\textbf{u}\) and \(\textbf{v}\), the only possibility is \(T^t T = I\). Thus any real-valued matrix with \(T^t T = I\) is in \(O_n(\mathbb{R})\). Similarly, any complex-valued matrix with \(T^t T = I\) is in \(O_n(\mathbb{C})\).
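As a numerical sanity check on this condition (numpy assumed; the QR factorization is just a convenient way to manufacture an orthogonal matrix, not part of the argument above):

```python
import numpy as np

rng = np.random.default_rng(0)

# The Q factor of a QR factorization is orthogonal, giving us a sample T.
T, _ = np.linalg.qr(rng.standard_normal((3, 3)))

print(np.allclose(T.T @ T, np.eye(3)))       # True: T^t T = I

# And T really does preserve dot products, as derived above:
u, v = rng.standard_normal(3), rng.standard_normal(3)
print(np.isclose((T @ u) @ (T @ v), u @ v))  # True
```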
In keeping with the promise to become less easily-visualized, let's talk briefly about the determinant. The determinant is multiplicative: given two \(n \times n\) matrices \(S\) and \(T\),
$$\det(S)\det(T) = \det(ST)$$ In particular, if \(\det(S) = \det(T) = 1\), then \(\det(ST) = 1\). Also, \(S\) and \(T\) are invertible (their determinants are nonzero), and their inverses have determinant 1, since \(\det(S^{-1}) = 1/\det(S)\). So the set of \(n \times n\) matrices with determinant 1 forms a group, called \(SL_n\) for the "special linear" group. We can also look at \(SO_n(\mathbb{k})\), the orthogonal matrices that also have determinant 1.
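Here's a quick sketch (numpy assumed; the matrices are my own examples) of this multiplicativity and of the closure properties that make the determinant-1 matrices a group:

```python
import numpy as np

S = np.array([[2., 1.],
              [1., 1.]])   # det = 2*1 - 1*1 = 1
T = np.array([[1., 3.],
              [0., 1.]])   # det = 1*1 - 3*0 = 1

det = np.linalg.det
print(np.isclose(det(S @ T), det(S) * det(T)))   # True: multiplicativity
print(np.isclose(det(S @ T), 1.0))               # True: ST lands back in SL_2
print(np.isclose(det(np.linalg.inv(S)), 1.0))    # True: so does S^{-1}
```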
Before I give anyone the wrong idea: there are groups that actually cannot be realized as matrices. They obey the four rules, but there is no way to assign a distinct matrix to each group element so that the group composition matches matrix multiplication. We mostly won't be concerning ourselves with such objects, but it is notable that they exist.