I had a chill day thinking about math today without any pressure whatsoever. First I figured out, calculating inductively, that the order of $GL_n(\mathbb{F}_p)$ is $\prod_{k=0}^{n-1}(p^n - p^k)$. You calculate the number of $k$-tuples of linearly independent column vectors, and from there derive $p^k$ as the number of vectors that cannot be appended if linear independence is to be preserved. A Sylow $p$-subgroup of $GL_n(\mathbb{F}_p)$ is the group of upper triangular matrices with ones on the diagonal, which has the order $p^{n(n-1)/2}$ that we want.
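As a sanity check on both the product formula and the Sylow order, here is a quick brute-force script for tiny $n$ and $p$ (a sketch; the helper names are mine, not from any library):

```python
from itertools import product

def det_mod_p(rows, p):
    """Determinant over F_p (p prime) via Gaussian elimination."""
    m = [list(r) for r in rows]
    n, det = len(m), 1
    for col in range(n):
        # find a pivot row with a nonzero entry in this column
        pivot = next((r for r in range(col, n) if m[r][col] % p), None)
        if pivot is None:
            return 0
        if pivot != col:
            m[col], m[pivot] = m[pivot], m[col]
            det = -det  # row swap flips the sign
        det = det * m[col][col] % p
        inv = pow(m[col][col], p - 2, p)  # a^(p-2) = a^(-1) in F_p
        for r in range(col + 1, n):
            f = m[r][col] * inv % p
            m[r] = [(x - f * y) % p for x, y in zip(m[r], m[col])]
    return det % p

def order_gl_brute(n, p):
    """Count invertible n x n matrices over F_p directly."""
    return sum(1 for e in product(range(p), repeat=n * n)
               if det_mod_p([e[i*n:(i+1)*n] for i in range(n)], p))

def order_gl_formula(n, p):
    """prod_{k=0}^{n-1} (p^n - p^k)"""
    out = 1
    for k in range(n):
        out *= p**n - p**k
    return out

for n, p in [(2, 2), (2, 3), (3, 2)]:
    assert order_gl_brute(n, p) == order_gl_formula(n, p)

# p^(n(n-1)/2) divides the group order exactly, as a Sylow subgroup must
n, p = 3, 3
order = order_gl_formula(n, p)
syl = p ** (n * (n - 1) // 2)
assert order % syl == 0 and (order // syl) % p != 0
```

The brute-force count is only feasible for $p^{n^2}$ small, but it confirms the inductive formula on the cases it can reach.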
I also find the proof of the first Sylow theorem, the inspiration behind it, much easier to understand now. I had always remembered that the Sylow $p$-subgroup we are looking for can be realized as the stabilizer subgroup of some set of $p^k$ elements of the group, where $p^k$ divides the order of the group. By the pigeonhole principle, there can be no more than $p^k$ elements in the stabilizer. The part of the proof that kept boggling my mind was the reverse inequality via orbits. It turns out that it can be viewed in a way that makes its logic feel much more natural than it did before, when, like many a proof not understood, it seemed to spring out of the blue.
We wish to show, letting $p^n$ be the largest power of $p$ dividing $|G|$, say $|G| = p^n m$ with $p \nmid m$, that the order of some orbit is divided by $p$ no more than $n - k$ times. For that it suffices to show that the sum of the orders of the orbits, $\binom{p^n m}{p^k}$, is divided by $p$ no more than that many times. Showing that is very mechanical. Write out $\binom{p^n m}{p^k}$ as $\prod_{i=0}^{p^k - 1} \frac{p^n m - i}{p^k - i}$ and divide the numerator and denominator of each factor by $p$ raised to the number of times $p$ divides $i$; for $0 < i < p^k$, $p$ divides $p^n m - i$ and $p^k - i$ exactly as many times as it divides $i$. With this, the denominator of the product is not a multiple of $p$, which means the number of times $p$ divides the sum of the orders of the orbits is the number of times it divides the remaining $i = 0$ factor $p^{n-k} m$, which is $n - k$. By orbit–stabilizer, the stabilizer of a set in such an orbit has order divisible by $p^k$, and combined with the pigeonhole bound its order is exactly $p^k$; taking $k = n$ yields the Sylow subgroup.
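The valuation identity at the heart of this, $v_p\!\left(\binom{p^n m}{p^k}\right) = n - k$ when $p \nmid m$, is easy to verify numerically (a quick check; `vp` is my own helper name):

```python
from math import comb

def vp(x, p):
    """Number of times p divides x."""
    count = 0
    while x % p == 0:
        x //= p
        count += 1
    return count

# v_p(C(p^n * m, p^k)) should equal n - k whenever p does not divide m
for p in (2, 3, 5):
    for m in (1, 2, 3, 7):
        if m % p == 0:
            continue
        for n in range(1, 5):
            for k in range(n + 1):
                assert vp(comb(p**n * m, p**k), p) == n - k
```

In particular, for $k = n$ the binomial coefficient is coprime to $p$, so some orbit has order coprime to $p$.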
Following this, Brian Bi told me about this problem, starred in Artin (meaning the author considered it difficult), that he was stuck on. To my great surprise, I managed to solve it in under half an hour. The problem is:
Let $H$ be a proper subgroup of a finite group $G$. Prove that the conjugate subgroups of $H$ don't cover $G$.
For this, I remembered the relation $|G| = |N(H)| \, c(H)$, where $c(H)$ denotes the number of conjugate subgroups of $H$ and $N(H)$ its normalizer; this is a special case of the orbit-stabilizer theorem, as conjugation is a group action after all. With this, given that $c(H) = [G : N(H)] \le [G : H]$ and that conjugate subgroups share the identity, the union of them has at most $[G : H](|H| - 1) + 1 = |G| - [G : H] + 1 < |G|$ elements.
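The argument is concrete enough to watch in action. Here's a small script (my own sketch, permutations as tuples) that takes the cyclic subgroup generated by a 4-cycle in $S_4$, checks the orbit-stabilizer count of conjugates, and sees that the union of conjugates misses part of the group:

```python
from itertools import permutations

def compose(a, b):
    """(a . b)(i) = a[b[i]]; permutations as tuples on {0, ..., n-1}."""
    return tuple(a[i] for i in b)

def inverse(a):
    inv = [0] * len(a)
    for i, ai in enumerate(a):
        inv[ai] = i
    return tuple(inv)

n = 4
G = set(permutations(range(n)))
# H: the cyclic subgroup generated by the 4-cycle (0 1 2 3)
c = (1, 2, 3, 0)
H = set()
x = tuple(range(n))  # identity
for _ in range(4):
    H.add(x)
    x = compose(c, x)

def conj(g, S):
    """The conjugate subgroup g S g^{-1}."""
    gi = inverse(g)
    return frozenset(compose(compose(g, h), gi) for h in S)

conjugates = {conj(g, H) for g in G}
N = [g for g in G if conj(g, H) == frozenset(H)]  # normalizer of H

# orbit-stabilizer: number of conjugates = [G : N(H)]
assert len(conjugates) == len(G) // len(N)

# the union of all conjugates misses part of G
union = set().union(*conjugates)
assert len(union) < len(G)
print(len(G), len(H), len(conjugates), len(union))  # prints: 24 4 3 10
```

Here $|N(H)| = 8$ (a dihedral group containing $H$), so there are only $3$ conjugates, and their union has just $10$ of the $24$ elements.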
I remember Jonah Sinick once saying that finite group theory is one of the most g-loaded parts of math. I'm not sure what exactly his rationale for that is. I'll say that I have a taste for finite group theory, though I can't say I'm a freak at it, unlike Aschbacher; I guess I'm not bad at it either. Sure, it requires some form of pattern recognition and visualization of abstractions that is not so loaded on the prior-knowledge front. Brian Bi keeps telling me about how hard finite group theory is relative to the continuous version of group theory, the Lie groups, which I know next to nothing about at present.
Oleg Olegovich, who told me today that he had proved "some generalization of something to semi-simple groups" but needs a bit more to earn the label of Permanent Head Damage, suggested, upon my asking him what he considers good mathematics, that I look into Arnold's classic on classical mechanics, which was the first thing that came to mind on his response of "stuff that is geometric and springs out of classical mechanics." I found a PDF of it online and browsed through it but did not find it all that tasteful, perhaps because I've been a bit immersed lately in the number-theoretic and abstract-algebraic side of math that doesn't intersect with physics, though I previously had an inclination towards more physicsy math. I thought of possibly learning PDEs and some physics as a byproduct, but I'm also worried about lack of focus. Maybe eventually I can do that casually, without having to try as hard as I have lately for number theory. At the least, I don't have the right combination of brainpower and interest for that in my current state of mind.