

Two things occurred to me reading this:
- Huge numbers are exceedingly common, but counting particles is the wrong way to find them. Combinatorics is where the real monster hunting lies. Once you start calculating probabilities or counting the possible arrangements of things, the boundary between “infinite” and “really, really, really finitely big” starts to blur (see the quick sketch after this list).
- I think looking to CompSci is the right move, but I still don’t see many folks discussing computational complexity as a real, mathematical limit. We often treat two equal statements as though there’s an immediate, single-step jump between them, but discovering the equality requires computation. Shannon showed that information and entropy are the same thing, and computation is the process by which information is created. Ultrafinitists would need to show that there is only a finite quantity of information, which I don’t think is true, or even possible to show.
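
To put a number on the combinatorics point, here’s a quick Python sketch. The ~10^80 figure for atoms in the observable universe is just the usual rough order-of-magnitude estimate, used only for scale:

```python
# Quick sketch: combinatorial counts vs. a physical count.
# Assumes the commonly cited rough estimate of ~10**80 atoms
# in the observable universe (order of magnitude only).
import math

ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80

# Orderings of a standard 52-card deck: 52! ~ 8.07e67
deck_orderings = math.factorial(52)
print(f"52! = {deck_orderings:.3e}")

# Ways to seat just 100 people in a row: 100! ~ 9.33e157,
# which already dwarfs the atom count by ~78 orders of magnitude.
row_seatings = math.factorial(100)
print(f"100! = {row_seatings:.3e}")
print(f"100! / atoms ~ {row_seatings / ATOMS_IN_OBSERVABLE_UNIVERSE:.1e}")
```

Nothing exotic is needed: a deck of cards or a dinner party is already enough to leave every “physical” big number behind.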
They’ve been doing the TPB thing on- and offline for decades now. Whether there’s still creative juice left to squeeze, I don’t know, but those folks are truly committed to the characters if nothing else.