The ordinals also have the operation known as natural addition, denoted ⊕ (or sometimes #), which is defined by a different recursion. It is commutative and associative. Jacobsthal in 1907 defined a new multiplication by transfinitely iterating natural addition instead of ordinary addition; I'll call this "Jacobsthal multiplication" and denote it ×, as he did. He then defined an exponentiation by transfinitely iterating ×, which I'll call "Jacobsthal exponentiation"; he denoted it a^b (with a bar under the b), but I'll denote it a^(×b) because... well, you'll see.
Finally there is natural multiplication, denoted ⊗ (or sometimes by a weird 16-pointed asterisk?), which is defined in terms of ⊕ by a different recursion. It is commutative, associative, and distributes over natural addition. We can, of course, transfinitely iterate this to get yet a third notion of exponentiation; I'll call this "semi-Jacobsthal exponentiation", and denote it a^(⊗b). (Hence why I'm using the notation a^(×b) above -- if I called it a^b, I wouldn't have a good parallel notation for semi-Jacobsthal exponentiation. In my own writing I've been denoting it by a^b with a bar over the b instead of under it, but while that's easy in TeX I can't do that in HTML (unless I'm going to start pulling out the Unicode combining characters, and that wouldn't look right anyway if the exponent was more than one character long, as it frequently will be).)
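Natural addition and natural multiplication act termwise on Cantor normal forms, which makes them easy to compute mechanically. Here's a small Python sketch of my own, purely for illustration, restricted to ordinals below ω^ω so that the exponents are plain natural numbers; an ordinal is a tuple of (exponent, coefficient) pairs with exponents strictly decreasing.

```python
# Ordinals below ω^ω in Cantor normal form: tuples of (exponent, coefficient)
# pairs, exponents strictly decreasing; e.g. ω²·3 + ω + 2 is ((2, 3), (1, 1), (0, 2)).

def nat_add(a, b):
    """Natural sum a ⊕ b: just add coefficients exponent by exponent."""
    coeffs = {}
    for e, c in a + b:
        coeffs[e] = coeffs.get(e, 0) + c
    return tuple(sorted(coeffs.items(), reverse=True))

def nat_mul(a, b):
    """Natural product a ⊗ b: multiply term by term (exponents add),
    then natural-sum the resulting terms."""
    out = ()
    for e1, c1 in a:
        for e2, c2 in b:
            out = nat_add(out, ((e1 + e2, c1 * c2),))
    return out

one, w = ((0, 1),), ((1, 1),)
print(nat_add(one, w))               # 1 ⊕ ω = ω + 1, i.e. ((1, 1), (0, 1))
print(nat_mul(nat_add(w, one), w))   # (ω+1) ⊗ ω = ω² + ω, i.e. ((2, 1), (1, 1))
```

In this range commutativity, associativity, and distributivity fall out of ordinary polynomial arithmetic, which is one way to see why the natural operations are so well-behaved.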
(And yes, all these operations really are different. I'll leave finding examples to you.)
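If you'd like a nudge, separating examples are all computable below ω^ω. A Python sketch of my own, with ordinals as tuples of (exponent, coefficient) pairs in decreasing exponent order; the rule used for × here is my reading of Jacobsthal's Cantor-normal-form rule specialized to this range, so treat it as illustrative.

```python
# Ordinals below ω^ω as CNF: tuples of (exponent, coefficient) pairs, exponents decreasing.

def nat_add(a, b):
    """Natural sum ⊕: add coefficients exponent by exponent."""
    coeffs = {}
    for e, c in a + b:
        coeffs[e] = coeffs.get(e, 0) + c
    return tuple(sorted(coeffs.items(), reverse=True))

def ord_add(a, b):
    """Ordinary sum a + b: the leading term of b wipes out the small tail of a."""
    if not b:
        return a
    e = b[0][0]
    head = tuple(t for t in a if t[0] > e)
    bump = sum(c for x, c in a if x == e)
    return head + ((e, b[0][1] + bump),) + b[1:]

def ord_mul(a, b):
    """Ordinary product a·b, expanded over b's terms."""
    if not a or not b:
        return ()
    a1, c1 = a[0]
    out = ()
    for e, m in b:
        # a·ω^e = ω^(a1+e) for e ≥ 1; a·m bulks up only the leading coefficient.
        piece = ((a1 + e, m),) if e > 0 else ((a1, c1 * m),) + a[1:]
        out = ord_add(out, piece)
    return out

def jac_mul(a, b):
    """Jacobsthal product a×b (iterated ⊕): same on limit terms, but a×m
    scales *every* coefficient of a, and the pieces combine by ⊕."""
    if not a or not b:
        return ()
    a1 = a[0][0]
    out = ()
    for e, m in b:
        piece = ((a1 + e, m),) if e > 0 else tuple((x, c * m) for x, c in a)
        out = nat_add(out, piece)
    return out

def nat_mul(a, b):
    """Natural product ⊗: multiply term by term, natural-summing the results."""
    out = ()
    for e1, c1 in a:
        for e2, c2 in b:
            out = nat_add(out, ((e1 + e2, c1 * c2),))
    return out

one, two, w = ((0, 1),), ((0, 2),), ((1, 1),)
w1 = nat_add(w, one)                        # ω + 1
print(ord_add(one, w), nat_add(one, w))     # 1+ω = ω, but 1⊕ω = ω+1
print(ord_mul(w1, two), jac_mul(w1, two))   # (ω+1)·2 = ω·2+1, but (ω+1)×2 = ω·2+2
print(jac_mul(w1, w), nat_mul(w1, w))       # (ω+1)×ω = ω², but (ω+1)⊗ω = ω²+ω
```

So the two additions already differ at 1 and ω, and ω+1 separates all three multiplications.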
You could continue on to get additional notions of hyper operations, but again, this does not seem very interesting.
So what algebraic relations do these satisfy? Of course for the ordinary and natural operations these are well-known, but let's do this from scratch regardless. I won't be giving any proofs (most of them are straightforward), just sort of heuristic reasons.
I'm going to leave out relations about how 1 and 0 behave as these are all obvious and exactly what you expect (unless you expect 1+a=Sa, I guess). 0 and 1 are never problematic.
a+(b+c)=(a+b)+c. + is just S iterated transfinitely; to start with a and apply S b+c many times, is to start with a, apply S b many times, and then apply S c many times.
a(b+c)=ab+ac. · is just + iterated transfinitely; to add together b+c copies of a, is to add together b copies of a, and add to that the sum of c copies of a. Note this requires the associativity of ordinary addition, and the fact that ordinary addition -- being a transfinite iteration -- is continuous on the right.
a(bc)=(ab)c. To add together bc copies of a, is to add together b copies of a, and then add together c copies of the result. Note this requires a(b+c)=ab+ac.
a^(b+c)=a^b·a^c. To multiply together b+c copies of a, is to multiply together b copies of a, and multiply that by the product of c copies of a. Note this requires the associativity of ordinary multiplication.
a^(bc)=(a^b)^c. To multiply together bc copies of a, is to multiply together b copies of a, and then multiply together c copies of the result. Note this requires a^(b+c)=a^b·a^c.
Again, both these require the fact that ordinary addition and ordinary multiplication, being transfinite iterations, are both continuous on the right.
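For whatever it's worth, the additive and multiplicative identities here are easy to spot-check mechanically. A Python sketch of mine, on Cantor normal forms below ω^ω (ordinals as tuples of (exponent, coefficient) pairs, exponents decreasing); the last two lines also exhibit why distributivity only holds on the one side.

```python
# Ordinals below ω^ω as CNF: tuples of (exponent, coefficient) pairs, exponents decreasing.

def ord_add(a, b):
    """Ordinary ordinal sum: the leading term of b absorbs the small tail of a."""
    if not b:
        return a
    e = b[0][0]
    head = tuple(t for t in a if t[0] > e)
    bump = sum(c for x, c in a if x == e)
    return head + ((e, b[0][1] + bump),) + b[1:]

def ord_mul(a, b):
    """Ordinary ordinal product, expanded over b's terms via a(b+c)=ab+ac."""
    if not a or not b:
        return ()
    a1, c1 = a[0]
    out = ()
    for e, m in b:
        # a·ω^e = ω^(a1+e) for e ≥ 1; a·m bulks up only the leading coefficient.
        piece = ((a1 + e, m),) if e > 0 else ((a1, c1 * m),) + a[1:]
        out = ord_add(out, piece)
    return out

a = ((2, 1), (0, 3))   # ω² + 3
b = ((1, 2), (0, 1))   # ω·2 + 1
c = ((1, 1), (0, 2))   # ω + 2
assert ord_add(a, ord_add(b, c)) == ord_add(ord_add(a, b), c)              # a+(b+c)=(a+b)+c
assert ord_mul(a, ord_add(b, c)) == ord_add(ord_mul(a, b), ord_mul(a, c))  # a(b+c)=ab+ac
assert ord_mul(a, ord_mul(b, c)) == ord_mul(ord_mul(a, b), c)              # a(bc)=(ab)c

one, two, w = ((0, 1),), ((0, 2),), ((1, 1),)
assert ord_mul(two, w) == w                                  # (1+1)·ω = ω...
assert ord_add(ord_mul(one, w), ord_mul(one, w)) == ((1, 2),)  # ...but 1·ω + 1·ω = ω·2
```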
OK. So far so standard. Now for the next series.
a⊕(b⊕c)=(a⊕b)⊕c, a⊕b=b⊕a. I'm just going to take the properties of natural addition and natural multiplication as given; I'm not going to try to explain them.
This is where things get weird. One might be tempted to write down relations about a×(b+c) and a×(bc) like those above. However, while ⊕ is associative, and even commutative, it is not continuous on the right, which prevents such things from working.
That's not the weird part. The weird part is that, as Jacobsthal discovered, we can get a relation involving this multiplication anyway:

a×(b⊕c) = (a×b)⊕(a×c)
Yes, it's algebraically nice -- but what does it *mean*? Why should this be true? What does it mean to add something together b⊕c times? Unfortunately, I can give no really satisfying answer. Jacobsthal proved it by figuring out how to compute × on Cantor normal forms; this is pretty easy, and once you have that, it's a straightforward verification. But it seems very much like a mathematical coincidence, and that bugs me. If there's a better reason for it, I don't know it. (Of course, it's been 100 years since he discovered this, so there's a good chance this has since been resolved, but if there's a good reason it's certainly not obvious.)
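To make "compute × on Cantor normal forms" concrete, here's my reading of the rule, specialized below ω^ω so that exponents are plain natural numbers (an illustration, not Jacobsthal's general statement): if a has leading term ω^(a1)·c1, then a×ω^e = ω^(a1+e) for e ≥ 1, while a×n just scales every coefficient of a by n, and a general a×b is the natural sum of the per-term contributions from b. Since that last step already bakes distributivity over ⊕ into the rule, the check below is a consistency check rather than an independent proof.

```python
# Ordinals below ω^ω as CNF: tuples of (exponent, coefficient) pairs, exponents decreasing.

def nat_add(a, b):
    """Natural sum ⊕: add coefficients exponent by exponent."""
    coeffs = {}
    for e, c in a + b:
        coeffs[e] = coeffs.get(e, 0) + c
    return tuple(sorted(coeffs.items(), reverse=True))

def jac_mul(a, b):
    """Jacobsthal product ×, per the Cantor-normal-form rule described above."""
    if not a or not b:
        return ()
    a1 = a[0][0]
    out = ()
    for e, m in b:
        piece = ((a1 + e, m),) if e > 0 else tuple((x, c * m) for x, c in a)
        out = nat_add(out, piece)
    return out

a = ((1, 1), (0, 2))   # ω + 2
b = ((1, 2), (0, 1))   # ω·2 + 1
c = ((1, 1), (0, 3))   # ω + 3
assert jac_mul(a, nat_add(b, c)) == nat_add(jac_mul(a, b), jac_mul(a, c))  # a×(b⊕c)=(a×b)⊕(a×c)
```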
Now, once we have this, it is pretty easy to get the relation:
a×(b×c)=(a×b)×c. What a strange operation to be associative! But it's a straightforward consequence of the relation above.
Continuing on, Jacobsthal found relations involving his exponentiation. These are straightforward consequences of the above:
a^(×(b+c))=a^(×b)×a^(×c). To multiply together b+c copies of a, is to multiply together b copies of a, and multiply that by the product of c copies of a.
a^(×(bc))=(a^(×b))^(×c). To multiply together bc copies of a, is to multiply together b copies of a, and then multiply together c copies of the result. This requires the above relation.
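Below ω^ω, with only finite exponents on the tower, one can at least spot-check the finite cases of these two relations, which is exactly where the associativity of × gets exercised. My own sketch again, built on my reading of Jacobsthal's Cantor-normal-form rule for × specialized to this range, so treat it as illustrative.

```python
# Ordinals below ω^ω as CNF: tuples of (exponent, coefficient) pairs, exponents decreasing.

def nat_add(a, b):
    """Natural sum ⊕: add coefficients exponent by exponent."""
    coeffs = {}
    for e, c in a + b:
        coeffs[e] = coeffs.get(e, 0) + c
    return tuple(sorted(coeffs.items(), reverse=True))

def jac_mul(a, b):
    """Jacobsthal product ×: a×ω^e = ω^(a1+e) for e ≥ 1, a×m scales every
    coefficient of a, and a general product is the natural sum of the pieces."""
    if not a or not b:
        return ()
    a1 = a[0][0]
    out = ()
    for e, m in b:
        piece = ((a1 + e, m),) if e > 0 else tuple((x, c * m) for x, c in a)
        out = nat_add(out, piece)
    return out

def jac_pow(a, n):
    """Jacobsthal power a^(×n) for finite n: iterate ×, starting from 1."""
    out = ((0, 1),)
    for _ in range(n):
        out = jac_mul(out, a)
    return out

a = ((1, 1), (0, 2))   # ω + 2
assert jac_mul(jac_mul(a, a), a) == jac_mul(a, jac_mul(a, a))   # × is associative
assert jac_pow(a, 5) == jac_mul(jac_pow(a, 2), jac_pow(a, 3))   # a^(×(2+3)) = a^(×2) × a^(×3)
assert jac_pow(a, 6) == jac_pow(jac_pow(a, 2), 3)               # a^(×(2·3)) = (a^(×2))^(×3)
```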
So these are some pretty natural-looking relations... except for the fact that the first one, and therefore both of them, relies on the fact that × is associative! Notice neither of my heuristic justifications makes any sense if × is not associative. And × being associative rests on the apparent mathematical coincidence of it distributing over natural addition (on one side, anyway). (They also rely on +, ·, and × all being transfinite iterations and therefore continuous on the right, but that's not surprising.)
So. Let's take a look at the third series.
a⊗(b⊕c)=(a⊗b)⊕(a⊗c), (a⊕b)⊗c=(a⊗c)⊕(b⊗c), a⊗(b⊗c)=(a⊗b)⊗c, a⊗b=b⊗a. Again, I'm just going to take these as givens.
This is where things get even weirder. Again, it's tempting to try to state a relation about a^(⊗(b+c)) or a^(⊗(bc)), but because ⊗ is not continuous on the right, this won't work.
But playing around a bit reveals the following surprising relation:
a^(⊗(b⊕c)) = a^(⊗b) ⊗ a^(⊗c)
Once again, algebraically nice, but leaving the question -- why should this be true? What does it mean to multiply a together b⊕c times? Once again, I can't give a really satisfying answer. I proved this by coming up with a rule to compute a^(⊗b) on Cantor normal forms, and then it's a straightforward verification. (I assume this is not original to me, but I was not working from anything else; I've only ever seen Jacobsthal's operations in that one paper of his, and I've never seen this operation anywhere.)
And not only that, but the earlier surprising coincidence is used in the proof of this one! Not in some sort of induction, because it wasn't done by induction, but because the Jacobsthal product appears in the rule for computing a^(⊗b) on Cantor normal forms. (Because × is iterated natural addition, and when one performs a natural multiplication, the exponents of the ω's undergo natural addition.)
Let's continue on. From this we can then deduce
a^(⊗(b×c))=(a^(⊗b))^(⊗c). Yes, that's Jacobsthal multiplication, not ordinary or natural multiplication! Because that's what you get when you iterate natural addition. This relies on the above relation (and the continuity of × on the right).
So these strange hybrid operations end up having some nice relations... but, it would seem, based on two mathematical coincidences. Two related and very similar mathematical coincidences. One alone would be a bit suspect; two would already be suggestive; these practically scream that there must be something going on here, some better reason for this that I don't know.
But I've no idea how I might investigate that (other than ask on the internet and see if anyone knows -- I'm certainly not going to study it myself).