
# Hyperoperations, Distributivity, and the Unreasonable Effectiveness of Multiplication

**Anil Venkatesh**
**Adelphi University**

### Iterated Operations

Everyone knows that multiplication is iterated addition in the sense that $2 \times 3 = 2 + 2 + 2$. Similarly, exponentiation is generally introduced as iterated multiplication since $2^3 = 2 \times 2 \times 2$. What’s the name for the operation of iterated exponentiation? In 1947, R.L. Goodstein proposed the term *tetration* for the binary operation

\[(a, n) \mapsto \underbrace{a^{a^{\cdot^{\cdot^{\cdot^a}}}}}_{n}\]

where the power tower is evaluated by convention from right to left. Goodstein derived this name from the Greek word for four by thinking of addition as the first operation, multiplication as the second, and exponentiation as the third. He also coined *pentation*, *hexation*, and so on for even higher operations in the hierarchy, although it is difficult to write these down without some new notation.

Three decades later, Donald Knuth introduced up-arrow notation that made Goodstein’s operations easier to conceptualize. Putting $2 \uparrow 3 = 2^3$, we let $2 \uparrow\!\uparrow 3 = 2 \uparrow (2 \uparrow 2)$ represent the tetration of 2 by 3. It can be helpful here to slightly reframe the idea of iterated operations. Instead of thinking of $2 \times 3$ as three 2’s added together, let’s think of multiplication as an inductive process: compute $2 \times 2$ and add one final 2 afterwards. By extension, we write $a \uparrow^{n} b = a \uparrow^{n-1} (a \uparrow^{n} (b-1))$. Just how big is $2 \uparrow\!\uparrow b$? The table below shows the first several tetrations of 2.

| $b$ | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| $2 \uparrow\!\uparrow b$ | 2 | 4 | 16 | 65,536 | $2^{65536} \approx 10^{19728}$ |
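The recursion $a \uparrow^{n} b = a \uparrow^{n-1} (a \uparrow^{n} (b-1))$ can be sketched in a few lines of Python (a minimal implementation; the name `up_arrow` is ours):

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a ↑^n b, via the recursion
    a ↑^n b = a ↑^(n-1) (a ↑^n (b-1)), with ↑^1 = exponentiation."""
    if n == 1:
        return a ** b
    if b == 1:
        return a
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# The first few tetrations of 2, matching the table above:
for b in range(1, 5):
    print(up_arrow(2, 2, b))  # prints 2, 4, 16, 65536
```

Don't try `up_arrow(2, 2, 6)` at home: the result has far more digits than there are atoms in the observable universe.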

**Challenge:** Try writing down $2 \uparrow\!\uparrow 6$ in scientific notation!

The idea of hyperoperations formed by iteration is solidly motivated by the relationship of multiplication to addition. In fact, we can even identify the *successor operation* $(a, b) \mapsto a+1$ as the prior operation to addition in the hierarchy. However, a few key arithmetical properties go missing once we pass beyond multiplication. The higher operations are right-associative only by convention and lack commutativity. They also fail to distribute over the immediately preceding hyperoperation. Lastly, Goodstein's hyperoperations lack an obvious extension to $\mathbb{R}$ or $\mathbb{C}$, as well as notions of identity and inverse. To gain insight here, we need to wind the clock back 30 years before Goodstein to the time of Albert A. Bennett.

**Challenge:** Double-check that $\uparrow^{n}$ doesn’t distribute over $\uparrow^{n-1}$ for $n\geq 1$.

### Commutative Hyperoperations

In 1915, the paper “Note on an Operation of the Third Grade” by Albert A. Bennett appeared in the *Annals of Mathematics*. A terse two-page note, it was largely neglected until the early 2000s, and its contribution remains much less studied than Goodstein’s hyperoperations. In a comment thread on Math Stack Exchange, super-contributor Dave L. Renfro claims to have raised Bennett’s work from obscurity in 2001 or 2002, when he encountered it by chance while flipping through old volumes at a university library. What a find!

It seems that Bennett’s paper is the earliest known foray into the idea of operations beyond exponentiation. Bennett observed that for positive real numbers, multiplication can be rephrased in terms of addition like so: $a \times b = e^{\log(a) + \log(b)}$. He went on to examine the operation $\star_2: (a,b) \mapsto e^{\log(a) \times \log(b)}$, which he noted is an associative, commutative binary operation on the positive real numbers. We can also check that $\star_2$ distributes over multiplication:

\begin{align*}
a \star_2 (b \times c) &= e^{\log(a) \log(b \times c)} \\
&= e^{\log(a)(\log(b) + \log(c))} \\
&= e^{\log(a)\log(b) + \log(a)\log(c)} \\
&= (a \star_2 b) \times (a \star_2 c).
\end{align*}

**Challenge:** Double-check that $a \star_2 b = a^{\log b} = b^{\log a}$. Then show this to a mathematical friend. If your friends are like mine, they will be surprised that this is allowed.
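These identities are easy to test numerically. Here is a quick sketch in Python (the name `star2` is ours; any positive test values would do):

```python
import math

def star2(a, b):
    """Bennett's operation a ⋆₂ b = e^(log(a)·log(b)), for a, b > 0."""
    return math.exp(math.log(a) * math.log(b))

a, b, c = 3.0, 5.0, 7.0
# Commutativity:
assert math.isclose(star2(a, b), star2(b, a))
# Distributivity over multiplication:
assert math.isclose(star2(a, b * c), star2(a, b) * star2(a, c))
# The challenge identity a ⋆₂ b = a^(log b) = b^(log a):
assert math.isclose(star2(a, b), a ** math.log(b))
assert math.isclose(star2(a, b), b ** math.log(a))
```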

Let $\star_0$ represent addition and $\star_1$ represent multiplication. Bennett then defines $a \star_n b = e^{\log(a) \star_{n-1} \log(b)}$ for positive integers $n$, wherever the formula makes sense for real $a$ and $b$, yielding a hierarchy of commutative hyperoperations akin to but distinct from those of Goodstein. Unlike Goodstein’s hierarchy, the hyperoperations of Bennett extend infinitely *below* addition as well as above. We can define $a \star_{-1} b = \log(e^a + e^b)$, sometimes known as the *smooth max* function for its tendency to highlight the larger of the two inputs, as shown in the figure below. For any negative integer $n$, we can define $a \star_{n} b = \log(e^a \star_{n+1} e^b)$, extending the hierarchy of operations in the opposite direction too.

Contour plot of the smooth max function $x \star_{-1} y$.

**Challenge:** What is the domain of $\star_n$? For which $n$ does $\star_n$ have an identity element? Which elements have an inverse under $\star_n$?
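Bennett's full two-sided hierarchy can be sketched with a single recursion in Python (a hypothetical helper named `bennett`; domains are left to the caller, so expect `math` errors outside them):

```python
import math

def bennett(a, b, n):
    """Bennett's hyperoperation a ⋆ₙ b:
    ⋆₀ is addition; for n ≥ 1, a ⋆ₙ b = exp(log(a) ⋆ₙ₋₁ log(b));
    for n ≤ -1, a ⋆ₙ b = log(exp(a) ⋆ₙ₊₁ exp(b))."""
    if n == 0:
        return a + b
    if n > 0:
        return math.exp(bennett(math.log(a), math.log(b), n - 1))
    return math.log(bennett(math.exp(a), math.exp(b), n + 1))

# ⋆₁ recovers multiplication:
assert math.isclose(bennett(3, 5, 1), 15)
# ⋆₂ distributes over ⋆₁ (multiplication):
assert math.isclose(bennett(3, 5 * 7, 2), bennett(3, 5, 2) * bennett(3, 7, 2))
# ⋆₋₁ is the smooth max: just above the larger input.
print(bennett(2, 10, -1))  # ≈ 10.000335
```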

Surprisingly, we have that $\star_n$ distributes over $\star_{n-1}$ for all $n$. The upshot of Bennett’s work is that the exponential function gives us a hierarchy of associative, commutative operations that extend the distributive relationship of multiplication over addition infinitely in both directions. This immediately raises the question: are Bennett’s operations the only commutative hyperoperations out there?

### The Question of Uniqueness

To ask whether Bennett’s hierarchy is unique is to ask a question about the set of all binary operations (on $\mathbb{R}$, $\mathbb{C}$, or suitable subset thereof). That’s an intimidatingly rich set to sift through! Thankfully, our interest in associative, commutative operations helps to winnow the search space. Let’s drill down even further by asking a simpler question: “other than multiplication, what operations distribute over addition?”

Suppose $\star$ is a candidate alternative to multiplication. It’s fair to expect that $\star$ extends to the complex numbers, so there is an abelian group $(G, \star)$ that can be embedded into $\mathbb{C}$ in some way. Let’s add one more property of $\star$ to our wishlist: differentiability. Just as the linear map $x \mapsto 2x$ is differentiable with a derivative of $2$, we want the map $x \mapsto 2 \star x$ to be (infinitely) differentiable. When a group law is differentiable like this, it turns the group into a smooth geometric object called a *Lie group*. Since we want $\star$ to be commutative, our Lie group $(G, \star)$ is abelian.

At this point, we’ve piled on so many different properties of $\star$ that a special situation arises: while Lie groups are intricate, subtle objects in general, *abelian* Lie groups are all built out of addition in a certain sense. In our case, what this means is that there’s a *surjective* group homomorphism $f: (\mathbb{C}, +) \to (G, \star)$ so that

\begin{align*}
f(a) \star f(b) &= f(a+b)\textrm{, hence} \\
a \star b &= f(f^{-1}(a) + f^{-1}(b)).
\end{align*}

Wait a minute! I only said $f$ was surjective, not bijective. So how come I get to use $f^{-1}(a)$? Here, that notation means *preimage*, i.e., the set of all complex numbers that $f$ maps onto $a$. The upshot is that $\star$ is built out of addition suitably combined with a mystery function $f$. We’ll now see that there are very few options for this mystery function.

Plot of the universal covering of $\mathbb{C} \backslash \{0\}$. Note that the helicoid is just a twisted-up copy of $\mathbb{C}$.

The function $f$ is an example of a *covering map* that relates the flat space $(\mathbb{C}, +)$ to the space $(G, \star)$. The word “covering” is a geometric metaphor for surjectivity, i.e., the fact that every point in $G$ is “covered” by a point in $\mathbb{C}$. If $\star$ is just standard multiplication, the group $G$ is $\mathbb{C} \backslash \{0\}$ and $f(z) = \exp(z)$, the familiar exponential map. The figure above shows how the complex plane can be twisted into a helicoid shape, thereby covering every point in $\mathbb{C} \backslash \{0\}$ infinitely many times. In this figure, the function $f$ is the projection that squashes the helicoid flat onto itself, like compressing a spring.

The only degree of freedom among universal covering maps $\mathbb{C} \to \mathbb{C} \backslash \{0\}$ is where to send the identity element. In the case of standard multiplication on $\mathbb{C} \backslash \{0\}$, we have $f(0) = 1$ because $f(z) = \exp(z)$. If we instead put $f(z) = \alpha \exp(z)$, this induces the group law $a \star b = \frac{1}{\alpha} ab$, which is just multiplication but with multiplicative identity $\alpha$ instead of 1.
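We can verify this induced group law numerically. A small sketch (the helpers `f` and `star` are ours, working on positive reals to keep the logs single-valued):

```python
import math

ALPHA = 2.0  # where the covering map sends the identity

def f(z):
    """Covering map f(z) = α·exp(z), so f(0) = α."""
    return ALPHA * math.exp(z)

def star(a, b):
    """Induced law a ⋆ b = f(f⁻¹(a) + f⁻¹(b)), which works out to ab/α."""
    return f(math.log(a / ALPHA) + math.log(b / ALPHA))

assert math.isclose(star(3, 5), 3 * 5 / ALPHA)   # a ⋆ b = ab/α
assert math.isclose(star(ALPHA, 7), 7)           # α is the identity
assert math.isclose(star(3, 5), star(5, 3))      # still commutative
```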

Another option is to put $G = \mathbb{C}$. This imposes very strict limitations on $f$, however. Since $\mathbb{C}$ is its own universal covering, we aren’t allowed to twist it onto itself at all so $f$ has to be injective as well as surjective. The only functions with this property are of the form $f(z) = \alpha z + \beta$.

**Challenge:** Show that when $G = \mathbb{C}$ we must have $a \star b = a + b - \beta$ for some constant $\beta \in \mathbb{C}$. Then confirm that this definition of $\star$ fails to distribute over addition.

And that’s it! The only other option for $G$ is a torus, but this can’t be embedded into the complex plane so there’s no way to import its group law into $\mathbb{C}$. The upshot is that multiplication is the only operation that distributes over addition, at least if you care about things like associativity, commutativity, and complex-differentiability. Open questions may result if one or more of these conditions is relaxed!

**One last challenge for the road:** We’ve shown that the exponential map (more or less) uniquely encodes the distributive property of multiplication over addition. Does this imply that Bennett’s hierarchy of commutative hyperoperations is similarly unique?

### Further Reading

- Bennett’s original paper (1915)
- Goodstein’s paper (1947)
- Knuth’s up-arrow paper (1976)
- A fascinating 11-year-old post on Math Stack Exchange that’s still provoking new responses