This week was Tau Day, 6/28, which celebrates the value \(\tau = 2\pi \approx 6.28\). It is a day to acknowledge that maybe we chose the wrong value for one of the most important mathematical constants. The argument originated with Bob Palais's article “Is Pi Wrong?” and was proselytized by Michael Hartl's Tau Manifesto, which argues that indeed we did. And like any great manifesto, it has drawn a counter-manifesto. The specifics of the debate are well covered by the links above and therein, so I won't pick them apart, but the existence of the debate itself is a nice opportunity to reflect on what we are really doing in mathematics.
We first need to consider what makes \(\pi\) special, although that topic is broad enough to warrant a library of its own. \(\pi\) can be defined geometrically as the ratio of the circumference of a circle to its diameter. The remarkable thing is that this ratio does not depend on the circle. \(\pi\) can also be defined in lots of other ways, such as the limit of various infinite sums and products, all of which can be proved equivalent.
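Two classical examples among many, stated here just for flavor, are the Leibniz series and the Wallis product:
\[
\frac{\pi}{4} = \sum_{n=0}^{\infty} \frac{(-1)^n}{2n+1}, \qquad \frac{\pi}{2} = \prod_{n=1}^{\infty} \frac{2n}{2n-1} \cdot \frac{2n}{2n+1}.
\]
Proving that these agree with the geometric definition is a nontrivial but classical exercise.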
\(\pi\) is also a transcendental number, meaning it is not a root of any nonzero polynomial with integer coefficients. Most real numbers that come to mind are not transcendental (that is, they are algebraic), and even if you think you have a transcendental number, good luck proving it. Loosely, we can think of \(\pi\) as the simplest-to-define “weird” number.
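For instance, \(\sqrt{2}\) is algebraic because it satisfies
\[
x^2 - 2 = 0,
\]
while the transcendence of \(\pi\) resisted proof until Lindemann established it in 1882, settling along the way the ancient question of squaring the circle.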
Except for maybe \(\tau\). \(\tau=2\pi\) isn’t too impressive, but how about “\(\tau\) is the ratio of the circumference of a circle to its radius.” That may not seem simpler until you look at the definition of a circle, which is a set of points whose distance from a given point is constant.
The definition of a circle basically requires you to define the radius, which is the constant in question, and from there you are free to define \(\tau\). But \(\pi\) requires you to define the diameter first. At best, you might think you can just define the diameter as twice the radius, in which case you are no better or worse off than when we defined \(\tau\) as twice \(\pi\). But the diameter of a point set is generally defined as the maximum distance between any two points in that set, and you need to prove a little theorem before you can claim that the diameter of a circle is twice the radius.
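In symbols, the diameter of a set \(S\) of points is
\[
\operatorname{diam}(S) = \sup\{\, d(x, y) : x, y \in S \,\},
\]
and the little theorem is that for a circle of radius \(r\) this supremum equals \(2r\), attained by pairs of antipodal points.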
The debate then rages on. Many of the arguments in the links above take standard formulas involving \(\pi\) and ask whether they look better with \(\tau\). Some do, some don't; one or the other will have some extra 2's floating around, as in the examples below.
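A few of the usual exhibits:
\[
C = 2\pi r = \tau r, \qquad A = \pi r^2 = \tfrac{1}{2}\tau r^2, \qquad e^{i\pi} = -1 \quad \text{versus} \quad e^{i\tau} = 1.
\]
The circumference and Euler's identity arguably favor \(\tau\), while the area formula picks up a factor of \(\tfrac{1}{2}\) (though tau partisans counter that this matches the familiar pattern of \(\tfrac{1}{2}gt^2\) and \(\tfrac{1}{2}mv^2\)).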
This debate can be attractive to students who are just learning what formal mathematics is. It offers a rare hint of rebellion against the formulas and definitions they may just be coming to understand in a deep way. That's a wonderful thing. This discussion has brought some healthy attention to mathematics, and you can't help learning a little mathematics by having it.
But when you advance further, you come to embrace just how arbitrary mathematical definitions are. You have a rare power in mathematics to define anything into existence at will. If I want to define \(\tau = 2\pi\), it is so defined. If I want to define a banana-cow to be the product of the perimeter of a regular polygon and the perpendicular distance from its center to one of its sides, it is so defined. What matters are the theorems you then prove. Nobody will care about \(\tau\) until you give a compelling use for it, and there is none that isn't served approximately as well by \(\pi\). Nobody will care about the banana-cow until you prove that the area of a regular polygon is one half of its banana-cow.
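And that theorem is quick to prove: a regular \(n\)-gon with side length \(s\) and apothem \(a\) decomposes into \(n\) triangles of base \(s\) and height \(a\), so
\[
A = n \cdot \tfrac{1}{2} s a = \tfrac{1}{2}(ns)a = \tfrac{1}{2} \cdot \text{perimeter} \cdot \text{apothem},
\]
which is half the banana-cow.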
So how should we name things? In practice, you usually don't give names to things until you know it is useful to do so. You want the name to be descriptive, but sometimes you need to prove a theorem about something before you really know what it is. Sometimes there's just no concise way to capture a concept and you end up naming it after somebody, like Euler's constant. Banana-cow is a terrible name, but apothem isn't too bad for the distance from the center to a side. At least it's based on some Greek roots (“put aside”), although not enough to deduce its definition. There's an art to this, and a lot of it comes from reading mathematics and getting used to its conventions. We don't always get it right before a name sticks.
Or worse, sometimes nothing sticks. One annoying inconsistency is the definition of the natural numbers. Some authors define these to be the positive integers \(1, 2, 3, \ldots\), but some prefer to include 0. Because of this disagreement, we are stuck having to confirm the definition any time the set is used. And of all definitions, you'd think natural number could draw consensus. There is a rough tendency for pure mathematicians to exclude 0, whereas the more computation-oriented include it. That's because it is often natural in programming to start counting at 0.
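A minimal illustration, for the non-programmers: in most programming languages, the positions in a list are numbered starting from 0, so counting from 0 keeps indices and counts in sync.

```python
# In Python (as in C, Java, and most languages), indices start at 0.
days = ["Mon", "Tue", "Wed"]
for i, day in enumerate(days):
    print(i, day)  # prints: 0 Mon / 1 Tue / 2 Wed

# range(n) yields 0, 1, ..., n-1 -- exactly the valid indices
# of a list of length n, which is why starting at 0 feels natural.
print(list(range(3)))  # [0, 1, 2]
```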
Maybe we got it wrong when we defined \(\pi\). If we went back and did it all over again, the simplicity argument I gave makes \(\tau\) seem the more natural choice. But I'm not part of the tau movement. In the long list of notational atrocities in conventional mathematics, this barely registers. \(\pi\) is irrevocably burned into the culture, and that's fine.
If you ask a mathematician where he or she comes down on this, the most likely response is, “Oh, not this again.”
Bill Wood, @MathProfBill, is an associate professor of mathematics at the University of Northern Iowa.