Sometimes an invention is so profound and significant yet apparently obvious in retrospect that it is difficult to imagine how people around the world struggled over millennia to discover it, and how slow it was to diffuse from its points of origin into general use. Such is the case for our modern decimal system of positional notation for numbers and the notation for algebra and other fields of mathematics which permits rapid calculation and transformation of expressions. This book, written with the extensive source citations of a scholarly work yet accessible to any reader familiar with arithmetic and basic algebra, traces the often murky origins of this essential part of our intellectual heritage.

From prehistoric times humans have had the need to count things, for example, the number of sheep in a field. This could be done by establishing a one-to-one correspondence between the sheep and something more portable, such as one’s fingers (for a small flock) or pebbles kept in a sack. To determine whether any sheep were missing, one need only remove a pebble for each sheep: any pebbles remaining in the sack indicate how many are absent. At a slightly more abstract level, one could make tally marks on a piece of bark or a clay tablet, one for each sheep. But none of this implies number as an abstraction independent of the individual items being counted. Ancestral humans don’t seem to have required more than the simplest notion of numbers: until the middle of the 20th century several tribes of Australian aborigines had no words for numbers in their languages at all, but counted things by making marks in the sand. Anthropologists discovered tribes in remote areas of the Americas, Pacific Islands, and Australia whose languages had no words for numbers greater than four.

With the emergence of settled human populations and the increasingly complex interactions of trade between villages and eventually cities, a more sophisticated notion of numbers was required. A merchant might need to compute how many units of one good to exchange for another and to keep records of his inventory of various items. The earliest known records of numerical writing are Sumerian cuneiform clay tablets dating from around 3400 B.C. These tablets show number symbols formed from two distinct kinds of marks pressed into wet clay with a stylus. While the smallest numbers seem clearly evolved from tally marks, the numbers from 1 to 59 were formed by combinations of the two symbols, and larger numbers were written as groups of powers of 60 separated by spaces. This was the first known instance of a positional number system, but there is no evidence it was used for complicated calculations; it served simply as a means of recording quantities.
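The positional principle at work here can be sketched in a few lines of Python. This is a modern idealisation of the idea, not a reconstruction of Sumerian practice: each group of marks counts a power of 60 determined by its position.

```python
def from_base60(groups):
    """Evaluate a sequence of base-60 digit groups, most significant
    first, as a single number: each position counts a power of 60.

    A modern idealisation: the Sumerian notation had no zero, so an
    empty position was indicated only by spacing."""
    value = 0
    for g in groups:
        if not 0 <= g < 60:
            raise ValueError("each group must represent 0..59")
        value = value * 60 + g
    return value

# The group sequence 1,25 in base 60 is 1 * 60 + 25:
print(from_base60([1, 25]))       # 85
# 1,25,30 is 1 * 3600 + 25 * 60 + 30:
print(from_base60([1, 25, 30]))   # 5130
```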

Ancient civilisations (Egypt, the Hebrews, Greece, China, Rome, and the Aztecs and Maya in the Western Hemisphere) all invented ways of writing numbers, some sophisticated and capable of representing large quantities. Many of these systems were *additive*: they used symbols, sometimes derived from letters of their alphabets, and composed numbers by writing symbols which summed to the total. To write the number 563, a Greek would write “φξγ”, where φ = 500, ξ = 60, and γ = 3. By convention, numbers were written with letters in descending order of the value they represented, but the system was not positional. This made it clumsy for representing large numbers: letters were reused with accent marks to denote thousands, and an entirely different convention was required for ten thousands.

How did such advanced civilisations get along using number systems in which it is almost impossible to compute? Just imagine a Roman faced with multiplying MDXLIX by XLVII (1549 × 47)—where do you *start*? You don’t: all of these civilisations used some form of mechanical computational aid (an abacus, counting rods, stones in grooves, and so on) to actually manipulate numbers. The Sun Zi Suan Jing, dating from fifth-century China, provides instructions (algorithms) for multiplication, division, and the extraction of square and cube roots using bamboo counting sticks (or written symbols representing them). The result of the computation was then written using the numerals of the language. The written language was thus a way to record numbers, but not to compute with them.

Many of the various forms of numbers, and especially computational tools such as the abacus, came ever so close to stumbling on the place value system, but it was in India, probably before the third century B.C., that a positional decimal number system emerged, including zero as a place holder and with digit forms recognisably ancestral to those we use today. This was a breakthrough in two regards. Now, by memorising tables of addition, subtraction, multiplication, and division, and the simple algorithms once learned by schoolchildren before calculators supplanted that part of their brains, it was possible to compute directly from written numbers. (Despite this, the abacus remained in common use.) But, more profoundly, this was a *universal* representation of whole numbers. Earlier number systems (with the possible exception of that invented by Archimedes in The Sand Reckoner [but never used practically]) either had a limit on the largest number they could represent or required cumbersome and/or lengthy conventions for large numbers. The Indian number system needed only ten symbols to represent *any* non-negative number, and only the single convention that each digit counts the power of ten corresponding to its position.
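The digit-by-digit procedure schoolchildren memorise can be made concrete with a short Python sketch (my illustration, not the book’s): schoolbook long multiplication, using only the single-digit multiplication table plus carries, applied to the Roman-numeral example above.

```python
def long_multiply(a_digits, b_digits):
    """Schoolbook long multiplication on lists of decimal digits,
    least significant digit first, using only memorised single-digit
    products and carries."""
    result = [0] * (len(a_digits) + len(b_digits))
    for i, a in enumerate(a_digits):
        carry = 0
        for j, b in enumerate(b_digits):
            total = result[i + j] + a * b + carry
            result[i + j] = total % 10   # digit to write down
            carry = total // 10          # carry to the next column
        result[i + len(b_digits)] += carry
    while len(result) > 1 and result[-1] == 0:
        result.pop()                     # trim leading zeros
    return result

# 1549 × 47 (MDXLIX × XLVII), digits least significant first:
print(long_multiply([9, 4, 5, 1], [7, 4]))  # [3, 0, 8, 2, 7], i.e. 72803
```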

Knowledge diffused slowly in antiquity, and despite India’s position on active trade routes, it was not until the 13th century A.D. that Fibonacci introduced the new number system, which had been transmitted to him via Islamic scholars writing in Arabic, to Europe in his Liber Abaci. This book not only introduced the new number system; it provided instructions for a variety of practical computations and applications to higher mathematics. As revolutionary as this book was, in an era of hand-copied manuscripts its influence spread very slowly, and it was not until the 16th century that the new numbers came into almost universal use. The author describes this protracted process, about which a great deal of controversy remains to the present day.

Just as the decimal positional number system was becoming established in Europe, another revolution in notation began which would transform mathematics, how it was done, and our understanding of the meaning of numbers. Algebra, as we now understand it, was known in antiquity, but it was expressed in a rhetorical way—in words. For example, proposition 4 of book 2 of Euclid’s Elements states:

If a straight line be cut at random, the square of the whole is equal to the squares on the segments and twice the rectangle contained by the segments.

Now, given such a problem, Euclid or any of those following in his tradition would draw a diagram and proceed to prove from the axioms of plane geometry the correctness of the statement. But it isn’t obvious how to apply this identity to other problems, or how it illustrates the behaviour of general numbers. Today, we’d express the problem and proceed as follows:
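In modern notation (a reconstruction, writing the two segments of the cut line as *a* and *b*), the proposition becomes a routine expansion:

```latex
(a+b)^2 = (a+b)(a+b) = a^2 + ab + ba + b^2 = a^2 + 2ab + b^2
```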

Once again, faced with the word problem, it’s difficult to know where to begin, but once expressed in symbolic form, it can be solved by applying rules of algebra which many master before reaching high school. Indeed, the process of simplifying such an equation is so mechanical that computer tools are readily available to do so.
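To make that concrete, here is a minimal sketch using Python’s SymPy library (my choice of tool; the post doesn’t name one), expanding the square of the cut line from Euclid’s proposition entirely mechanically:

```python
from sympy import symbols, expand

# The line is cut into segments a and b, so the whole line is a + b.
a, b = symbols("a b")

# Expanding the square on the whole yields the squares on the
# segments plus twice the rectangle contained by them.
result = expand((a + b) ** 2)
print(result)  # a**2 + 2*a*b + b**2

assert result == a**2 + 2*a*b + b**2
```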

Or consider the following brain-twister posed in the 7th century A.D. about the Greek mathematician and father of algebra Diophantus: how many years did he live?

“Here lies Diophantus,” the wonder behold.

Through art algebraic, the stone tells how old;

“God gave him his boyhood one-sixth of his life,

One twelfth more as youth while whiskers grew rife;

And then one-seventh ere marriage begun;

In five years there came a bounding new son.

Alas, the dear child of master and sage

After attaining half the measure of his father’s life, chill fate took him.

After consoling his fate by the science of numbers for four years, he ended his life.”

Oh, *go ahead*, give it a try before reading on!

Today, we’d read through the problem and write a system of two simultaneous equations, where *x* is the age of Diophantus at his death and *y* the number of years his son lived. Then:
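Reconstructed in modern notation (these match the worked solution in the comments), the two equations are:

```latex
\begin{aligned}
x &= \left(\tfrac{1}{6} + \tfrac{1}{12} + \tfrac{1}{7}\right)x + 5 + y + 4 \\
y &= \tfrac{x}{2}
\end{aligned}
```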

Plug the second equation into the first, do a little algebraic symbol twiddling, and the answer, 84, pops right out. Note that not only are the rules for solving this equation the same as for any other, with a little practice it is easy to read the word problem and write down the equations ready to solve. Go back and re-read the original problem and the equations and you’ll see how straightforwardly they follow.
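The symbol twiddling can even be handed to a computer algebra system entirely; here is a sketch using Python’s SymPy (again my choice of tool, not something from the book):

```python
from sympy import Eq, Rational, solve, symbols

# x: Diophantus's age at death; y: the years his son lived.
x, y = symbols("x y")

# Boyhood 1/6 of his life, youth 1/12, another 1/7 ere marriage,
# 5 years to the son's birth, y years of the son's life, and
# 4 years of mourning afterward.
epitaph = Eq(x, (Rational(1, 6) + Rational(1, 12) + Rational(1, 7)) * x + 5 + y + 4)
# The son attained half the measure of his father's life.
son = Eq(y, x / 2)

ages = solve([epitaph, son], [x, y])
print(ages)  # {x: 84, y: 42}
```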

Once you have transformed a mass of words into symbols, they invite you to discover new ways in which they apply. What is the solution of the equation *x*+4=0? In antiquity many would have said the equation is meaningless: there is no number you can add to four to get zero. But that’s because their conception of number was too limited: negative numbers such as −4 are completely valid and obey all the laws of algebra. By admitting them, we discovered we’d overlooked half of the real numbers. What about the solution to the equation *x*² + 4 = 0? This was again considered ill-formed, or *imaginary*, since the square of any real number, positive or negative, is positive. Another leap of imagination, admitting the square root of minus one to the family of numbers, expanded the number line into the complex plane, yielding the solutions ±2*i* as we’d now express them, and extending our concept of number into one which is now fundamental not only in abstract mathematics but also science and engineering. And in recognising negative and complex numbers, we’d come closer to unifying algebra and geometry by bringing rotation into the family of numbers.

This book explores the groping over centuries toward a symbolic representation of mathematics which hid the specifics while revealing the commonality underlying them. As one who learned mathematics during the height of the “new math” craze, I can’t recall a time when I didn’t think of mathematics as a game of symbolic transformation of expressions which may or may not have any connection with the real world. But what one discovers in reading this book is that while this is a concept very easy to brainwash into a 7th grader, it was extraordinarily difficult for even some of the most brilliant humans ever to have lived to grasp in the first place. When Newton invented calculus, for example, he always expressed his “fluxions” as derivatives with respect to time, and did not write of the general derivative of a function of arbitrary variables.

Also, notation is *important*. Writing something in a more expressive and easily manipulated way can reveal new insights about it. We benefit not just from the discoveries of those in the past, but from those who created the symbolic language in which we now express them.

This book is a treasure chest of information about how the language of science came to be. We encounter a host of characters along the way, not just great mathematicians and scientists, but scoundrels, master forgers, chauvinists, those who preserved precious manuscripts and those who burned them, all leading to the symbolic language in which we so effortlessly write and do mathematics today.

**Mazur, Joseph. Enlightening Symbols. Princeton: Princeton University Press, 2014. ISBN 978-0-691-15463-3.**

Wait, does that mean Europe used Roman numerals all the way up until the 16th Century?!

I had no idea.

Is it any *wonder* science didn’t take off until then?

You seem to have forgotten an ‘s’ there, John. There is not one single math notation; there are probably dozens. Each branch of applied and theoretical math does things slightly differently (which is why reading papers from other disciplines is such a pain). Math as a universal language is a nice idea, but simply isn’t true in the real world.

The replacement of Roman numerals with Hindu-Arabic numerals was a gradual process, radiating outward from sources who possessed copies of Fibonacci’s manuscript and derivative works from the 14th century, then exploding in the latter half of the 16th century when printing enabled the wide distribution of books using the new numerals. Scholars who read the works of the abacists (which didn’t, at the time, mean users of the abacus, but rather those knowledgeable in methods of computation) adopted the decimal system long before merchants who used the abacus and kept their accounts in Roman numerals.

Science became quantitative and mathematical just around the time efficient computation and the beginnings of algebra became common. There are probably dozens of Ph.D. theses waiting to be done elucidating the details of this in rare book vaults in libraries in Europe. (I’m serious about this. When the author examined the oldest copy of Euclid’s *Elements* in the Bodleian Library at Oxford, he had to sign a log of people who had read the manuscript. He was startled, and humbled, to discover his signature was 12 lines below that of some fellow named Isaac Newton.)

I’m not sure what you’re getting at there. I wrote (your emphasis) “notation is *important*”. Is it not? Does it matter that there are multiple notations for things? Those working in general relativity express their equations in Cartan’s differential forms, while chemists have no idea what they’re talking about and use their own language of molecular orbitals to describe the systems they model. The notation we use to describe particle collisions at the LHC has nothing to do with that used by population biologists to model genetic drift in isolated species. And yet I think that after a few beers they’d probably agree they were all using the same fundamental mathematics to model very different things.

If you read Einstein’s papers from 1905, he used substantially different notation when doing statistical mechanics, electrodynamics/mechanics, and quantum mechanics. Yet all were founded in the common heritage of Western mathematics.

I’m sorry, but I don’t buy that. Just because something has happened does not mean it is a good thing. As a normative matter, scholars in related fields should be able to read each other’s papers without having to learn a new system of math notation. It’s ridiculous and, frankly, a scandal that this isn’t the case.

The quality of scientific research has been falling for decades; do you seriously believe our modern notational Tower of Babel has nothing to do with it?

The most hideous example I ran across was something like this:

Where it turned out the equation was really:

For some reason, the first division was left out (I saw this in multiple papers, too).

I just haven’t seen this problem with scientists in adjacent fields being unable to read each others’ papers. Certainly people working in general relativity use different notation than researchers in genetics, but who cares?—they’re working on very different things. When each writes a differential equation, the other understands it perfectly.

Where is the objective evidence for this notational Tower of Babel? If you read the arXiv, researchers in a multitude of fields, from biology and physics to quantitative finance, all communicate in the same mathematical language with no difficulty.

But not, however, the field of applied engineering, with its (many) subdivisions. Computer graphics researchers do not always write papers comprehensible to, say, electrical engineers, even in cases where their two fields intersect. I find this immensely frustrating in my own research.

Engineering papers became much more readable to me once I stopped taking the math seriously and began thinking of it as more shorthand than math.

Take polynomial curves. They are clearly inferior for scientific modelling, all the pieces for their replacement have been out there for decades (starting with Euler’s original spline theory), and there are enough motivated researchers seeking alternatives that this should have been solved decades ago. And yet, when I read the research, every five-to-ten years the literature basically resets to zero.

Why is the scientific process failing on something so basic? Why can’t these researchers build on each other’s work? My own research probably represents the state of the art in this area, and yet it is so simple that if I traveled back in time and explained computers to Carl Gauss, I’m sure my method would be the first thing to pop into his head. In five minutes he probably would have thought of something even better.

*The Nothing that Is: A Natural History of Zero* by Robert Kaplan is a pretty good book. It covers, among other things, just how hard many very intelligent people found it to wrap their heads around nothing for metaphysical reasons. If God created all that is, who created all that isn’t?

Sorry, the poem of Diophantus’ age was wrecked by Ricochet 2.0 formatting after being posted to the Main Feed. I have reformatted it as best I can, with blank spaces between lines, since only they are preserved across a save of the post.

Timmy, just leave our posts alone!

Oh, I think I get it. The placeholder for the embedded video is empty, zero.

An interesting summary, but I’m not sure this was so “obvious”, even in retrospect.

It has??

My wife homeschooled our kids and used the Singapore Math series through the elementary grades. The workbooks were full of word problems like: Sam had twice as many ducks as Harvey, who had 3 more chickens than Sally, who had 5 fewer geese than Sam; if Sally gave half her geese to Harvey, Harvey would have 9 fowl; how many geese did Sam have? (not a real problem) Some of these problems stumped my wife and me. Although we could easily solve them with algebraic notation and simultaneous equations, knowledge of algebra was not expected. Instead, the problem was usually solved by using diagrams of blocks. It just emphasized how using notation can take a concrete problem and turn it into an abstraction to allow easy manipulation.

In a similar vein, I tried to help my middle daughter get through Calculus III last semester via phone calls and texted photos of her homework assignments. The notation of her textbook was different enough from what I had learned 40 years ago that I couldn’t even explain to her what the question was asking. I ended up buying a used copy of the textbook for myself just so we could talk the same language.

For those who haven’t done algebra since high school and may be a tad rusty, here’s how you get the answer out of the Diophantus problem in the main post. I’ll write the equations in-line rather than typesetting them with LaTeX because this is just a comment and I’m lazy.

We start with the two simultaneous equations:

x = (1/6 + 1/12 + 1/7)x + 5 + y + 4

y = x/2

Substitute for *y* in the first equation.

x = (1/6 + 1/12 + 1/7)x + 5 + x/2 + 4

Now, we want to find *x* which satisfies this equation, so subtract *x* from both sides, yielding the following linear equation which we will proceed to solve for *x*.

0 = (1/6 + 1/12 + 1/7)x + 5 + x/2 + 4 − x

Now make the terms in *x* a bit more clear by expanding them out.

0 = (1/6 + 1/12 + 1/7)x + 5 + (1/2)x + 4 + (−1)x

We can then collect the terms in *x* and the constant terms as follows.

0 = (1/6 + 1/12 + 1/7 + 1/2 − 1)x + (5 + 4)

Remembering how to add and reduce fractions, we get:

0 = (−3/28)x + 9

Subtract 9 from both sides.

−9 = (−3/28)x

Multiply both sides by 28.

−252 = −3x

Divide both sides by −3.

84 = x

Diophantus thus lived 84 years. Substituting 84 for *x* into the second original equation gives *y* = 42, the years his son lived.

Now to try a fun experiment:

$$\frac{1}{\Bigl(\sqrt{\phi \sqrt{5}}-\phi\Bigr) e^{\frac25 \pi}} =$$

(Oh, great, quoting comments is now broken, at least for me.)

Gödel’s Ghost wrote:

My understanding is that in order to use MathJax under WordPress, you have to install a plug-in at the administrator level.

WordPress blocks scripts in user-submitted HTML (as it should, otherwise this would be a huge security hole), and so including the script explicitly in a post or comment doesn’t work.

Installing the plug-in is no big deal, but it’s probably too much to hope for when after months they still can’t get the date right on edited member feed posts or get alerts working correctly. I’ve just been using TeX to PNG and including the resulting images, but given how fussy image insertion is, it’s irritating if you have more than a few equations.

John, one of my favorite classes in undergraduate school was History of Mathematics.

I enjoyed this post (and the comments) very much.

Thanks.

We need a Mathematica plug-in at Ricochet. I can’t do my graphs without it!!

I think that most of mankind’s truly great inventions are information based: spoken language, written language, the idea of tools, the knowledge of how to build a fire, numbers, arithmetic, geometry, printing, double-entry bookkeeping, calculus, computers. Many of these inventions are based on using “symbols” (symbolic sounds, in the case of language) to represent things or ideas.

As usual, I only understood about half of it, but thoroughly enjoyed all of it and am grateful for all you mathematically gifted writers/readers/doers who make things work!

Great post, John!