If your persona is European, lives before approximately 1400 CE, and is not from Spain, then the above question is not only a mathematical one but a philosophical one as well.

To start with, those regions of Europe that were not under Moorish rule continued to use the old Roman numerals until well into the fifteenth century. Although the first recorded European use of Arabic numerals (which should, more correctly, be termed Arabic-Indian numerals) was the *Codex Vigilanus*, copied by a Spanish monk in 976 CE, this is an isolated appearance in a copied text rather than evidence of their use in arithmetic and mathematical calculation.
This continued use of Roman numerals is one of the major reasons why Arab scholars made great advances in mathematics while their Christian counterparts did relatively little. Roman numerals are very easy to use for simple arithmetic; adding XII, XIIII, and XVI is straightforward: you have seven Is, which make a V with two Is left over. You now have two Vs, which make an X with nothing left over. Finally you have four Xs, so your solution is XXXXII. In fact, you never have to count higher than five, which can be done on the fingers. This made Roman numerals very popular with merchants and accountants.
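The tally-and-carry addition described above can be sketched in a few lines of code. This is a hypothetical illustration, not a period method: it assumes purely additive notation (XIIII rather than XIV, as in the example), pools the symbols from all the addends, and then carries upward exactly as described, five Is becoming a V, two Vs an X, and so on.

```python
# Symbol values, smallest to largest (additive notation only, no
# subtractive forms like IV or IX).
ORDER = ["I", "V", "X", "L", "C", "D", "M"]
# How many of each symbol make one of the next: 5 Is -> V, 2 Vs -> X, ...
CARRY_SIZES = [5, 2, 5, 2, 5, 2]

def add_roman(*numerals: str) -> str:
    # Step 1: pool the symbols from every addend into one tally.
    counts = {symbol: 0 for symbol in ORDER}
    for numeral in numerals:
        for symbol in numeral:
            counts[symbol] += 1
    # Step 2: carry upward. Note we never count past five per symbol.
    for i, size in enumerate(CARRY_SIZES):
        counts[ORDER[i + 1]] += counts[ORDER[i]] // size
        counts[ORDER[i]] %= size
    # Step 3: write the result largest symbol first.
    return "".join(symbol * counts[symbol] for symbol in reversed(ORDER))

print(add_roman("XII", "XIIII", "XVI"))  # XXXXII
```

Notice that no step in the function ever involves a number larger than the carry size, which is the finger-counting convenience the paragraph describes.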

The problem arises when you try to do anything more complicated than addition or subtraction. Multiplying XII by XVII, for example, is not an easy task. Western mathematics, then, was hampered by a notation system that did not lend itself to complex procedures, and without a good notation system to communicate mathematical discoveries, Western mathematicians made slow progress.

Arab mathematicians, however, had regular contact with Indian mathematicians, and began to adopt their superior calculation methods. The first text citing this is often considered to be al-Khwarizmi's *Hindu Art of Reckoning*, although the original document is lost and the usual source (a twelfth-century Latin translation) contains many features that have been changed from the original. This text certainly describes the Indian system using the digits 1, 2, 3, 4, 5, 6, 7, 8, 9 and the introduction of 0 as a placeholder.
The use of a placeholder makes complex mathematics much simpler. Originally the placeholder was a dot, so the number one hundred and four would have been written as 1.4: one hundred, no tens, and four ones. For clarity's sake, the dot grew larger and became the zero, allowing the number to be written as 104.
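A tiny (and entirely modern) sketch shows why the placeholder matters: in a positional system, each digit's meaning depends on its column, and without a zero to hold an empty column, "14" and "104" would be written identically.

```python
def digits_to_number(digits):
    # Digits are given most-significant first, e.g. [1, 0, 4] -> 104.
    # Each step shifts the running value one column left (times ten)
    # before adding the next digit.
    value = 0
    for d in digits:
        value = value * 10 + d
    return value

print(digits_to_number([1, 0, 4]))  # 104: the zero holds the tens column
print(digits_to_number([1, 4]))    # 14: drop the placeholder, get a different number
```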

Although there are indications that the Arabic/Indian numerals were used in Spain during the tenth century in the form of the Ghubar numerals, it was Leonardo of Pisa (more commonly known as Fibonacci) who is generally credited with introducing the numerals (including zero) in his popular text *Liber Abaci* in 1202.
I mentioned above some of the mathematical disadvantages of using Roman numerals, but at the beginning of this article I mentioned philosophical problems as well. In fact, an edict of 1299 forbade the bankers of Florence to use the new Arabic numerals, because zero was a theologically uncomfortable number. How could you represent nothingness? Could there even be such a thing as nothingness? After all, God was omnipresent and had created the world from nothingness (the void), so nothingness could no longer exist. And if it couldn't exist, then you didn't need a symbol for it, and working with such a symbol proclaimed that there were things over which God had no control. This was, of course, heresy.

In fact, the bankers of Florence secretly continued to use the new numerals to perform their calculations, and the Arabic word for the numeral zero, *sifr*, became the root of the modern English word *cipher* as a result.