Why are Arabic numerals used so commonly, even in languages that use different alphabets?

“Arabic” numerals as actually written in Arabic script don’t look much like the Western digits 0, 1, …, 9.

This is top-of-the-head, but I’d guess that an ‘Arabic’ numeral would tend to resemble Arabic script, while the Western digits look a great deal like Western alphabetic characters. The kinship becomes obvious when you put the two alphabets and number sets side by side.

Instead, we use ‘Arabic’ notation. The Romans gave numeric meanings to letters of their alphabet: I is 1, V is 5, X is 10, L is 50, C is 100, D is 500, and M is 1,000. Doing math with letters got complicated fast; dividing 1113 by 3 to get 371 really means dividing MCXIII by III to get CCCLXXI. Roman notation could express whole numbers, but it offered no tools to add, subtract, multiply, or divide.
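To make the contrast concrete, here’s a minimal sketch (not from the original post; the function name roman_to_int is my own) of what has to happen before any arithmetic on Roman numerals: translate the letters into a positional number first, do the math there, and only then translate back.

```python
# Letter values as listed above.
ROMAN_VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(numeral: str) -> int:
    """Convert a Roman numeral to an integer, honoring the subtractive rule."""
    total = 0
    for i, ch in enumerate(numeral):
        value = ROMAN_VALUES[ch]
        # A smaller value written before a larger one (e.g. IV = 4) is subtracted.
        if i + 1 < len(numeral) and value < ROMAN_VALUES[numeral[i + 1]]:
            total -= value
        else:
            total += value
    return total

# The division from the post: MCXIII / III, i.e. 1113 / 3 = 371 (CCCLXXI).
print(roman_to_int("MCXIII") // roman_to_int("III"))  # 371
```

Notice that the division itself happens in positional arithmetic; the Roman letters only serve as a way of writing the inputs and the answer down.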

The idea that zero could deserve its own symbol shocked set-in-their-ways Europeans until, somewhere around 1000 or 1100 CE, Arabic notation, i.e. using base ten and just ten symbols to express a number of any size, finally broke through tradition.
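That “ten symbols for any size number” trick is positional notation: each digit’s contribution is its value times a power of ten, with zero holding empty places. A small sketch (my own illustration, not from the post):

```python
def decompose_base_ten(n: int) -> str:
    """Write n as a sum of digit * 10^place terms, most significant first."""
    digits = str(n)
    terms = [f"{d}*10^{len(digits) - 1 - i}" for i, d in enumerate(digits)]
    return " + ".join(terms)

print(decompose_base_ten(1113))  # 1*10^3 + 1*10^2 + 1*10^1 + 3*10^0
```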

“If God had meant man to fly, He’d have given him wings” vs. “If God had known his children would adopt heathen arithmetic, He’d have started a second Flood.” That kind of resistance took a while to overcome.

Can you imagine the inventor of the logarithm, or the early astronomers who calculated the orbits of the planets, doing that work with letters and without a decimal point?
