...it could be nothing. That's right, absolutely nothing.
How can there be nothing? How can something have nothing? You know, early civilizations never used the number 0 because they didn't know how to express nothing of something.
The history of the number zero:
The numeral or digit zero is used in numeral systems where the position of a digit signifies its value. Successive positions of digits have higher values, so the digit zero is used to skip a position and give appropriate value to the preceding and following digits.
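That place-holder idea can be shown in a few lines of Python (my own illustrative sketch, not something from the original post; the function name and digit lists are made up for the example):

```python
def positional_value(digits, base=10):
    """Combine a list of digits (most significant first) into one number."""
    value = 0
    for d in digits:
        value = value * base + d  # each step shifts earlier digits up one position
    return value

# In "205", the zero skips the tens position so the 2 keeps its hundreds value:
print(positional_value([2, 0, 5]))  # 2*100 + 0*10 + 5 = 205

# Without a zero placeholder, the same nonzero digits collapse to a smaller number:
print(positional_value([2, 5]))     # 2*10 + 5 = 25
```

The same function works for the Babylonians' base-60 system by passing `base=60`, which is exactly why they eventually needed a symbol to mark an empty position.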
By the mid second millennium BC, Babylonians had a sophisticated sexagesimal positional numeral system. The lack of a positional value (or zero) was indicated by a space between sexagesimal numerals. By 300 BC a punctuation symbol (two slanted wedges) was co-opted as a placeholder in the same Babylonian system.
The Ancient Greeks were unsure about the status of zero as a number: they asked themselves "how can 'nothing' be something?", leading to interesting philosophical and, by the Medieval period, religious arguments about the nature and existence of zero and the vacuum. The paradoxes of Zeno of Elea depend in large part on the uncertain interpretation of zero.
By AD 130, Ptolemy, influenced by Hipparchus and the Babylonians, had begun to use a symbol for zero (a small circle with a long overbar) within a sexagesimal system otherwise using alphabetic Greek numerals. Because it was used alone, not just as a placeholder, this Hellenistic zero was the first true zero in the Old World. In later Byzantine manuscripts of his Syntaxis Mathematica (Almagest), the Hellenistic zero had morphed into the Greek letter omicron (usually meaning 70).
But the late Olmec had already begun to use a true zero (a shell glyph) in the New World several centuries before Ptolemy (possibly by the fourth century BC but certainly by 40 BC), and it became an integral part of Maya numerals. Another true zero was used in tables alongside Roman numerals by 525 (first known use by Dionysius Exiguus), but as a word, nulla, meaning nothing, not as a symbol. When division produced zero as a remainder, nihil, also meaning nothing, was used. These medieval zeros were used by all future computists (calculators of Easter). An isolated use of their initial, N, in a table of Roman numerals by Bede or a colleague about 725 was a true zero symbol.
The first decimal zero was introduced by Indian mathematicians about AD 300. An early study of the zero by Brahmagupta dates to 628. By this time it was already known in Cambodia, and it later spread to China and the Islamic world, from where it reached Europe in the 12th century.
The word zero (as well as cipher) comes from Arabic sifr, meaning "empty".
I think I have gone on enough tangents for one day.
Edit:
Posted by: PainedCypress
lol, I just have to say this is probably the most intelligent conversation going on in the flood right now.
And it all started with a random, off-the-wall post.
[Edited on 2/8/2005 4:26:15 AM]