Start at 0 or 1

Publication date: Feb 22, 2021 1:11:11 PM

Most programming languages, including very recent ones, are zero-based.

Why one-based?

Why would 1 be a better starting index than zero? Why not 2, or 10? The answer itself is interesting, because it shows a lot about the thought process of the people defending the idea.

The first argument is that it's more natural, because the 1st usually comes before all the others, at least for the majority of people...

The number-one argument is that the last index is also the size of the array...
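For concreteness, here is a minimal C++ sketch of what that argument amounts to (the size and the loop bodies are purely illustrative): with a one-based convention the last index equals the element count, while the zero-based convention iterates over the half-open range [0, size), so the size never appears as an index.

```cpp
#include <iostream>

int main() {
    const int size = 5;

    // One-based convention: indices run 1..size, so the last index equals the size.
    for (int i = 1; i <= size; ++i)
        std::cout << "one-based index " << i << '\n';

    // Zero-based convention: the half-open range [0, size) covers exactly
    // size elements, and the last index is size - 1.
    for (int i = 0; i < size; ++i)
        std::cout << "zero-based index " << i << '\n';
    return 0;
}
```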

I'm still impressed by the "quality" of the reasons I usually hear for this kind of argument... And even more so when I'm reminded that...

Why not zero-based?

... "One-based" notations are left-overs from the western culture that ignored the existence of zero for centuries, if not more.

Believe it or not, the original Gregorian calendar goes -3, -2, -1, 1, 2, 3... Try to imagine the problems this contributed to Western science (for example, count the years from 1st January -2 to 1st January 2: walking through -2, -1, 1, 2 gives three years, while plain subtraction, 2 - (-2), gives four, so the original Gregorian calendar conflicts with something as simple as subtraction...).

Keeping to one-based arrays is like (well, I'll be downmodded for that... ^_^ ...) keeping to miles and yards in the 21st century...

Why zero-based? Because it's math!

First (Oops... Sorry... I'll try again)

Zero. Zero is nothing, one is something. And some religious texts hold that "In the beginning, there was nothing". Some computer-related discussions can be as heated as religious debates, so this point is not as off-topic as it seems... ^_^

First, it's easier to work with a zero-based array and ignore its zeroth value than to work with a one-based array and hack around to find its zeroth value. This reason is almost as stupid as the previous one, but then, the original argument in favor of one-based arrays was quite a fallacy, too.

Second, let's remember that when dealing with numbers, chances are high you'll deal with math at one moment or another, and when you deal with math, chances are good you are not in the mood for stupid hacks to get around obsolete conventions. One-based notation plagued math and dates for centuries, too, and by learning from our mistakes, we should strive to avoid it in future-oriented sciences (including computer languages).

Third, as for computer language arrays being tied to hardware: allocate a C array of 21 integers, move the pointer 10 indices to the right, and you'll have a natural [-10, 10] array, as the sketch below illustrates. This is not natural for hardware, but it is for math. Of course, math could be obsolete, but the last time I checked, most people in the world believed it was not.
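A minimal C++ sketch of that idea (the array contents and the squaring loop are purely illustrative); the same pointer arithmetic works on a plain C array:

```cpp
#include <iostream>

int main() {
    // A plain C-style array of 21 integers: the hardware view indexes it 0..20.
    int storage[21] = {};

    // Shift a pointer 10 elements to the right: mid[-10] .. mid[10] now
    // cover the whole array, giving a natural [-10, 10] index range.
    int* mid = storage + 10;

    for (int i = -10; i <= 10; ++i)
        mid[i] = i * i;  // math-friendly indexing centered on zero

    std::cout << "mid[-10] = " << mid[-10]
              << ", mid[0] = "  << mid[0]
              << ", mid[10] = " << mid[10] << '\n';
    return 0;
}
```

The negative subscripts are well defined here because `mid` points into the middle of `storage`, so every access stays within the allocated 21 elements.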

Fourth, as already pointed out elsewhere, even for discrete positions (or distances reduced to discrete values), the first index would be zero: the floors of a building (starting at zero), a decreasing countdown (3, 2, 1, ZERO!), ground altitude, the first pixel of an image, temperature (zero kelvin for absolute zero, or zero degrees centigrade for the freezing point of water, about 273 K). In fact, the only thing that really starts with one is the traditional "first, second, third, etc." way of counting, which leads me naturally to the next point...

Fifth, the next point (which naturally follows the previous one) is that high-level containers should be accessed not by index, but by iterators, unless the indices themselves have an intrinsic value; see the sketch below. I'm surprised your "higher-level language" advocate did not mention that. In the case where the index itself is important, you can bet that half the time you have a math-related question in mind, and thus you'd like your container to be math-friendly, and not math-disabled like "thy olde Gregorian calendar", which starts at 1 and needs convoluted hacks to make it work.
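A minimal C++ sketch of that point (the container and its values are purely illustrative): both traversals below walk the container without ever naming an index, so the 0-vs-1 question never arises.

```cpp
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    std::vector<int> values = {3, 1, 4, 1, 5, 9};

    // Explicit iterators: the range begin() .. end() is traversed with no index.
    for (auto it = values.begin(); it != values.end(); ++it)
        std::cout << *it << ' ';
    std::cout << '\n';

    // Standard algorithms hide even the iterators.
    int sum = std::accumulate(values.begin(), values.end(), 0);
    std::cout << "sum = " << sum << '\n';
    return 0;
}
```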

Conclusion

The argument given by your fellow programmer is a fallacy because it needlessly ties spoken/written language habits, which are by nature blurry, to computer languages (where you don't want your instructions blurred), and because, by wrongly attributing a hardware reason to this problem, he/she hopes to convince you that, as languages go higher and higher in abstraction, the zero-based array is a thing of the past.

Zero-based arrays are zero-based for math-related reasons, not for hardware-related reasons.

Taken from: https://softwareengineering.stackexchange.com/questions/110804/why-are-zero-based-arrays-the-norm

A more mathematical explanation:

https://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/EWD831.html