It was chosen because that's how pointer math works and nothing has happened to change that.
I can see some benefits of using 1-base, but on the whole, if 0-base has to be used anyway, I'd rather only use the necessary version so I'm not making off-by-one errors when switching languages.
Yes, I understand pointer/array math, and zero-based indexing is great for that. But it's still a leaky abstraction: the index convention exists to make the implementation convenient, not the user's problem. I'm not saying it isn't my preferred way too, just that the preference isn't for any really good reason; it's cultural.
Any fixed array range (at least a fixed starting index) is an example of "worse is better": it pushes the work onto the user of the language to map whatever the actual range should be into a 0-based (or 1-based) form. Different problems have different natural ranges, and you shouldn't have to write a routine that does that mapping for you. When your range doesn't naturally start at whatever the language forces on you, you end up with one of several possible outcomes:
1. You don't help your own users; instead they have to know, everywhere, that they need to write `histogram[c - 'a']` to calculate the actual index (this clutters the code and leaves every call site a chance to get the offset wrong).
2. You do help your own users, but now they have to remember the function/procedure call to access it: `inc_histogram(c)`. You end up creating a plethora of setter/getter routines to gloss over the issue, relying on inlining to bring performance back to that of straight array access.
3. You do help your own users, but they realize it's "just" an array, so they use `histogram[c - 'a']` to read (and set!) values directly, bypassing your API.