> I don't really agree with that. I wouldn't characterize exponentiation as 'advanced math'. Maybe it is more advanced than division,
You can drop the 'maybe'. Exponentiation is more advanced than division, and obviously so. Anyway, that was not my justification for the syntax.
> but I'm not sure that the difference between something you learn in elementary school and something you learn in middle school is really germane to the question of whether a programmer is going to be accustomed to the symbol having a certain meaning.
It is not just a matter of when you learn them, but of how strongly a symbol is associated with a particular thing. '/' is familiar and intuitive to pretty much everyone because it is standard math notation: if you have been taught division and fractions at all, then you have drawn that dividing line on paper countless times, whether horizontal or slanted. That is not even in the same ballpark as "well, '^' stands for exponentiation in Excel and on some scientific calculator key labels". When I started programming, I would personally have found conjunction to be the most natural meaning for '^', since I had just been taught boolean algebra.
> You can easily do something like "int n_to_the_tenth = n ^ 10;"
> It compiles just fine, and can be just as confusing to novices as integer division. It is really a fairly common mistake.
In my experience, if students associate '^' with exponentiation at all, the association is not strong. Among the students I've worked with, it is not a common mistake to begin with, and I have never seen anyone repeat it after being told once that '^' is not exponentiation. Yet they happily keep trying to do division with 'x / y' even when x and y are integers.
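For what it's worth, both confusions are easy to demonstrate side by side; a minimal Python 3 sketch (the variable names are mine):

    n = 3
    # '^' is bitwise XOR in Python too, not exponentiation:
    print(n ^ 10)   # 9, since 0b0011 XOR 0b1010 == 0b1001 -- not 59049
    print(n ** 10)  # 59049, the power the novice actually wanted

    # The integer-division confusion exists in Python 2, C, C++, and Java,
    # where 7 / 2 == 3; Python 3 instead gives:
    x, y = 7, 2
    print(x / y)    # 3.5 -- fractional division regardless of operand types
    print(x // y)   # 3   -- explicit floor division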
> I do agree that having separate operators is better than having the operation depend on the datatypes being used. That is a recipe for disaster, especially when the operands are r-values generated from nested expressions, so their types are not apparent at a glance. But that does not mean that I agree with Python's decision to redefine '/'.
> If you are worried about the problem of things silently breaking, what is going to happen to Python 2 code, where '/' meant integer division, when it is run in a Python 3 interpreter, where '/' means fractional division? That is a nightmare of code silently breaking. You almost have to admire the quixotic obstinacy. If they had just made '//' the fractional division operator and left '/' as integer division, they could have maintained backwards compatibility. Instead they will throw tons of legacy code users under the bus just because they want the symbol to match up more closely with its use in a different domain.
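That failure mode is real enough; to make it concrete, a minimal sketch (the mean() helper is hypothetical):

    # Hypothetical Python 2 code, written when '/' truncated on ints:
    def mean(xs):
        return sum(xs) / len(xs)

    # Python 2: mean([1, 2, 4]) == 2         (7 / 3 truncates to an int)
    # Python 3: mean([1, 2, 4]) == 2.333...  (same source, different result)
    # No exception is raised; the value just quietly changes.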
And how would that guarantee backwards compatibility? You'd have to go through the code anyway and change every '/' that was intended as fractional division into '//'! (Unless you keep the meaning of '/' intact so that it still does fractional division on floats, but then you are still totally letting the programmer screw themselves, and are merely providing a new defensive-programming syntax, '//', which one has to remember to use. I'm not a Python coder, but that doesn't sound very Pythonic to me. Or a good idea in general.)
Even if I've somehow understood incorrectly and your suggested scheme would provide the full safety of the first solution, I don't find the slight short-term convenience of code reuse to be enough of a reason to tolerate permanently unintuitive, tortured syntax. Using '//' for ordinary division would go against not just the deeply ingrained association from math, but every other programming language that sensibly uses '/' to denote division (even if some of them have an unfortunate bug when both operands are integral).
In other words, I agree with the PEP r1chard linked:
"the cost of leaving this bug in the language will eventually outweigh the cost of fixing old code -- there is an upper bound to the amount of code to be fixed, but the amount of code that might be affected by the bug in the future is unbounded."