Just to be clear: it was an experimental compiler, nowhere close to a public product, and it applied to a specific case. "Real life" optimization examples are less obvious to grasp, but the benefit of having the time to do automatic optimization is real, even if it's only rewriting the code to reduce code-cache invalidations in a long pipeline caused by a badly predicted jump.
Many researchers are trying to design compilers that offer an -O4 flag or better. If, for example, you can predict the result of a function for all its inputs, you can rewrite it completely...
Let's take an example, the McCarthy 91 function:
Code:
let rec f x = if x > 100 then (x-10) else f(f(x+11));;
If x > 100, there's no problem: the answer is immediate.
For lower values, especially large negative ones, the evaluation can take a long time.
But you can analyse the function and see that for 90 <= x <= 100, x + 11 > 100, so f(x+11) = x + 1 and therefore f(x) = f(x+1). Since f(101) = 91, it follows that for those values of x, f(x) = 91.
Then, for values 79 <= x <= 89, f(x) = f(f(x+11)) = f(91) = 91.
Same for 68 <= x <= 78, and then, by the same argument, for any x below that.
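If you want to convince yourself of that unfolding, here's a throwaway OCaml check (the helper check_range is just mine for illustration, not something a compiler would emit) that evaluates the original definition over each of those sub-ranges and confirms every result is 91:
Code:
(* Original definition. *)
let rec f x = if x > 100 then x - 10 else f (f (x + 11))

(* true iff f x = 91 for every x in the closed range [lo, hi]. *)
let check_range lo hi =
  let ok = ref true in
  for x = lo to hi do
    if f x <> 91 then ok := false
  done;
  !ok

let () =
  Printf.printf "90..100  -> %b\n" (check_range 90 100);
  Printf.printf "79..89   -> %b\n" (check_range 79 89);
  Printf.printf "68..78   -> %b\n" (check_range 68 78);
  Printf.printf "-500..67 -> %b\n" (check_range (-500) 67)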
Thus, you can rewrite the function as
Code:
let f x = if x > 100 then x - 10 else 91;;
Which is vastly more efficient: the call never recurses at all.
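To put a rough number on "more efficient", here is a quick sketch (the call counter is my own instrumentation, not anything a compiler produces) that counts the recursive calls made by the original version and checks that both versions agree:
Code:
(* Original definition, instrumented with a call counter. *)
let calls = ref 0
let rec f_orig x =
  incr calls;
  if x > 100 then x - 10 else f_orig (f_orig (x + 11))

(* Rewritten version: no recursion at all. *)
let f_opt x = if x > 100 then x - 10 else 91

let () =
  (* Both versions must return the same result everywhere we test. *)
  for x = -1000 to 1000 do
    assert (f_orig x = f_opt x)
  done;
  calls := 0;
  ignore (f_orig (-300));
  Printf.printf "f_orig (-300) needed %d calls; f_opt needs a single comparison\n" !calls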
Even if this reasoning isn't that hard for a human, it's difficult to program, and this kind of optimization is an extremely hard task. But sometimes it finds efficient solutions. It's just something JIT compilers can't do, because they don't have time for it.