This is true. Pi would be hard to represent but also, as you said, you could assign a separate variable for it.
Though that would be unnecessary. Like I said, you would still use floating point operations where they're needed. They don't break the CPU. I bet that's how most games are using it now and they're still running.
I'm just saying that devs should utilize integers unless floating point is absolutely necessary (i.e. it would give all-around better results in that situation), since the PPC750 architecture seems to do so much better with integers than floating points. It would all depend on what was most advantageous overall in a given situation. An example would be a situation where you could do a calculation with 1 floating point number or with 3 integers (1 to hold the whole number, 1 to hold the decimal part, and 1 to hold the scale that translates between them). If the performance was 4X better with integers, then the integer route would be better in that situation, hypothetically.
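Just to illustrate the idea, here's a rough fixed-point sketch in C. I'm not claiming any shipped game does it exactly this way; SCALE and the make_fixed/fixed_mul helpers are names I made up for the example.

#include <stdio.h>
#include <stdint.h>

/* Rough fixed-point sketch: store a decimal value as a scaled integer
 * instead of a float. SCALE is the "translation" integer that says how
 * many fractional steps make up one whole unit. */
#define SCALE 1000  /* 3 decimal digits of precision */

typedef int32_t fixed_t;

/* Build a fixed-point value from a whole part and a decimal part,
 * e.g. make_fixed(3, 141) is roughly 3.141 */
static fixed_t make_fixed(int32_t whole, int32_t frac)
{
    return whole * SCALE + frac;
}

/* Multiply two fixed-point values using only integer math;
 * the 64-bit intermediate avoids overflow before rescaling. */
static fixed_t fixed_mul(fixed_t a, fixed_t b)
{
    return (fixed_t)(((int64_t)a * b) / SCALE);
}

int main(void)
{
    fixed_t pi = make_fixed(3, 141);    /* pi stored as the integer 3141 */
    fixed_t radius = make_fixed(2, 0);  /* 2.000 stored as 2000 */

    /* circumference = 2 * pi * r, done entirely with integer ops */
    fixed_t circ = fixed_mul(fixed_mul(make_fixed(2, 0), pi), radius);

    printf("circumference ~= %d.%03d\n", (int)(circ / SCALE), (int)(circ % SCALE));
    return 0;
}

In a real engine you'd probably pick a power-of-two scale so the rescale becomes a shift instead of a divide, but the point is the same: all the decimal work ends up as integer instructions.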
Also, you could just take a different approach that would not require you to use pi at all. My favorite thing about programming is that there are so many means to an end. This is the same thing we encountered in the GPU thread when people came in insisting that Normal Mapping was impossible on the Wii because it didn't have modern shaders, when in fact it had been done many times:
http://www.neogaf.com/forum/showpost.php?p=56901814&postcount=3690. All that was needed was a change in how it was performed.
I think the problem with performance on the Wii U CPU is that people are simply not coding to its strengths. After all, a CPU with 1/10 the strength did this:
http://www.youtube.com/watch?v=3xJXvFqhCk0. I say people are greatly underestimating the power of the Wii U CPU based on nothing more than it having a lower clock than the Cell and Xenon.